Provisioning¶
These dataclasses are used in the SDK to represent API requests and responses for services in the databricks.sdk.service.provisioning
module.
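A minimal sketch of the serialization pattern shared by these dataclasses: construct an instance, call as_dict() to get a JSON-ready request body, and rebuild it with from_dict(). AwsKeyInfo is documented later in this section; the ARN value is a hypothetical placeholder.

```python
from databricks.sdk.service import provisioning

# Construct a request-style dataclass (AwsKeyInfo is documented below).
key_info = provisioning.AwsKeyInfo(
    key_arn="arn:aws:kms:us-west-2:111122223333:key/example",  # hypothetical ARN
    key_region="us-west-2",
)

# as_dict() produces a dictionary suitable for use as a JSON request body.
body = key_info.as_dict()

# from_dict() reverses the operation, for example when reading an API response.
assert provisioning.AwsKeyInfo.from_dict(body).key_arn == key_info.key_arn
```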
- class databricks.sdk.service.provisioning.AwsCredentials¶
-
- as_dict() dict ¶
Serializes the AwsCredentials into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict ¶
Serializes the AwsCredentials into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) AwsCredentials ¶
Deserializes the AwsCredentials from a dictionary.
- class databricks.sdk.service.provisioning.AwsKeyInfo¶
- key_arn: str¶
The AWS KMS key’s Amazon Resource Name (ARN).
- key_region: str¶
The AWS KMS key region.
- key_alias: str | None = None¶
The AWS KMS key alias.
- reuse_key_for_cluster_volumes: bool | None = None¶
This field applies only if the use_cases property includes STORAGE. If this is set to true or omitted, the key is also used to encrypt cluster EBS volumes. If you do not want to use this key for encrypting EBS volumes, set to false.
- as_dict() dict ¶
Serializes the AwsKeyInfo into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict ¶
Serializes the AwsKeyInfo into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) AwsKeyInfo ¶
Deserializes the AwsKeyInfo from a dictionary.
- class databricks.sdk.service.provisioning.AzureWorkspaceInfo¶
- resource_group: str | None = None¶
Azure Resource Group name
- subscription_id: str | None = None¶
Azure Subscription ID
- as_dict() dict ¶
Serializes the AzureWorkspaceInfo into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict ¶
Serializes the AzureWorkspaceInfo into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) AzureWorkspaceInfo ¶
Deserializes the AzureWorkspaceInfo from a dictionary.
- class databricks.sdk.service.provisioning.CloudResourceContainer¶
The general workspace configurations that are specific to cloud providers.
- gcp: CustomerFacingGcpCloudResourceContainer | None = None¶
The general workspace configurations that are specific to Google Cloud.
- as_dict() dict ¶
Serializes the CloudResourceContainer into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict ¶
Serializes the CloudResourceContainer into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) CloudResourceContainer ¶
Deserializes the CloudResourceContainer from a dictionary.
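A minimal sketch of building a CloudResourceContainer for a Google Cloud workspace; the project ID is a hypothetical placeholder, and CustomerFacingGcpCloudResourceContainer is documented later in this section.

```python
from databricks.sdk.service import provisioning

container = provisioning.CloudResourceContainer(
    gcp=provisioning.CustomerFacingGcpCloudResourceContainer(
        project_id="my-gcp-project",  # hypothetical Google Cloud project ID
    )
)

# Nested dataclasses are serialized along with the container into the request body.
print(container.as_dict())
```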
- class databricks.sdk.service.provisioning.CreateAwsKeyInfo¶
- key_arn: str¶
The AWS KMS key’s Amazon Resource Name (ARN). Note that the key’s AWS region is inferred from the ARN.
- key_alias: str | None = None¶
The AWS KMS key alias.
- reuse_key_for_cluster_volumes: bool | None = None¶
This field applies only if the use_cases property includes STORAGE. If this is set to true or omitted, the key is also used to encrypt cluster EBS volumes. If you do not want to use this key for encrypting EBS volumes, set this to false.
- as_dict() dict ¶
Serializes the CreateAwsKeyInfo into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict ¶
Serializes the CreateAwsKeyInfo into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) CreateAwsKeyInfo ¶
Deserializes the CreateAwsKeyInfo from a dictionary.
- class databricks.sdk.service.provisioning.CreateCredentialAwsCredentials¶
- sts_role: CreateCredentialStsRole | None = None¶
- as_dict() dict ¶
Serializes the CreateCredentialAwsCredentials into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict ¶
Serializes the CreateCredentialAwsCredentials into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) CreateCredentialAwsCredentials ¶
Deserializes the CreateCredentialAwsCredentials from a dictionary.
- class databricks.sdk.service.provisioning.CreateCredentialRequest¶
- credentials_name: str¶
The human-readable name of the credential configuration object.
- aws_credentials: CreateCredentialAwsCredentials¶
- as_dict() dict ¶
Serializes the CreateCredentialRequest into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict ¶
Serializes the CreateCredentialRequest into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) CreateCredentialRequest ¶
Deserializes the CreateCredentialRequest from a dictionary.
- class databricks.sdk.service.provisioning.CreateCredentialStsRole¶
- role_arn: str | None = None¶
The Amazon Resource Name (ARN) of the cross account role.
- as_dict() dict ¶
Serializes the CreateCredentialStsRole into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict ¶
Serializes the CreateCredentialStsRole into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) CreateCredentialStsRole ¶
Deserializes the CreateCredentialStsRole from a dictionary.
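A hedged sketch of assembling a CreateCredentialRequest from the nested credential dataclasses above; the role ARN and configuration name are hypothetical placeholders, and in practice the resulting body would be sent to the account-level credential configuration API.

```python
from databricks.sdk.service import provisioning

credential_req = provisioning.CreateCredentialRequest(
    credentials_name="my-credentials",  # hypothetical configuration name
    aws_credentials=provisioning.CreateCredentialAwsCredentials(
        sts_role=provisioning.CreateCredentialStsRole(
            role_arn="arn:aws:iam::111122223333:role/my-cross-account-role",  # hypothetical ARN
        )
    ),
)

# as_dict() yields the JSON request body for the credential configuration.
print(credential_req.as_dict())
```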
- class databricks.sdk.service.provisioning.CreateCustomerManagedKeyRequest¶
- use_cases: List[KeyUseCase]¶
The cases that the key can be used for.
- aws_key_info: CreateAwsKeyInfo | None = None¶
- gcp_key_info: CreateGcpKeyInfo | None = None¶
- as_dict() dict ¶
Serializes the CreateCustomerManagedKeyRequest into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict ¶
Serializes the CreateCustomerManagedKeyRequest into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) CreateCustomerManagedKeyRequest ¶
Deserializes the CreateCustomerManagedKeyRequest from a dictionary.
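A minimal sketch of a CreateCustomerManagedKeyRequest for an AWS storage key, assuming a hypothetical KMS key ARN; KeyUseCase and CreateAwsKeyInfo are documented elsewhere in this section.

```python
from databricks.sdk.service import provisioning

key_req = provisioning.CreateCustomerManagedKeyRequest(
    use_cases=[provisioning.KeyUseCase.STORAGE],
    aws_key_info=provisioning.CreateAwsKeyInfo(
        key_arn="arn:aws:kms:us-west-2:111122223333:key/example",  # hypothetical ARN
        reuse_key_for_cluster_volumes=True,  # also encrypt cluster EBS volumes
    ),
)
print(key_req.as_dict())
```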
- class databricks.sdk.service.provisioning.CreateGcpKeyInfo¶
- kms_key_id: str¶
The GCP KMS key’s resource name
- as_dict() dict ¶
Serializes the CreateGcpKeyInfo into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict ¶
Serializes the CreateGcpKeyInfo into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) CreateGcpKeyInfo ¶
Deserializes the CreateGcpKeyInfo from a dictionary.
- class databricks.sdk.service.provisioning.CreateNetworkRequest¶
- network_name: str¶
The human-readable name of the network configuration.
- gcp_network_info: GcpNetworkInfo | None = None¶
The Google Cloud specific information for this network (for example, the VPC ID, subnet ID, and secondary IP ranges).
- security_group_ids: List[str] | None = None¶
IDs of one to five security groups associated with this network. Security group IDs cannot be used in multiple network configurations.
- subnet_ids: List[str] | None = None¶
IDs of at least two subnets associated with this network. Subnet IDs cannot be used in multiple network configurations.
- vpc_endpoints: NetworkVpcEndpoints | None = None¶
If specified, contains the VPC endpoints used to allow cluster communication from this VPC over [AWS PrivateLink].
[AWS PrivateLink]: https://aws.amazon.com/privatelink/
- vpc_id: str | None = None¶
The ID of the VPC associated with this network. VPC IDs can be used in multiple network configurations.
- as_dict() dict ¶
Serializes the CreateNetworkRequest into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict ¶
Serializes the CreateNetworkRequest into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) CreateNetworkRequest ¶
Deserializes the CreateNetworkRequest from a dictionary.
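A hedged sketch of a CreateNetworkRequest for an AWS customer-managed VPC; all IDs are hypothetical placeholders, and the counts follow the constraints above (at least two subnets, one to five security groups).

```python
from databricks.sdk.service import provisioning

network_req = provisioning.CreateNetworkRequest(
    network_name="my-network",
    vpc_id="vpc-0123456789abcdef0",                     # hypothetical VPC ID
    subnet_ids=["subnet-aaaa1111", "subnet-bbbb2222"],  # at least two subnets
    security_group_ids=["sg-0123456789abcdef0"],        # one to five security groups
)
print(network_req.as_dict())
```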
- class databricks.sdk.service.provisioning.CreateStorageConfigurationRequest¶
- storage_configuration_name: str¶
The human-readable name of the storage configuration.
- root_bucket_info: RootBucketInfo¶
Root S3 bucket information.
- as_dict() dict ¶
Serializes the CreateStorageConfigurationRequest into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict ¶
Serializes the CreateStorageConfigurationRequest into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) CreateStorageConfigurationRequest ¶
Deserializes the CreateStorageConfigurationRequest from a dictionary.
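A minimal sketch of a CreateStorageConfigurationRequest; the bucket and configuration names are hypothetical placeholders.

```python
from databricks.sdk.service import provisioning

storage_req = provisioning.CreateStorageConfigurationRequest(
    storage_configuration_name="my-root-storage",  # hypothetical name
    root_bucket_info=provisioning.RootBucketInfo(bucket_name="my-root-bucket"),
)
print(storage_req.as_dict())
```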
- class databricks.sdk.service.provisioning.CreateVpcEndpointRequest¶
- vpc_endpoint_name: str¶
The human-readable name of the VPC endpoint configuration.
- aws_vpc_endpoint_id: str | None = None¶
The ID of the VPC endpoint object in AWS.
- gcp_vpc_endpoint_info: GcpVpcEndpointInfo | None = None¶
The Google Cloud specific information for this Private Service Connect endpoint.
- region: str | None = None¶
The AWS region in which this VPC endpoint object exists.
- as_dict() dict ¶
Serializes the CreateVpcEndpointRequest into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict ¶
Serializes the CreateVpcEndpointRequest into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) CreateVpcEndpointRequest ¶
Deserializes the CreateVpcEndpointRequest from a dictionary.
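A minimal sketch of registering an existing AWS VPC endpoint via CreateVpcEndpointRequest; the endpoint ID and region are hypothetical placeholders.

```python
from databricks.sdk.service import provisioning

vpce_req = provisioning.CreateVpcEndpointRequest(
    vpc_endpoint_name="my-vpc-endpoint",           # hypothetical name
    aws_vpc_endpoint_id="vpce-0123456789abcdef0",  # hypothetical AWS VPC endpoint ID
    region="us-west-2",
)
print(vpce_req.as_dict())
```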
- class databricks.sdk.service.provisioning.CreateWorkspaceRequest¶
- workspace_name: str¶
The workspace’s human-readable name.
- aws_region: str | None = None¶
The AWS region of the workspace’s data plane.
- cloud: str | None = None¶
The cloud provider which the workspace uses. For Google Cloud workspaces, always set this field to gcp.
- cloud_resource_container: CloudResourceContainer | None = None¶
The general workspace configurations that are specific to cloud providers.
- credentials_id: str | None = None¶
ID of the workspace’s credential configuration object.
- custom_tags: Dict[str, str] | None = None¶
The custom tags attached to this workspace, as key-value pairs of UTF-8 strings. The value can be an empty string and has a maximum length of 255 characters. The key cannot be empty and has a maximum length of 127 characters.
- deployment_name: str | None = None¶
The deployment name defines part of the subdomain for the workspace. The workspace URL for the web application and REST APIs is <workspace-deployment-name>.cloud.databricks.com. For example, if the deployment name is abcsales, your workspace URL will be https://abcsales.cloud.databricks.com. Hyphens are allowed. This property supports only the set of characters that are allowed in a subdomain.
To set this value, you must have a deployment name prefix. Contact your Databricks account team to add an account deployment name prefix to your account.
Workspace deployment names follow the account prefix and a hyphen. For example, if your account’s deployment prefix is acme and the workspace deployment name is workspace-1, the JSON response for the deployment_name field becomes acme-workspace-1. The workspace URL would be acme-workspace-1.cloud.databricks.com.
You can also set the deployment_name to the reserved keyword EMPTY if you want the deployment name to only include the deployment prefix. For example, if your account’s deployment prefix is acme and the workspace deployment name is EMPTY, the deployment_name becomes acme only and the workspace URL is acme.cloud.databricks.com.
This value must be unique across all non-deleted deployments across all AWS regions.
If a new workspace omits this property, the server generates a unique deployment name for you with the pattern dbc-xxxxxxxx-xxxx.
- gcp_managed_network_config: GcpManagedNetworkConfig | None = None¶
The network settings for the workspace. These configurations apply only to Databricks-managed VPCs and are ignored if you specify a customer-managed VPC in the network_id field. All the IP range configurations must be mutually exclusive. An attempt to create a workspace fails if Databricks detects an IP range overlap.
Specify custom IP ranges in CIDR format. The IP ranges for these fields must not overlap, and all IP addresses must be entirely within the following ranges: 10.0.0.0/8, 100.64.0.0/10, 172.16.0.0/12, 192.168.0.0/16, and 240.0.0.0/4.
The sizes of these IP ranges affect the maximum number of nodes for the workspace.
Important: Confirm the IP ranges used by your Databricks workspace before creating the workspace. You cannot change them after your workspace is deployed. If the IP address ranges for your Databricks workspace are too small, IP exhaustion can occur, causing your Databricks jobs to fail. To determine the address range sizes that you need, Databricks provides a calculator as a Microsoft Excel spreadsheet. See [calculate subnet sizes for a new workspace].
[calculate subnet sizes for a new workspace]: https://docs.gcp.databricks.com/administration-guide/cloud-configurations/gcp/network-sizing.html
- gke_config: GkeConfig | None = None¶
The configurations for the GKE cluster of a Databricks workspace.
- is_no_public_ip_enabled: bool | None = None¶
Whether no public IP is enabled for the workspace.
- location: str | None = None¶
The Google Cloud region of the workspace data plane in your Google account. For example, us-east4.
- managed_services_customer_managed_key_id: str | None = None¶
The ID of the workspace’s managed services encryption key configuration object. This is used to help protect and control access to the workspace’s notebooks, secrets, Databricks SQL queries, and query history. The provided key configuration object property use_cases must contain MANAGED_SERVICES.
- network_id: str | None = None¶
- pricing_tier: PricingTier | None = None¶
The pricing tier of the workspace. For pricing tier information, see [AWS Pricing].
[AWS Pricing]: https://databricks.com/product/aws-pricing
- private_access_settings_id: str | None = None¶
ID of the workspace’s private access settings object. Only used for PrivateLink. This ID must be specified for customers using [AWS PrivateLink] for either front-end (user-to-workspace connection), back-end (data plane to control plane connection), or both connection types.
Before configuring PrivateLink, read the [Databricks article about PrivateLink].
[AWS PrivateLink]: https://aws.amazon.com/privatelink/ [Databricks article about PrivateLink]: https://docs.databricks.com/administration-guide/cloud-configurations/aws/privatelink.html
- storage_configuration_id: str | None = None¶
The ID of the workspace’s storage configuration object.
- storage_customer_managed_key_id: str | None = None¶
The ID of the workspace’s storage encryption key configuration object. This is used to encrypt the workspace’s root S3 bucket (root DBFS and system data) and, optionally, cluster EBS volumes. The provided key configuration object property use_cases must contain STORAGE.
- as_dict() dict ¶
Serializes the CreateWorkspaceRequest into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict ¶
Serializes the CreateWorkspaceRequest into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) CreateWorkspaceRequest ¶
Deserializes the CreateWorkspaceRequest from a dictionary.
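A hedged sketch of a CreateWorkspaceRequest for an AWS workspace that references previously created credential and storage configurations; every ID shown is a hypothetical placeholder.

```python
from databricks.sdk.service import provisioning

workspace_req = provisioning.CreateWorkspaceRequest(
    workspace_name="my-workspace",
    aws_region="us-west-2",
    credentials_id="<credential-configuration-id>",         # hypothetical placeholder
    storage_configuration_id="<storage-configuration-id>",  # hypothetical placeholder
    pricing_tier=provisioning.PricingTier.PREMIUM,
)
print(workspace_req.as_dict())
```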
- class databricks.sdk.service.provisioning.Credential¶
- account_id: str | None = None¶
The Databricks account ID that hosts the credential.
- aws_credentials: AwsCredentials | None = None¶
- creation_time: int | None = None¶
Time in epoch milliseconds when the credential was created.
- credentials_id: str | None = None¶
Databricks credential configuration ID.
- credentials_name: str | None = None¶
The human-readable name of the credential configuration object.
- as_dict() dict ¶
Serializes the Credential into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict ¶
Serializes the Credential into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) Credential ¶
Deserializes the Credential from a dictionary.
- class databricks.sdk.service.provisioning.CustomerFacingGcpCloudResourceContainer¶
The general workspace configurations that are specific to Google Cloud.
- project_id: str | None = None¶
The Google Cloud project ID, which the workspace uses to instantiate cloud resources for your workspace.
- as_dict() dict ¶
Serializes the CustomerFacingGcpCloudResourceContainer into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict ¶
Serializes the CustomerFacingGcpCloudResourceContainer into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) CustomerFacingGcpCloudResourceContainer ¶
Deserializes the CustomerFacingGcpCloudResourceContainer from a dictionary.
- class databricks.sdk.service.provisioning.CustomerManagedKey¶
- account_id: str | None = None¶
The Databricks account ID that holds the customer-managed key.
- aws_key_info: AwsKeyInfo | None = None¶
- creation_time: int | None = None¶
Time in epoch milliseconds when the customer key was created.
- customer_managed_key_id: str | None = None¶
ID of the encryption key configuration object.
- gcp_key_info: GcpKeyInfo | None = None¶
- use_cases: List[KeyUseCase] | None = None¶
The cases that the key can be used for.
- as_dict() dict ¶
Serializes the CustomerManagedKey into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict ¶
Serializes the CustomerManagedKey into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) CustomerManagedKey ¶
Deserializes the CustomerManagedKey from a dictionary.
- class databricks.sdk.service.provisioning.DeleteResponse¶
- as_dict() dict ¶
Serializes the DeleteResponse into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict ¶
Serializes the DeleteResponse into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) DeleteResponse ¶
Deserializes the DeleteResponse from a dictionary.
- class databricks.sdk.service.provisioning.EndpointUseCase¶
This enumeration represents the type of Databricks VPC [endpoint service] that was used when creating this VPC endpoint. [endpoint service]: https://docs.aws.amazon.com/vpc/latest/privatelink/endpoint-service.html
- DATAPLANE_RELAY_ACCESS = "DATAPLANE_RELAY_ACCESS"¶
- WORKSPACE_ACCESS = "WORKSPACE_ACCESS"¶
- class databricks.sdk.service.provisioning.ErrorType¶
The AWS resource associated with this error: credentials, VPC, subnet, security group, or network ACL.
- CREDENTIALS = "CREDENTIALS"¶
- NETWORK_ACL = "NETWORK_ACL"¶
- SECURITY_GROUP = "SECURITY_GROUP"¶
- SUBNET = "SUBNET"¶
- VPC = "VPC"¶
- class databricks.sdk.service.provisioning.ExternalCustomerInfo¶
- authoritative_user_email: str | None = None¶
Email of the authoritative user.
- authoritative_user_full_name: str | None = None¶
The authoritative user full name.
- customer_name: str | None = None¶
The legal entity name for the external workspace
- as_dict() dict ¶
Serializes the ExternalCustomerInfo into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict ¶
Serializes the ExternalCustomerInfo into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) ExternalCustomerInfo ¶
Deserializes the ExternalCustomerInfo from a dictionary.
- class databricks.sdk.service.provisioning.GcpKeyInfo¶
- kms_key_id: str¶
The GCP KMS key’s resource name
- as_dict() dict ¶
Serializes the GcpKeyInfo into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict ¶
Serializes the GcpKeyInfo into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) GcpKeyInfo ¶
Deserializes the GcpKeyInfo from a dictionary.
- class databricks.sdk.service.provisioning.GcpManagedNetworkConfig¶
The network settings for the workspace. These configurations apply only to Databricks-managed VPCs and are ignored if you specify a customer-managed VPC in the network_id field. All the IP range configurations must be mutually exclusive. An attempt to create a workspace fails if Databricks detects an IP range overlap.
Specify custom IP ranges in CIDR format. The IP ranges for these fields must not overlap, and all IP addresses must be entirely within the following ranges: 10.0.0.0/8, 100.64.0.0/10, 172.16.0.0/12, 192.168.0.0/16, and 240.0.0.0/4.
The sizes of these IP ranges affect the maximum number of nodes for the workspace.
Important: Confirm the IP ranges used by your Databricks workspace before creating the workspace. You cannot change them after your workspace is deployed. If the IP address ranges for your Databricks workspace are too small, IP exhaustion can occur, causing your Databricks jobs to fail. To determine the address range sizes that you need, Databricks provides a calculator as a Microsoft Excel spreadsheet. See [calculate subnet sizes for a new workspace].
[calculate subnet sizes for a new workspace]: https://docs.gcp.databricks.com/administration-guide/cloud-configurations/gcp/network-sizing.html
- gke_cluster_pod_ip_range: str | None = None¶
The IP range from which to allocate GKE cluster pods. No bigger than /9 and no smaller than /21.
- gke_cluster_service_ip_range: str | None = None¶
The IP range from which to allocate GKE cluster services. No bigger than /16 and no smaller than /27.
- subnet_cidr: str | None = None¶
The IP range from which to allocate GKE cluster nodes. No bigger than /9 and no smaller than /29.
- as_dict() dict ¶
Serializes the GcpManagedNetworkConfig into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict ¶
Serializes the GcpManagedNetworkConfig into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) GcpManagedNetworkConfig ¶
Deserializes the GcpManagedNetworkConfig from a dictionary.
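A minimal sketch of a GcpManagedNetworkConfig using non-overlapping CIDR ranges inside 10.0.0.0/8 that respect the size limits above; the specific ranges are illustrative assumptions.

```python
from databricks.sdk.service import provisioning

gcp_network = provisioning.GcpManagedNetworkConfig(
    subnet_cidr="10.0.0.0/22",                   # GKE nodes (between /29 and /9)
    gke_cluster_pod_ip_range="10.1.0.0/21",      # pods (between /21 and /9)
    gke_cluster_service_ip_range="10.2.0.0/24",  # services (between /27 and /16)
)
print(gcp_network.as_dict())
```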
- class databricks.sdk.service.provisioning.GcpNetworkInfo¶
The Google Cloud specific information for this network (for example, the VPC ID, subnet ID, and secondary IP ranges).
- network_project_id: str¶
The Google Cloud project ID of the VPC network.
- vpc_id: str¶
The ID of the VPC associated with this network. VPC IDs can be used in multiple network configurations.
- subnet_id: str¶
The ID of the subnet associated with this network.
- subnet_region: str¶
The Google Cloud region of the workspace data plane (for example, us-east4).
- pod_ip_range_name: str¶
The name of the secondary IP range for pods. A Databricks-managed GKE cluster uses this IP range for its pods. This secondary IP range can be used by only one workspace.
- service_ip_range_name: str¶
The name of the secondary IP range for services. A Databricks-managed GKE cluster uses this IP range for its services. This secondary IP range can be used by only one workspace.
- as_dict() dict ¶
Serializes the GcpNetworkInfo into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict ¶
Serializes the GcpNetworkInfo into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) GcpNetworkInfo ¶
Deserializes the GcpNetworkInfo from a dictionary.
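A minimal sketch of a GcpNetworkInfo describing a customer-managed GCP VPC; all names and IDs are hypothetical placeholders.

```python
from databricks.sdk.service import provisioning

gcp_net = provisioning.GcpNetworkInfo(
    network_project_id="my-gcp-project",  # hypothetical project ID
    vpc_id="my-vpc",
    subnet_id="my-subnet",
    subnet_region="us-east4",
    pod_ip_range_name="pods",             # secondary range usable by only one workspace
    service_ip_range_name="services",
)
print(gcp_net.as_dict())
```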
- class databricks.sdk.service.provisioning.GcpVpcEndpointInfo¶
The Google Cloud specific information for this Private Service Connect endpoint.
- project_id: str¶
The Google Cloud project ID of the VPC network where the PSC connection resides.
- psc_endpoint_name: str¶
The name of the PSC endpoint in the Google Cloud project.
- endpoint_region: str¶
Region of the PSC endpoint.
- psc_connection_id: str | None = None¶
The unique ID of this PSC connection.
- service_attachment_id: str | None = None¶
The service attachment this PSC connection connects to.
- as_dict() dict ¶
Serializes the GcpVpcEndpointInfo into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict ¶
Serializes the GcpVpcEndpointInfo into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) GcpVpcEndpointInfo ¶
Deserializes the GcpVpcEndpointInfo from a dictionary.
- class databricks.sdk.service.provisioning.GkeConfig¶
The configurations for the GKE cluster of a Databricks workspace.
- connectivity_type: GkeConfigConnectivityType | None = None¶
Specifies the network connectivity types for the GKE nodes and the GKE master network.
Set to PRIVATE_NODE_PUBLIC_MASTER for a private GKE cluster for the workspace. The GKE nodes will not have public IPs.
Set to PUBLIC_NODE_PUBLIC_MASTER for a public GKE cluster. The nodes of a public GKE cluster have public IP addresses.
- master_ip_range: str | None = None¶
The IP range from which to allocate GKE cluster master resources. This field will be ignored if GKE private cluster is not enabled.
The range must be exactly /28 in size.
- as_dict() dict ¶
Serializes the GkeConfig into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict ¶
Serializes the GkeConfig into a shallow dictionary of its immediate attributes.
- class databricks.sdk.service.provisioning.GkeConfigConnectivityType¶
Specifies the network connectivity types for the GKE nodes and the GKE master network. Set to PRIVATE_NODE_PUBLIC_MASTER for a private GKE cluster for the workspace. The GKE nodes will not have public IPs. Set to PUBLIC_NODE_PUBLIC_MASTER for a public GKE cluster. The nodes of a public GKE cluster have public IP addresses.
- PRIVATE_NODE_PUBLIC_MASTER = "PRIVATE_NODE_PUBLIC_MASTER"¶
- PUBLIC_NODE_PUBLIC_MASTER = "PUBLIC_NODE_PUBLIC_MASTER"¶
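A minimal sketch combining GkeConfig with GkeConfigConnectivityType for a private GKE cluster; the /28 master range is a hypothetical example chosen so it does not overlap the other ranges shown in this section.

```python
from databricks.sdk.service import provisioning

gke = provisioning.GkeConfig(
    connectivity_type=provisioning.GkeConfigConnectivityType.PRIVATE_NODE_PUBLIC_MASTER,
    master_ip_range="10.3.0.0/28",  # must be exactly /28 for a private cluster
)
print(gke.as_dict())
```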
- class databricks.sdk.service.provisioning.KeyUseCase¶
Possible values are: * MANAGED_SERVICES: Encrypts notebook and secret data in the control plane * STORAGE: Encrypts the workspace’s root S3 bucket (root DBFS and system data) and, optionally, cluster EBS volumes.
- MANAGED_SERVICES = "MANAGED_SERVICES"¶
- STORAGE = "STORAGE"¶
- class databricks.sdk.service.provisioning.Network¶
- account_id: str | None = None¶
The Databricks account ID associated with this network configuration.
- creation_time: int | None = None¶
Time in epoch milliseconds when the network was created.
- error_messages: List[NetworkHealth] | None = None¶
Array of error messages about the network configuration.
- gcp_network_info: GcpNetworkInfo | None = None¶
The Google Cloud specific information for this network (for example, the VPC ID, subnet ID, and secondary IP ranges).
- network_id: str | None = None¶
The Databricks network configuration ID.
- network_name: str | None = None¶
The human-readable name of the network configuration.
- security_group_ids: List[str] | None = None¶
- subnet_ids: List[str] | None = None¶
- vpc_endpoints: NetworkVpcEndpoints | None = None¶
If specified, contains the VPC endpoints used to allow cluster communication from this VPC over [AWS PrivateLink].
[AWS PrivateLink]: https://aws.amazon.com/privatelink/
- vpc_id: str | None = None¶
The ID of the VPC associated with this network configuration. VPC IDs can be used in multiple networks.
- vpc_status: VpcStatus | None = None¶
The status of this network configuration object in terms of its use in a workspace: * UNATTACHED: Unattached. * VALID: Valid. * BROKEN: Broken. * WARNED: Warned.
- warning_messages: List[NetworkWarning] | None = None¶
Array of warning messages about the network configuration.
- workspace_id: int | None = None¶
Workspace ID associated with this network configuration.
- as_dict() dict ¶
Serializes the Network into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict ¶
Serializes the Network into a shallow dictionary of its immediate attributes.
- class databricks.sdk.service.provisioning.NetworkHealth¶
- error_message: str | None = None¶
Details of the error.
- error_type: ErrorType | None = None¶
The AWS resource associated with this error: credentials, VPC, subnet, security group, or network ACL.
- as_dict() dict ¶
Serializes the NetworkHealth into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict ¶
Serializes the NetworkHealth into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) NetworkHealth ¶
Deserializes the NetworkHealth from a dictionary.
- class databricks.sdk.service.provisioning.NetworkVpcEndpoints¶
If specified, contains the VPC endpoints used to allow cluster communication from this VPC over [AWS PrivateLink].
[AWS PrivateLink]: https://aws.amazon.com/privatelink/
- rest_api: List[str]¶
The VPC endpoint ID used by this network to access the Databricks REST API.
- dataplane_relay: List[str]¶
The VPC endpoint ID used by this network to access the Databricks secure cluster connectivity relay.
- as_dict() dict ¶
Serializes the NetworkVpcEndpoints into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict ¶
Serializes the NetworkVpcEndpoints into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) NetworkVpcEndpoints ¶
Deserializes the NetworkVpcEndpoints from a dictionary.
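A minimal sketch of NetworkVpcEndpoints wiring a network to PrivateLink; both IDs are hypothetical Databricks VPC endpoint IDs (not AWS endpoint IDs).

```python
from databricks.sdk.service import provisioning

endpoints = provisioning.NetworkVpcEndpoints(
    rest_api=["<workspace-access-vpc-endpoint-id>"],     # hypothetical Databricks VPC endpoint ID
    dataplane_relay=["<relay-access-vpc-endpoint-id>"],  # hypothetical Databricks VPC endpoint ID
)
print(endpoints.as_dict())
```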
- class databricks.sdk.service.provisioning.NetworkWarning¶
- warning_message: str | None = None¶
Details of the warning.
- warning_type: WarningType | None = None¶
The AWS resource associated with this warning: a subnet or a security group.
- as_dict() dict ¶
Serializes the NetworkWarning into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict ¶
Serializes the NetworkWarning into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) NetworkWarning ¶
Deserializes the NetworkWarning from a dictionary.
- class databricks.sdk.service.provisioning.PricingTier¶
The pricing tier of the workspace. For pricing tier information, see [AWS Pricing]. [AWS Pricing]: https://databricks.com/product/aws-pricing
- COMMUNITY_EDITION = "COMMUNITY_EDITION"¶
- DEDICATED = "DEDICATED"¶
- ENTERPRISE = "ENTERPRISE"¶
- PREMIUM = "PREMIUM"¶
- STANDARD = "STANDARD"¶
- UNKNOWN = "UNKNOWN"¶
- class databricks.sdk.service.provisioning.PrivateAccessLevel¶
The private access level controls which VPC endpoints can connect to the UI or API of any workspace that attaches this private access settings object. * ACCOUNT level access (the default) allows only VPC endpoints that are registered in your Databricks account to connect to your workspace. * ENDPOINT level access allows only specified VPC endpoints to connect to your workspace. For details, see allowed_vpc_endpoint_ids.
- ACCOUNT = "ACCOUNT"¶
- ENDPOINT = "ENDPOINT"¶
- class databricks.sdk.service.provisioning.PrivateAccessSettings¶
- account_id: str | None = None¶
The Databricks account ID that hosts the credential.
- allowed_vpc_endpoint_ids: List[str] | None = None¶
An array of Databricks VPC endpoint IDs.
- private_access_level: PrivateAccessLevel | None = None¶
The private access level controls which VPC endpoints can connect to the UI or API of any workspace that attaches this private access settings object. * ACCOUNT level access (the default) allows only VPC endpoints that are registered in your Databricks account to connect to your workspace. * ENDPOINT level access allows only specified VPC endpoints to connect to your workspace. For details, see allowed_vpc_endpoint_ids.
- private_access_settings_id: str | None = None¶
Databricks private access settings ID.
- private_access_settings_name: str | None = None¶
The human-readable name of the private access settings object.
- public_access_enabled: bool | None = None¶
Determines if the workspace can be accessed over the public internet. For fully private workspaces, you can optionally specify false, but only if you implement both the front-end and the back-end PrivateLink connections. Otherwise, specify true, which means that public access is enabled.
- region: str | None = None¶
The cloud region for workspaces attached to this private access settings object.
- as_dict() dict ¶
Serializes the PrivateAccessSettings into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict ¶
Serializes the PrivateAccessSettings into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) PrivateAccessSettings ¶
Deserializes the PrivateAccessSettings from a dictionary.
- class databricks.sdk.service.provisioning.ReplaceResponse¶
- as_dict() dict ¶
Serializes the ReplaceResponse into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict ¶
Serializes the ReplaceResponse into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) ReplaceResponse ¶
Deserializes the ReplaceResponse from a dictionary.
- class databricks.sdk.service.provisioning.RootBucketInfo¶
Root S3 bucket information.
- bucket_name: str | None = None¶
The name of the S3 bucket.
- as_dict() dict ¶
Serializes the RootBucketInfo into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict ¶
Serializes the RootBucketInfo into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) RootBucketInfo ¶
Deserializes the RootBucketInfo from a dictionary.
- class databricks.sdk.service.provisioning.StorageConfiguration¶
- account_id: str | None = None¶
The Databricks account ID that hosts the credential.
- creation_time: int | None = None¶
Time in epoch milliseconds when the storage configuration was created.
- root_bucket_info: RootBucketInfo | None = None¶
Root S3 bucket information.
- storage_configuration_id: str | None = None¶
Databricks storage configuration ID.
- storage_configuration_name: str | None = None¶
The human-readable name of the storage configuration.
- as_dict() dict ¶
Serializes the StorageConfiguration into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict ¶
Serializes the StorageConfiguration into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) StorageConfiguration ¶
Deserializes the StorageConfiguration from a dictionary.
- class databricks.sdk.service.provisioning.StsRole¶
- external_id: str | None = None¶
The external ID that needs to be trusted by the cross-account role. This is always your Databricks account ID.
- role_arn: str | None = None¶
The Amazon Resource Name (ARN) of the cross account role.
- as_dict() dict ¶
Serializes the StsRole into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict ¶
Serializes the StsRole into a shallow dictionary of its immediate attributes.
- class databricks.sdk.service.provisioning.UpdateResponse¶
- as_dict() dict ¶
Serializes the UpdateResponse into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict ¶
Serializes the UpdateResponse into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) UpdateResponse ¶
Deserializes the UpdateResponse from a dictionary.
- class databricks.sdk.service.provisioning.UpdateWorkspaceRequest¶
- aws_region: str | None = None¶
The AWS region of the workspace’s data plane (for example, us-west-2). This parameter is available only for updating failed workspaces.
- credentials_id: str | None = None¶
ID of the workspace’s credential configuration object. This parameter is available for updating both failed and running workspaces.
- custom_tags: Dict[str, str] | None = None¶
The custom tags attached to this workspace, as key-value pairs of UTF-8 strings. The value can be an empty string and has a maximum length of 255 characters. The key cannot be empty and has a maximum length of 127 characters.
- managed_services_customer_managed_key_id: str | None = None¶
The ID of the workspace’s managed services encryption key configuration object. This parameter is available only for updating failed workspaces.
- network_connectivity_config_id: str | None = None¶
- network_id: str | None = None¶
The ID of the workspace’s network configuration object. Used only if you already use a customer-managed VPC. For failed workspaces only, you can switch from a Databricks-managed VPC to a customer-managed VPC by updating the workspace to add a network configuration ID.
- private_access_settings_id: str | None = None¶
The ID of the workspace’s private access settings configuration object. This parameter is available only for updating failed workspaces.
- storage_configuration_id: str | None = None¶
The ID of the workspace’s storage configuration object. This parameter is available only for updating failed workspaces.
- storage_customer_managed_key_id: str | None = None¶
The ID of the key configuration object for workspace storage. This parameter is available for updating both failed and running workspaces.
- workspace_id: int | None = None¶
Workspace ID.
- as_dict() dict ¶
Serializes the UpdateWorkspaceRequest into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict ¶
Serializes the UpdateWorkspaceRequest into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) UpdateWorkspaceRequest ¶
Deserializes the UpdateWorkspaceRequest from a dictionary.
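A minimal sketch of an UpdateWorkspaceRequest that swaps the credential configuration on a workspace; the workspace ID and credential configuration ID are hypothetical placeholders.

```python
from databricks.sdk.service import provisioning

update_req = provisioning.UpdateWorkspaceRequest(
    workspace_id=1234567890123456,                        # hypothetical workspace ID
    credentials_id="<new-credential-configuration-id>",   # hypothetical placeholder
)
print(update_req.as_dict())
```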
- class databricks.sdk.service.provisioning.UpsertPrivateAccessSettingsRequest¶
- private_access_settings_name: str¶
The human-readable name of the private access settings object.
- region: str¶
The cloud region for workspaces associated with this private access settings object.
- allowed_vpc_endpoint_ids: List[str] | None = None¶
An array of Databricks VPC endpoint IDs. This is the Databricks ID that is returned when registering the VPC endpoint configuration in your Databricks account. This is not the ID of the VPC endpoint in AWS.
Only used when private_access_level is set to ENDPOINT. This is an allow list of VPC endpoints registered in your account that can connect to your workspace over AWS PrivateLink.
If hybrid access to your workspace is enabled by setting public_access_enabled to true, this control only works for PrivateLink connections. To control how your workspace is accessed via public internet, see [IP access lists].
[IP access lists]: https://docs.databricks.com/security/network/ip-access-list.html
- private_access_level: PrivateAccessLevel | None = None¶
The private access level controls which VPC endpoints can connect to the UI or API of any workspace that attaches this private access settings object. * ACCOUNT level access (the default) allows only VPC endpoints that are registered in your Databricks account to connect to your workspace. * ENDPOINT level access allows only specified VPC endpoints to connect to your workspace. For details, see allowed_vpc_endpoint_ids.
- private_access_settings_id: str | None = None¶
Databricks Account API private access settings ID.
- public_access_enabled: bool | None = None¶
Determines if the workspace can be accessed over the public internet. For fully private workspaces, you can optionally specify false, but only if you implement both the front-end and the back-end PrivateLink connections. Otherwise, specify true, which means that public access is enabled.
- as_dict() dict ¶
Serializes the UpsertPrivateAccessSettingsRequest into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict ¶
Serializes the UpsertPrivateAccessSettingsRequest into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) UpsertPrivateAccessSettingsRequest ¶
Deserializes the UpsertPrivateAccessSettingsRequest from a dictionary.
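A hedged sketch of an UpsertPrivateAccessSettingsRequest for a fully private, endpoint-scoped workspace; the allowed endpoint ID is a hypothetical Databricks VPC endpoint ID.

```python
from databricks.sdk.service import provisioning

pas_req = provisioning.UpsertPrivateAccessSettingsRequest(
    private_access_settings_name="my-private-access-settings",
    region="us-west-2",
    private_access_level=provisioning.PrivateAccessLevel.ENDPOINT,
    allowed_vpc_endpoint_ids=["<databricks-vpc-endpoint-id>"],  # hypothetical ID
    public_access_enabled=False,  # requires both front-end and back-end PrivateLink
)
print(pas_req.as_dict())
```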
- class databricks.sdk.service.provisioning.VpcEndpoint¶
- account_id: str | None = None¶
The Databricks account ID that hosts the VPC endpoint configuration.
- aws_account_id: str | None = None¶
The AWS Account in which the VPC endpoint object exists.
- aws_endpoint_service_id: str | None = None¶
The ID of the Databricks [endpoint service] that this VPC endpoint is connected to. For a list of endpoint service IDs for each supported AWS region, see the [Databricks PrivateLink documentation].
[Databricks PrivateLink documentation]: https://docs.databricks.com/administration-guide/cloud-configurations/aws/privatelink.html [endpoint service]: https://docs.aws.amazon.com/vpc/latest/privatelink/endpoint-service.html
- aws_vpc_endpoint_id: str | None = None¶
The ID of the VPC endpoint object in AWS.
- gcp_vpc_endpoint_info: GcpVpcEndpointInfo | None = None¶
The Google Cloud specific information for this Private Service Connect endpoint.
- region: str | None = None¶
The AWS region in which this VPC endpoint object exists.
- state: str | None = None¶
The current state (such as available or rejected) of the VPC endpoint. Derived from AWS. For the full set of values, see [AWS DescribeVpcEndpoint documentation].
[AWS DescribeVpcEndpoint documentation]: https://docs.aws.amazon.com/cli/latest/reference/ec2/describe-vpc-endpoints.html
- use_case: EndpointUseCase | None = None¶
This enumeration represents the type of Databricks VPC [endpoint service] that was used when creating this VPC endpoint.
[endpoint service]: https://docs.aws.amazon.com/vpc/latest/privatelink/endpoint-service.html
- vpc_endpoint_id: str | None = None¶
Databricks VPC endpoint ID. This is the Databricks-specific name of the VPC endpoint. Do not confuse this with the aws_vpc_endpoint_id, which is the ID within AWS of the VPC endpoint.
- vpc_endpoint_name: str | None = None¶
The human-readable name of the VPC endpoint configuration.
- as_dict() dict ¶
Serializes the VpcEndpoint into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict ¶
Serializes the VpcEndpoint into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) VpcEndpoint ¶
Deserializes the VpcEndpoint from a dictionary.
- class databricks.sdk.service.provisioning.VpcStatus¶
The status of this network configuration object in terms of its use in a workspace: * UNATTACHED: Unattached. * VALID: Valid. * BROKEN: Broken. * WARNED: Warned.
- BROKEN = "BROKEN"¶
- UNATTACHED = "UNATTACHED"¶
- VALID = "VALID"¶
- WARNED = "WARNED"¶
- class databricks.sdk.service.provisioning.WarningType¶
The AWS resource associated with this warning: a subnet or a security group.
- SECURITY_GROUP = "SECURITY_GROUP"¶
- SUBNET = "SUBNET"¶
- class databricks.sdk.service.provisioning.Workspace¶
- account_id: str | None = None¶
Databricks account ID.
- aws_region: str | None = None¶
The AWS region of the workspace data plane (for example, us-west-2).
- azure_workspace_info: AzureWorkspaceInfo | None = None¶
- cloud: str | None = None¶
The cloud name. This field always has the value gcp.
- cloud_resource_container: CloudResourceContainer | None = None¶
The general workspace configurations that are specific to cloud providers.
- creation_time: int | None = None¶
Time in epoch milliseconds when the workspace was created.
- credentials_id: str | None = None¶
ID of the workspace’s credential configuration object.
- custom_tags: Dict[str, str] | None = None¶
The custom tags attached to this workspace, as key-value pairs of UTF-8 strings. The value can be an empty string and has a maximum length of 255 characters. The key cannot be empty and has a maximum length of 127 characters.
- deployment_name: str | None = None¶
The deployment name defines part of the subdomain for the workspace. The workspace URL for web application and REST APIs is <deployment-name>.cloud.databricks.com.
This value must be unique across all non-deleted deployments across all AWS regions.
- external_customer_info: ExternalCustomerInfo | None = None¶
If this workspace is for an external customer, then external_customer_info is populated. If this workspace is not for an external customer, then external_customer_info is empty.
- gcp_managed_network_config: GcpManagedNetworkConfig | None = None¶
The network settings for the workspace. These configurations apply only to Databricks-managed VPCs and are ignored if you specify a customer-managed VPC in the network_id field. All the IP range configurations must be mutually exclusive. An attempt to create a workspace fails if Databricks detects an IP range overlap.
Specify custom IP ranges in CIDR format. The IP ranges for these fields must not overlap, and all IP addresses must be entirely within the following ranges: 10.0.0.0/8, 100.64.0.0/10, 172.16.0.0/12, 192.168.0.0/16, and 240.0.0.0/4.
The sizes of these IP ranges affect the maximum number of nodes for the workspace.
Important: Confirm the IP ranges used by your Databricks workspace before creating the workspace. You cannot change them after your workspace is deployed. If the IP address ranges for your Databricks workspace are too small, IP exhaustion can occur, causing your Databricks jobs to fail. To determine the address range sizes that you need, Databricks provides a calculator as a Microsoft Excel spreadsheet. See [calculate subnet sizes for a new workspace].
[calculate subnet sizes for a new workspace]: https://docs.gcp.databricks.com/administration-guide/cloud-configurations/gcp/network-sizing.html
- gke_config: GkeConfig | None = None¶
The configurations for the GKE cluster of a Databricks workspace.
- is_no_public_ip_enabled: bool | None = None¶
Whether no public IP is enabled for the workspace.
- location: str | None = None¶
The Google Cloud region of the workspace data plane in your Google account (for example, us-east4).
- managed_services_customer_managed_key_id: str | None = None¶
ID of the key configuration for encrypting managed services.
- network_id: str | None = None¶
The network configuration ID that is attached to the workspace. This field is available only if the network is a customer-managed network.
- pricing_tier: PricingTier | None = None¶
The pricing tier of the workspace. For pricing tier information, see [AWS Pricing].
[AWS Pricing]: https://databricks.com/product/aws-pricing
- private_access_settings_id: str | None = None¶
ID of the workspace’s private access settings object. Only used for PrivateLink. You must specify this ID if you are using [AWS PrivateLink] for either front-end (user-to-workspace connection), back-end (data plane to control plane connection), or both connection types.
Before configuring PrivateLink, read the [Databricks article about PrivateLink].
[AWS PrivateLink]: https://aws.amazon.com/privatelink/ [Databricks article about PrivateLink]: https://docs.databricks.com/administration-guide/cloud-configurations/aws/privatelink.html
- storage_configuration_id: str | None = None¶
ID of the workspace’s storage configuration object.
- storage_customer_managed_key_id: str | None = None¶
ID of the key configuration for encrypting workspace storage.
- workspace_id: int | None = None¶
A unique integer ID for the workspace
- workspace_name: str | None = None¶
The human-readable name of the workspace.
- workspace_status: WorkspaceStatus | None = None¶
The status of the workspace. For workspace creation, usually it is set to PROVISIONING initially. Continue to check the status until the status is RUNNING.
- workspace_status_message: str | None = None¶
Message describing the current workspace status.
- as_dict() dict ¶
Serializes the Workspace into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict ¶
Serializes the Workspace into a shallow dictionary of its immediate attributes.
- class databricks.sdk.service.provisioning.WorkspaceStatus¶
The status of the workspace. For workspace creation, usually it is set to PROVISIONING initially. Continue to check the status until the status is RUNNING.
- BANNED = "BANNED"¶
- CANCELLING = "CANCELLING"¶
- FAILED = "FAILED"¶
- NOT_PROVISIONED = "NOT_PROVISIONED"¶
- PROVISIONING = "PROVISIONING"¶
- RUNNING = "RUNNING"¶
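A minimal sketch of checking a workspace's provisioning status using WorkspaceStatus; the Workspace object here stands in for a response from the account-level Workspaces API, which is outside the scope of this section.

```python
from databricks.sdk.service import provisioning

def is_ready(workspace: provisioning.Workspace) -> bool:
    """Return True once the workspace has finished provisioning."""
    return workspace.workspace_status == provisioning.WorkspaceStatus.RUNNING

def describe(workspace: provisioning.Workspace) -> str:
    """Summarize the workspace status and its human-readable status message."""
    return f"{workspace.workspace_name}: {workspace.workspace_status} ({workspace.workspace_status_message})"
```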