databricks.getCluster
Retrieves information about a databricks.Cluster using its id. The cluster id can be retrieved programmatically using the databricks.getClusters data source.
This data source can only be used with a workspace-level provider!
Example Usage
Retrieve attributes of each cluster in a workspace
import * as pulumi from "@pulumi/pulumi";
import * as databricks from "@pulumi/databricks";
const all = databricks.getClusters({});
const allGetCluster = all.then(all => all.ids.reduce((__obj, __value, __key) => ({ ...__obj, [__key]: databricks.getCluster({
    clusterId: __value,
}) }), {}));
import pulumi
import pulumi_databricks as databricks
all = databricks.get_clusters()
all_get_cluster = {__key: databricks.get_cluster(cluster_id=__value) for __key, __value in enumerate(all.ids)}
Example coming soon!
using System.Collections.Generic;
using System.Linq;
using Pulumi;
using Databricks = Pulumi.Databricks;
return await Deployment.RunAsync(() =>
{
var all = Databricks.GetClusters.Invoke();
    var allGetCluster = all.Apply(result => result.Ids.ToDictionary(
        id => id,
        id => Databricks.GetCluster.Invoke(new()
        {
            ClusterId = id,
        })));
});
Example coming soon!
Example coming soon!
Multiple clusters with the same name
When fetching a cluster whose name is not unique (including terminated but not permanently deleted clusters), you must use the cluster_id argument to uniquely identify the cluster. Combine this data source with databricks.getClusters to get the cluster_id of the cluster you want to fetch.
import * as pulumi from "@pulumi/pulumi";
import * as databricks from "@pulumi/databricks";
const myCluster = databricks.getClusters({
clusterNameContains: "my-cluster",
filterBy: {
clusterStates: ["RUNNING"],
},
});
const myClusterGetCluster = myCluster.then(myCluster => databricks.getCluster({
clusterId: myCluster.ids?.[0],
}));
import pulumi
import pulumi_databricks as databricks
my_cluster = databricks.get_clusters(cluster_name_contains="my-cluster",
filter_by={
"cluster_states": ["RUNNING"],
})
my_cluster_get_cluster = databricks.get_cluster(cluster_id=my_cluster.ids[0])
package main
import (
"github.com/pulumi/pulumi-databricks/sdk/go/databricks"
"github.com/pulumi/pulumi/sdk/v3/go/pulumi"
)
func main() {
pulumi.Run(func(ctx *pulumi.Context) error {
myCluster, err := databricks.GetClusters(ctx, &databricks.GetClustersArgs{
ClusterNameContains: pulumi.StringRef("my-cluster"),
FilterBy: databricks.GetClustersFilterBy{
ClusterStates: []string{
"RUNNING",
},
},
}, nil)
if err != nil {
return err
}
_, err = databricks.LookupCluster(ctx, &databricks.LookupClusterArgs{
ClusterId: pulumi.StringRef(myCluster.Ids[0]),
}, nil)
if err != nil {
return err
}
return nil
})
}
using System.Collections.Generic;
using System.Linq;
using Pulumi;
using Databricks = Pulumi.Databricks;
return await Deployment.RunAsync(() =>
{
var myCluster = Databricks.GetClusters.Invoke(new()
{
ClusterNameContains = "my-cluster",
FilterBy = new Databricks.Inputs.GetClustersFilterByInputArgs
{
ClusterStates = new[]
{
"RUNNING",
},
},
});
var myClusterGetCluster = Databricks.GetCluster.Invoke(new()
{
ClusterId = myCluster.Apply(getClustersResult => getClustersResult.Ids[0]),
});
});
package generated_program;
import com.pulumi.Context;
import com.pulumi.Pulumi;
import com.pulumi.core.Output;
import com.pulumi.databricks.DatabricksFunctions;
import com.pulumi.databricks.inputs.GetClustersArgs;
import com.pulumi.databricks.inputs.GetClustersFilterByArgs;
import com.pulumi.databricks.inputs.GetClusterArgs;
import java.util.List;
import java.util.ArrayList;
import java.util.Map;
import java.io.File;
import java.nio.file.Files;
import java.nio.file.Paths;
public class App {
public static void main(String[] args) {
Pulumi.run(App::stack);
}
public static void stack(Context ctx) {
final var myCluster = DatabricksFunctions.getClusters(GetClustersArgs.builder()
.clusterNameContains("my-cluster")
.filterBy(GetClustersFilterByArgs.builder()
.clusterStates("RUNNING")
.build())
.build());
final var myClusterGetCluster = DatabricksFunctions.getCluster(GetClusterArgs.builder()
    .clusterId(myCluster.applyValue(getClustersResult -> getClustersResult.ids().get(0)))
    .build());
}
}
variables:
myCluster:
fn::invoke:
function: databricks:getClusters
arguments:
clusterNameContains: my-cluster
filterBy:
clusterStates:
- RUNNING
myClusterGetCluster:
fn::invoke:
function: databricks:getCluster
arguments:
clusterId: ${myCluster.ids[0]}
Related Resources
The following resources are often used in the same context:
- End to end workspace management guide.
- databricks.Cluster to create Databricks Clusters.
- databricks.ClusterPolicy to create a databricks.Cluster policy, which limits the ability to create clusters based on a set of rules.
- databricks.InstancePool to manage instance pools to reduce cluster start and auto-scaling times by maintaining a set of idle, ready-to-use instances.
- databricks.Job to manage Databricks Jobs to run non-interactive code in a databricks_cluster.
- databricks.Library to install a library on databricks_cluster.
- databricks.Pipeline to deploy Lakeflow Declarative Pipelines.
Using getCluster
Two invocation forms are available. The direct form accepts plain arguments and either blocks until the result value is available, or returns a Promise-wrapped result. The output form accepts Input-wrapped arguments and returns an Output-wrapped result.
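For example, both forms in TypeScript look like this. This is a sketch that assumes a configured workspace-level Databricks provider; the cluster id shown is a placeholder:

```typescript
import * as pulumi from "@pulumi/pulumi";
import * as databricks from "@pulumi/databricks";

// Direct form: plain arguments, Promise-wrapped result.
// "0123-456789-abcdef0" is a placeholder cluster id.
const direct = databricks.getCluster({ clusterId: "0123-456789-abcdef0" });
export const clusterName = direct.then(c => c.clusterName);

// Output form: accepts Input-wrapped arguments and returns an Output;
// useful when the id comes from another resource or invoke.
const lifted = databricks.getClusterOutput({ clusterId: "0123-456789-abcdef0" });
export const clusterState = lifted.clusterInfo.apply(info => info?.state);
```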
function getCluster(args: GetClusterArgs, opts?: InvokeOptions): Promise<GetClusterResult>
function getClusterOutput(args: GetClusterOutputArgs, opts?: InvokeOptions): Output<GetClusterResult>

def get_cluster(cluster_id: Optional[str] = None,
cluster_info: Optional[GetClusterClusterInfo] = None,
cluster_name: Optional[str] = None,
id: Optional[str] = None,
provider_config: Optional[GetClusterProviderConfig] = None,
opts: Optional[InvokeOptions] = None) -> GetClusterResult
def get_cluster_output(cluster_id: Optional[pulumi.Input[str]] = None,
cluster_info: Optional[pulumi.Input[GetClusterClusterInfoArgs]] = None,
cluster_name: Optional[pulumi.Input[str]] = None,
id: Optional[pulumi.Input[str]] = None,
provider_config: Optional[pulumi.Input[GetClusterProviderConfigArgs]] = None,
                       opts: Optional[InvokeOptions] = None) -> Output[GetClusterResult]

func LookupCluster(ctx *Context, args *LookupClusterArgs, opts ...InvokeOption) (*LookupClusterResult, error)
func LookupClusterOutput(ctx *Context, args *LookupClusterOutputArgs, opts ...InvokeOption) LookupClusterResultOutput

> Note: This function is named LookupCluster in the Go SDK.
public static class GetCluster
{
public static Task<GetClusterResult> InvokeAsync(GetClusterArgs args, InvokeOptions? opts = null)
public static Output<GetClusterResult> Invoke(GetClusterInvokeArgs args, InvokeOptions? opts = null)
}

public static CompletableFuture<GetClusterResult> getCluster(GetClusterArgs args, InvokeOptions options)
public static Output<GetClusterResult> getCluster(GetClusterArgs args, InvokeOptions options)
fn::invoke:
function: databricks:index/getCluster:getCluster
arguments:
# arguments dictionary

The following arguments are supported:
- ClusterId string - The id of the cluster.
- ClusterInfo GetClusterClusterInfo - block, consisting of the following fields:
- ClusterName string - The exact name of the cluster to search. Can only be specified if there is exactly one cluster with the provided name.
- Id string - cluster ID
- ProviderConfig GetClusterProviderConfig - Configure the provider for management through the account provider. This block consists of the following fields:

- ClusterId string - The id of the cluster.
- ClusterInfo GetClusterClusterInfo - block, consisting of the following fields:
- ClusterName string - The exact name of the cluster to search. Can only be specified if there is exactly one cluster with the provided name.
- Id string - cluster ID
- ProviderConfig GetClusterProviderConfig - Configure the provider for management through the account provider. This block consists of the following fields:

- clusterId String - The id of the cluster.
- clusterInfo GetClusterClusterInfo - block, consisting of the following fields:
- clusterName String - The exact name of the cluster to search. Can only be specified if there is exactly one cluster with the provided name.
- id String - cluster ID
- providerConfig GetClusterProviderConfig - Configure the provider for management through the account provider. This block consists of the following fields:

- clusterId string - The id of the cluster.
- clusterInfo GetClusterClusterInfo - block, consisting of the following fields:
- clusterName string - The exact name of the cluster to search. Can only be specified if there is exactly one cluster with the provided name.
- id string - cluster ID
- providerConfig GetClusterProviderConfig - Configure the provider for management through the account provider. This block consists of the following fields:

- cluster_id str - The id of the cluster.
- cluster_info GetClusterClusterInfo - block, consisting of the following fields:
- cluster_name str - The exact name of the cluster to search. Can only be specified if there is exactly one cluster with the provided name.
- id str - cluster ID
- provider_config GetClusterProviderConfig - Configure the provider for management through the account provider. This block consists of the following fields:

- clusterId String - The id of the cluster.
- clusterInfo Property Map - block, consisting of the following fields:
- clusterName String - The exact name of the cluster to search. Can only be specified if there is exactly one cluster with the provided name.
- id String - cluster ID
- providerConfig Property Map - Configure the provider for management through the account provider. This block consists of the following fields:
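As an illustration of the cluster_name argument in TypeScript, a sketch assuming a configured workspace provider and exactly one cluster with the placeholder name "shared-autoscaling":

```typescript
import * as databricks from "@pulumi/databricks";

// The lookup succeeds only when exactly one cluster (including
// terminated, not-yet-deleted ones) matches the given name;
// otherwise use clusterId to disambiguate.
const shared = databricks.getCluster({ clusterName: "shared-autoscaling" });
export const sharedClusterId = shared.then(c => c.clusterId);
```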
getCluster Result
The following output properties are available:
- ClusterId string
- ClusterInfo GetClusterClusterInfo - block, consisting of the following fields:
- ClusterName string - Cluster name, which doesn’t have to be unique.
- Id string - cluster ID
- ProviderConfig GetClusterProviderConfig

- ClusterId string
- ClusterInfo GetClusterClusterInfo - block, consisting of the following fields:
- ClusterName string - Cluster name, which doesn’t have to be unique.
- Id string - cluster ID
- ProviderConfig GetClusterProviderConfig

- clusterId String
- clusterInfo GetClusterClusterInfo - block, consisting of the following fields:
- clusterName String - Cluster name, which doesn’t have to be unique.
- id String - cluster ID
- providerConfig GetClusterProviderConfig

- clusterId string
- clusterInfo GetClusterClusterInfo - block, consisting of the following fields:
- clusterName string - Cluster name, which doesn’t have to be unique.
- id string - cluster ID
- providerConfig GetClusterProviderConfig

- cluster_id str
- cluster_info GetClusterClusterInfo - block, consisting of the following fields:
- cluster_name str - Cluster name, which doesn’t have to be unique.
- id str - cluster ID
- provider_config GetClusterProviderConfig

- clusterId String
- clusterInfo Property Map - block, consisting of the following fields:
- clusterName String - Cluster name, which doesn’t have to be unique.
- id String - cluster ID
- providerConfig Property Map
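A sketch of consuming a few of these outputs in TypeScript (placeholder cluster id; assumes a configured workspace provider):

```typescript
import * as databricks from "@pulumi/databricks";

const c = databricks.getClusterOutput({ clusterId: "0123-456789-abcdef0" });

// clusterInfo carries the runtime details of the cluster.
export const sparkVersion = c.clusterInfo.apply(info => info?.sparkVersion);
export const nodeTypeId = c.clusterInfo.apply(info => info?.nodeTypeId);
export const state = c.clusterInfo.apply(info => info?.state);
```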
Supporting Types
GetClusterClusterInfo
- Autoscale GetClusterClusterInfoAutoscale
- AutoterminationMinutes int - Automatically terminate the cluster after being inactive for this time in minutes. If specified, the threshold must be between 10 and 10000 minutes. You can also set this value to 0 to explicitly disable automatic termination.
- AwsAttributes GetClusterClusterInfoAwsAttributes
- AzureAttributes GetClusterClusterInfoAzureAttributes
- ClusterCores double
- ClusterId string - The id of the cluster.
- ClusterLogConf GetClusterClusterInfoClusterLogConf
- ClusterLogStatus GetClusterClusterInfoClusterLogStatus
- ClusterMemoryMb int
- ClusterName string - The exact name of the cluster to search. Can only be specified if there is exactly one cluster with the provided name.
- ClusterSource string
- CreatorUserName string
- CustomTags Dictionary<string, string> - Additional tags for cluster resources.
- DataSecurityMode string - Security features of the cluster. Unity Catalog requires SINGLE_USER or USER_ISOLATION mode. LEGACY_PASSTHROUGH for passthrough clusters and LEGACY_TABLE_ACL for Table ACL clusters. Defaults to NONE, i.e. no security feature enabled.
- DefaultTags Dictionary<string, string>
- DockerImage GetClusterClusterInfoDockerImage
- Driver GetClusterClusterInfoDriver
- DriverInstancePoolId string - Similar to instance_pool_id, but for the driver node.
- DriverNodeTypeId string - The node type of the Spark driver.
- EnableElasticDisk bool - Use autoscaling local storage.
- EnableLocalDiskEncryption bool - Enable local disk encryption.
- Executors List<GetClusterClusterInfoExecutor>
- GcpAttributes GetClusterClusterInfoGcpAttributes
- InitScripts List<GetClusterClusterInfoInitScript>
- InstancePoolId string - The pool of idle instances the cluster is attached to.
- IsSingleNode bool
- JdbcPort int
- Kind string
- LastRestartedTime int
- LastStateLossTime int
- NodeTypeId string - Any supported databricks.getNodeType id.
- NumWorkers int
- PolicyId string - Identifier of Cluster Policy to validate cluster and preset certain defaults.
- RemoteDiskThroughput int
- RuntimeEngine string - The type of runtime engine of the cluster.
- SingleUserName string - The optional user name of the user to assign to an interactive cluster. This field is required when using standard AAD Passthrough for Azure Data Lake Storage (ADLS) with a single-user cluster (i.e., not high-concurrency clusters).
- SparkConf Dictionary<string, string> - Map with key-value pairs to fine-tune Spark clusters.
- SparkContextId int
- SparkEnvVars Dictionary<string, string> - Map with environment variable key-value pairs to fine-tune Spark clusters. Key-value pairs of the form (X,Y) are exported (i.e., X='Y') while launching the driver and workers.
- SparkVersion string - Runtime version of the cluster.
- Spec GetClusterClusterInfoSpec
- SshPublicKeys List<string> - SSH public key contents that will be added to each Spark node in this cluster.
- StartTime int
- State string
- StateMessage string
- TerminatedTime int
- TerminationReason GetClusterClusterInfoTerminationReason
- TotalInitialRemoteDiskSize int
- UseMlRuntime bool
- WorkloadType GetClusterClusterInfoWorkloadType

- Autoscale GetClusterClusterInfoAutoscale
- AutoterminationMinutes int - Automatically terminate the cluster after being inactive for this time in minutes. If specified, the threshold must be between 10 and 10000 minutes. You can also set this value to 0 to explicitly disable automatic termination.
- AwsAttributes GetClusterClusterInfoAwsAttributes
- AzureAttributes GetClusterClusterInfoAzureAttributes
- ClusterCores float64
- ClusterId string - The id of the cluster.
- ClusterLogConf GetClusterClusterInfoClusterLogConf
- ClusterLogStatus GetClusterClusterInfoClusterLogStatus
- ClusterMemoryMb int
- ClusterName string - The exact name of the cluster to search. Can only be specified if there is exactly one cluster with the provided name.
- ClusterSource string
- CreatorUserName string
- CustomTags map[string]string - Additional tags for cluster resources.
- DataSecurityMode string - Security features of the cluster. Unity Catalog requires SINGLE_USER or USER_ISOLATION mode. LEGACY_PASSTHROUGH for passthrough clusters and LEGACY_TABLE_ACL for Table ACL clusters. Defaults to NONE, i.e. no security feature enabled.
- DefaultTags map[string]string
- DockerImage GetClusterClusterInfoDockerImage
- Driver GetClusterClusterInfoDriver
- DriverInstancePoolId string - Similar to instance_pool_id, but for the driver node.
- DriverNodeTypeId string - The node type of the Spark driver.
- EnableElasticDisk bool - Use autoscaling local storage.
- EnableLocalDiskEncryption bool - Enable local disk encryption.
- Executors []GetClusterClusterInfoExecutor
- GcpAttributes GetClusterClusterInfoGcpAttributes
- InitScripts []GetClusterClusterInfoInitScript
- InstancePoolId string - The pool of idle instances the cluster is attached to.
- IsSingleNode bool
- JdbcPort int
- Kind string
- LastRestartedTime int
- LastStateLossTime int
- NodeTypeId string - Any supported databricks.getNodeType id.
- NumWorkers int
- PolicyId string - Identifier of Cluster Policy to validate cluster and preset certain defaults.
- RemoteDiskThroughput int
- RuntimeEngine string - The type of runtime engine of the cluster.
- SingleUserName string - The optional user name of the user to assign to an interactive cluster. This field is required when using standard AAD Passthrough for Azure Data Lake Storage (ADLS) with a single-user cluster (i.e., not high-concurrency clusters).
- SparkConf map[string]string - Map with key-value pairs to fine-tune Spark clusters.
- SparkContextId int
- SparkEnvVars map[string]string - Map with environment variable key-value pairs to fine-tune Spark clusters. Key-value pairs of the form (X,Y) are exported (i.e., X='Y') while launching the driver and workers.
- SparkVersion string - Runtime version of the cluster.
- Spec GetClusterClusterInfoSpec
- SshPublicKeys []string - SSH public key contents that will be added to each Spark node in this cluster.
- StartTime int
- State string
- StateMessage string
- TerminatedTime int
- TerminationReason GetClusterClusterInfoTerminationReason
- TotalInitialRemoteDiskSize int
- UseMlRuntime bool
- WorkloadType GetClusterClusterInfoWorkloadType

- autoscale GetClusterClusterInfoAutoscale
- autoterminationMinutes Integer - Automatically terminate the cluster after being inactive for this time in minutes. If specified, the threshold must be between 10 and 10000 minutes. You can also set this value to 0 to explicitly disable automatic termination.
- awsAttributes GetClusterClusterInfoAwsAttributes
- azureAttributes GetClusterClusterInfoAzureAttributes
- clusterCores Double
- clusterId String - The id of the cluster.
- clusterLogConf GetClusterClusterInfoClusterLogConf
- clusterLogStatus GetClusterClusterInfoClusterLogStatus
- clusterMemoryMb Integer
- clusterName String - The exact name of the cluster to search. Can only be specified if there is exactly one cluster with the provided name.
- clusterSource String
- creatorUserName String
- customTags Map<String,String> - Additional tags for cluster resources.
- dataSecurityMode String - Security features of the cluster. Unity Catalog requires SINGLE_USER or USER_ISOLATION mode. LEGACY_PASSTHROUGH for passthrough clusters and LEGACY_TABLE_ACL for Table ACL clusters. Defaults to NONE, i.e. no security feature enabled.
- defaultTags Map<String,String>
- dockerImage GetClusterClusterInfoDockerImage
- driver GetClusterClusterInfoDriver
- driverInstancePoolId String - Similar to instance_pool_id, but for the driver node.
- driverNodeTypeId String - The node type of the Spark driver.
- enableElasticDisk Boolean - Use autoscaling local storage.
- enableLocalDiskEncryption Boolean - Enable local disk encryption.
- executors List<GetClusterClusterInfoExecutor>
- gcpAttributes GetClusterClusterInfoGcpAttributes
- initScripts List<GetClusterClusterInfoInitScript>
- instancePoolId String - The pool of idle instances the cluster is attached to.
- isSingleNode Boolean
- jdbcPort Integer
- kind String
- lastRestartedTime Integer
- lastStateLossTime Integer
- nodeTypeId String - Any supported databricks.getNodeType id.
- numWorkers Integer
- policyId String - Identifier of Cluster Policy to validate cluster and preset certain defaults.
- remoteDiskThroughput Integer
- runtimeEngine String - The type of runtime engine of the cluster.
- singleUserName String - The optional user name of the user to assign to an interactive cluster. This field is required when using standard AAD Passthrough for Azure Data Lake Storage (ADLS) with a single-user cluster (i.e., not high-concurrency clusters).
- sparkConf Map<String,String> - Map with key-value pairs to fine-tune Spark clusters.
- sparkContextId Integer
- sparkEnvVars Map<String,String> - Map with environment variable key-value pairs to fine-tune Spark clusters. Key-value pairs of the form (X,Y) are exported (i.e., X='Y') while launching the driver and workers.
- sparkVersion String - Runtime version of the cluster.
- spec GetClusterClusterInfoSpec
- sshPublicKeys List<String> - SSH public key contents that will be added to each Spark node in this cluster.
- startTime Integer
- state String
- stateMessage String
- terminatedTime Integer
- terminationReason GetClusterClusterInfoTerminationReason
- totalInitialRemoteDiskSize Integer
- useMlRuntime Boolean
- workloadType GetClusterClusterInfoWorkloadType

- autoscale GetClusterClusterInfoAutoscale
- autoterminationMinutes number - Automatically terminate the cluster after being inactive for this time in minutes. If specified, the threshold must be between 10 and 10000 minutes. You can also set this value to 0 to explicitly disable automatic termination.
- awsAttributes GetClusterClusterInfoAwsAttributes
- azureAttributes GetClusterClusterInfoAzureAttributes
- clusterCores number
- clusterId string - The id of the cluster.
- clusterLogConf GetClusterClusterInfoClusterLogConf
- clusterLogStatus GetClusterClusterInfoClusterLogStatus
- clusterMemoryMb number
- clusterName string - The exact name of the cluster to search. Can only be specified if there is exactly one cluster with the provided name.
- clusterSource string
- creatorUserName string
- customTags {[key: string]: string} - Additional tags for cluster resources.
- dataSecurityMode string - Security features of the cluster. Unity Catalog requires SINGLE_USER or USER_ISOLATION mode. LEGACY_PASSTHROUGH for passthrough clusters and LEGACY_TABLE_ACL for Table ACL clusters. Defaults to NONE, i.e. no security feature enabled.
- defaultTags {[key: string]: string}
- dockerImage GetClusterClusterInfoDockerImage
- driver GetClusterClusterInfoDriver
- driverInstancePoolId string - Similar to instance_pool_id, but for the driver node.
- driverNodeTypeId string - The node type of the Spark driver.
- enableElasticDisk boolean - Use autoscaling local storage.
- enableLocalDiskEncryption boolean - Enable local disk encryption.
- executors GetClusterClusterInfoExecutor[]
- gcpAttributes GetClusterClusterInfoGcpAttributes
- initScripts GetClusterClusterInfoInitScript[]
- instancePoolId string - The pool of idle instances the cluster is attached to.
- isSingleNode boolean
- jdbcPort number
- kind string
- lastRestartedTime number
- lastStateLossTime number
- nodeTypeId string - Any supported databricks.getNodeType id.
- numWorkers number
- policyId string - Identifier of Cluster Policy to validate cluster and preset certain defaults.
- remoteDiskThroughput number
- runtimeEngine string - The type of runtime engine of the cluster.
- singleUserName string - The optional user name of the user to assign to an interactive cluster. This field is required when using standard AAD Passthrough for Azure Data Lake Storage (ADLS) with a single-user cluster (i.e., not high-concurrency clusters).
- sparkConf {[key: string]: string} - Map with key-value pairs to fine-tune Spark clusters.
- sparkContextId number
- sparkEnvVars {[key: string]: string} - Map with environment variable key-value pairs to fine-tune Spark clusters. Key-value pairs of the form (X,Y) are exported (i.e., X='Y') while launching the driver and workers.
- sparkVersion string - Runtime version of the cluster.
- spec GetClusterClusterInfoSpec
- sshPublicKeys string[] - SSH public key contents that will be added to each Spark node in this cluster.
- startTime number
- state string
- stateMessage string
- terminatedTime number
- terminationReason GetClusterClusterInfoTerminationReason
- totalInitialRemoteDiskSize number
- useMlRuntime boolean
- workloadType GetClusterClusterInfoWorkloadType

- autoscale GetClusterClusterInfoAutoscale
- autotermination_minutes int - Automatically terminate the cluster after being inactive for this time in minutes. If specified, the threshold must be between 10 and 10000 minutes. You can also set this value to 0 to explicitly disable automatic termination.
- aws_attributes GetClusterClusterInfoAwsAttributes
- azure_attributes GetClusterClusterInfoAzureAttributes
- cluster_cores float
- cluster_id str - The id of the cluster.
- cluster_log_conf GetClusterClusterInfoClusterLogConf
- cluster_log_status GetClusterClusterInfoClusterLogStatus
- cluster_memory_mb int
- cluster_name str - The exact name of the cluster to search. Can only be specified if there is exactly one cluster with the provided name.
- cluster_source str
- creator_user_name str
- custom_tags Mapping[str, str] - Additional tags for cluster resources.
- data_security_mode str - Security features of the cluster. Unity Catalog requires SINGLE_USER or USER_ISOLATION mode. LEGACY_PASSTHROUGH for passthrough clusters and LEGACY_TABLE_ACL for Table ACL clusters. Defaults to NONE, i.e. no security feature enabled.
- default_tags Mapping[str, str]
- docker_image GetClusterClusterInfoDockerImage
- driver GetClusterClusterInfoDriver
- driver_instance_pool_id str - Similar to instance_pool_id, but for the driver node.
- driver_node_type_id str - The node type of the Spark driver.
- enable_elastic_disk bool - Use autoscaling local storage.
- enable_local_disk_encryption bool - Enable local disk encryption.
- executors Sequence[GetClusterClusterInfoExecutor]
- gcp_attributes GetClusterClusterInfoGcpAttributes
- init_scripts Sequence[GetClusterClusterInfoInitScript]
- instance_pool_id str - The pool of idle instances the cluster is attached to.
- is_single_node bool
- jdbc_port int
- kind str
- last_restarted_time int
- last_state_loss_time int
- node_type_id str - Any supported databricks.getNodeType id.
- num_workers int
- policy_id str - Identifier of Cluster Policy to validate cluster and preset certain defaults.
- remote_disk_throughput int
- runtime_engine str - The type of runtime engine of the cluster.
- single_user_name str - The optional user name of the user to assign to an interactive cluster. This field is required when using standard AAD Passthrough for Azure Data Lake Storage (ADLS) with a single-user cluster (i.e., not high-concurrency clusters).
- spark_conf Mapping[str, str] - Map with key-value pairs to fine-tune Spark clusters.
- spark_context_id int
- spark_env_vars Mapping[str, str] - Map with environment variable key-value pairs to fine-tune Spark clusters. Key-value pairs of the form (X,Y) are exported (i.e., X='Y') while launching the driver and workers.
- spark_version str - Runtime version of the cluster.
- spec GetClusterClusterInfoSpec
- ssh_public_keys Sequence[str] - SSH public key contents that will be added to each Spark node in this cluster.
- start_time int
- state str
- state_message str
- terminated_time int
- termination_reason GetClusterClusterInfoTerminationReason
- total_initial_remote_disk_size int
- use_ml_runtime bool
- workload_type GetClusterClusterInfoWorkloadType

- autoscale Property Map
- autoterminationMinutes Number - Automatically terminate the cluster after being inactive for this time in minutes. If specified, the threshold must be between 10 and 10000 minutes. You can also set this value to 0 to explicitly disable automatic termination.
- awsAttributes Property Map
- azureAttributes Property Map
- clusterCores Number
- clusterId String - The id of the cluster.
- clusterLogConf Property Map
- clusterLogStatus Property Map
- clusterMemoryMb Number
- clusterName String - The exact name of the cluster to search. Can only be specified if there is exactly one cluster with the provided name.
- clusterSource String
- creatorUserName String
- customTags Map<String> - Additional tags for cluster resources.
- dataSecurityMode String - Security features of the cluster. Unity Catalog requires SINGLE_USER or USER_ISOLATION mode. LEGACY_PASSTHROUGH for passthrough clusters and LEGACY_TABLE_ACL for Table ACL clusters. Defaults to NONE, i.e. no security feature enabled.
- defaultTags Map<String>
- dockerImage Property Map
- driver Property Map
- driverInstancePoolId String - Similar to instance_pool_id, but for the driver node.
- driverNodeTypeId String - The node type of the Spark driver.
- enableElasticDisk Boolean - Use autoscaling local storage.
- enableLocalDiskEncryption Boolean - Enable local disk encryption.
- executors List<Property Map>
- gcpAttributes Property Map
- initScripts List<Property Map>
- instancePoolId String - The pool of idle instances the cluster is attached to.
- isSingleNode Boolean
- jdbcPort Number
- kind String
- lastRestartedTime Number
- lastStateLossTime Number
- nodeTypeId String - Any supported databricks.getNodeType id.
- numWorkers Number
- policyId String - Identifier of Cluster Policy to validate cluster and preset certain defaults.
- remoteDiskThroughput Number
- runtimeEngine String - The type of runtime engine of the cluster.
- singleUserName String - The optional user name of the user to assign to an interactive cluster. This field is required when using standard AAD Passthrough for Azure Data Lake Storage (ADLS) with a single-user cluster (i.e., not high-concurrency clusters).
- sparkConf Map<String> - Map with key-value pairs to fine-tune Spark clusters.
- sparkContextId Number
- sparkEnvVars Map<String> - Map with environment variable key-value pairs to fine-tune Spark clusters. Key-value pairs of the form (X,Y) are exported (i.e., X='Y') while launching the driver and workers.
- sparkVersion String - Runtime version of the cluster.
- spec Property Map
- sshPublicKeys List<String> - SSH public key contents that will be added to each Spark node in this cluster.
- startTime Number
- state String
- stateMessage String
- terminatedTime Number
- terminationReason Property Map
- totalInitialRemoteDiskSize Number
- useMlRuntime Boolean
- workloadType Property Map
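Since num_workers is populated for fixed-size clusters while autoscale carries the min/max bounds otherwise, sizing logic typically branches on which field is present. A minimal, self-contained sketch of that branching; the SizeInfo shape and describeSize helper below are hypothetical illustrations mirroring the relevant fields, not provider types:

```typescript
// Hypothetical plain shape mirroring the relevant GetClusterClusterInfo fields.
interface SizeInfo {
    numWorkers?: number;
    autoscale?: { minWorkers?: number; maxWorkers?: number };
}

// Describe cluster sizing: autoscale bounds when present, fixed size otherwise.
function describeSize(info: SizeInfo): string {
    if (info.autoscale) {
        return `autoscaling ${info.autoscale.minWorkers ?? 0}-${info.autoscale.maxWorkers ?? 0} workers`;
    }
    return `fixed ${info.numWorkers ?? 0} workers`;
}

console.log(describeSize({ numWorkers: 4 }));
console.log(describeSize({ autoscale: { minWorkers: 1, maxWorkers: 8 } }));
```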
GetClusterClusterInfoAutoscale
- MaxWorkers int
- MinWorkers int

- MaxWorkers int
- MinWorkers int

- maxWorkers Integer
- minWorkers Integer

- maxWorkers number
- minWorkers number

- max_workers int
- min_workers int

- maxWorkers Number
- minWorkers Number
GetClusterClusterInfoAwsAttributes
- Availability string
- EbsVolumeCount int
- EbsVolumeIops int
- EbsVolumeSize int
- EbsVolumeThroughput int
- EbsVolumeType string
- FirstOnDemand int
- InstanceProfileArn string
- SpotBidPricePercent int
- ZoneId string

- Availability string
- EbsVolumeCount int
- EbsVolumeIops int
- EbsVolumeSize int
- EbsVolumeThroughput int
- EbsVolumeType string
- FirstOnDemand int
- InstanceProfileArn string
- SpotBidPricePercent int
- ZoneId string

- availability String
- ebsVolumeCount Integer
- ebsVolumeIops Integer
- ebsVolumeSize Integer
- ebsVolumeThroughput Integer
- ebsVolumeType String
- firstOnDemand Integer
- instanceProfileArn String
- spotBidPricePercent Integer
- zoneId String

- availability string
- ebsVolumeCount number
- ebsVolumeIops number
- ebsVolumeSize number
- ebsVolumeThroughput number
- ebsVolumeType string
- firstOnDemand number
- instanceProfileArn string
- spotBidPricePercent number
- zoneId string

- availability str
- ebs_volume_count int
- ebs_volume_iops int
- ebs_volume_size int
- ebs_volume_throughput int
- ebs_volume_type str
- first_on_demand int
- instance_profile_arn str
- spot_bid_price_percent int
- zone_id str

- availability String
- ebsVolumeCount Number
- ebsVolumeIops Number
- ebsVolumeSize Number
- ebsVolumeThroughput Number
- ebsVolumeType String
- firstOnDemand Number
- instanceProfileArn String
- spotBidPricePercent Number
- zoneId String
GetClusterClusterInfoAzureAttributes
- availability String
- firstOnDemand Number
- logAnalyticsInfo Property Map
- spotBidMaxPrice Number
GetClusterClusterInfoAzureAttributesLogAnalyticsInfo
- logAnalyticsPrimaryKey string
- logAnalyticsWorkspaceId string
GetClusterClusterInfoClusterLogConf
GetClusterClusterInfoClusterLogConfDbfs
- destination string
GetClusterClusterInfoClusterLogConfS3
- destination string
- cannedAcl string
- enableEncryption bool
- encryptionType string
- endpoint string
- kmsKey string
- region string
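The S3 log-conf block above pairs a destination URI with optional encryption settings. A hedged sketch of client-side validation (the field names mirror the attributes above; the specific checks are assumptions for illustration):

```python
def validate_s3_log_conf(conf: dict) -> str:
    """Return the destination if the block is coherent, else raise ValueError."""
    dest = conf.get("destination", "")
    if not dest.startswith("s3://"):
        raise ValueError(f"s3 log destination must start with s3://, got {dest!r}")
    # If encryption is requested, an encryption type should accompany it.
    if conf.get("enable_encryption") and not conf.get("encryption_type"):
        raise ValueError("enable_encryption is set but encryption_type is missing")
    return dest

validate_s3_log_conf({"destination": "s3://logs/cluster", "region": "us-east-1"})
```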
GetClusterClusterInfoClusterLogConfVolumes
- destination string
GetClusterClusterInfoClusterLogStatus
- lastAttempted int
- lastException string
GetClusterClusterInfoDockerImage
- basicAuth GetClusterClusterInfoDockerImageBasicAuth
- url string
GetClusterClusterInfoDockerImageBasicAuth
GetClusterClusterInfoDriver
- hostPrivateIp string
- instanceId string
- nodeAwsAttributes GetClusterClusterInfoDriverNodeAwsAttributes
- nodeId string
- privateIp string
- publicDns string
- startTimestamp int
GetClusterClusterInfoDriverNodeAwsAttributes
- isSpot bool
GetClusterClusterInfoExecutor
- hostPrivateIp string
- instanceId string
- nodeAwsAttributes GetClusterClusterInfoExecutorNodeAwsAttributes
- nodeId string
- privateIp string
- publicDns string
- startTimestamp int
GetClusterClusterInfoExecutorNodeAwsAttributes
- isSpot bool
GetClusterClusterInfoGcpAttributes
- availability string
- bootDiskSize int
- firstOnDemand int
- googleServiceAccount string
- localSsdCount int
- usePreemptibleExecutors bool
- zoneId string
GetClusterClusterInfoInitScript
GetClusterClusterInfoInitScriptAbfss
- destination string
GetClusterClusterInfoInitScriptDbfs
- destination string
GetClusterClusterInfoInitScriptFile
- destination string
GetClusterClusterInfoInitScriptGcs
- destination string
GetClusterClusterInfoInitScriptS3
- destination string
- cannedAcl string
- enableEncryption bool
- encryptionType string
- endpoint string
- kmsKey string
- region string
GetClusterClusterInfoInitScriptVolumes
- destination string
GetClusterClusterInfoInitScriptWorkspace
- destination string
GetClusterClusterInfoSpec
- clusterId string - The id of the cluster.
- driverInstancePoolId string - Similar to instance_pool_id, but for the driver node.
- driverNodeTypeId string - The node type of the Spark driver.
- enableElasticDisk bool - Use autoscaling local storage.
- enableLocalDiskEncryption bool - Enable local disk encryption.
- nodeTypeId string - Any supported databricks.getNodeType id.
- applyPolicyDefaultValues bool
- autoscale GetClusterClusterInfoSpecAutoscale
- awsAttributes GetClusterClusterInfoSpecAwsAttributes
- azureAttributes GetClusterClusterInfoSpecAzureAttributes
- clusterLogConf GetClusterClusterInfoSpecClusterLogConf
- clusterMountInfos List<GetClusterClusterInfoSpecClusterMountInfo>
- clusterName string - The exact name of the cluster to search. Can only be specified if there is exactly one cluster with the provided name.
- customTags Map<string, string> - Additional tags for cluster resources.
- dataSecurityMode string - Security features of the cluster. Unity Catalog requires SINGLE_USER or USER_ISOLATION mode. LEGACY_PASSTHROUGH for passthrough clusters and LEGACY_TABLE_ACL for Table ACL clusters. Defaults to NONE, i.e. no security feature enabled.
- dockerImage GetClusterClusterInfoSpecDockerImage
- gcpAttributes GetClusterClusterInfoSpecGcpAttributes
- idempotencyToken string - An optional token to guarantee the idempotency of cluster creation requests.
- initScripts List<GetClusterClusterInfoSpecInitScript>
- instancePoolId string - The pool of idle instances the cluster is attached to.
- isSingleNode bool
- kind string
- libraries List<GetClusterClusterInfoSpecLibrary>
- numWorkers int
- policyId string - Identifier of the Cluster Policy used to validate the cluster and preset certain defaults.
- providerConfig GetClusterClusterInfoSpecProviderConfig - Configure the provider for management through the account provider. This block consists of the following fields:
- remoteDiskThroughput int
- runtimeEngine string - The type of runtime of the cluster.
- singleUserName string - The optional user name of the user to assign to an interactive cluster. This field is required when using standard AAD Passthrough for Azure Data Lake Storage (ADLS) with a single-user cluster (i.e., not high-concurrency clusters).
- sparkConf Map<string, string> - Map with key-value pairs to fine-tune Spark clusters.
- sparkEnvVars Map<string, string> - Map with environment variable key-value pairs to fine-tune Spark clusters. Key-value pairs of the form (X,Y) are exported (i.e., X='Y') while launching the driver and workers.
- sparkVersion string - Runtime version of the cluster.
- sshPublicKeys List<string> - SSH public key contents that will be added to each Spark node in this cluster.
- totalInitialRemoteDiskSize int
- useMlRuntime bool
- workloadType GetClusterClusterInfoSpecWorkloadType
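The spec documents sparkEnvVars as key-value pairs exported as X='Y' when the driver and workers launch. A minimal sketch of that rendering (plain Python, no SDK types assumed):

```python
def render_spark_env_vars(env_vars: dict) -> list[str]:
    """Render a spark_env_vars mapping as X='Y' export-style lines."""
    return [f"{key}='{value}'" for key, value in sorted(env_vars.items())]

render_spark_env_vars({"PYSPARK_PYTHON": "/databricks/python3/bin/python3"})
# → ["PYSPARK_PYTHON='/databricks/python3/bin/python3'"]
```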
GetClusterClusterInfoSpecAutoscale
- maxWorkers int
- minWorkers int
GetClusterClusterInfoSpecAwsAttributes
- availability string
- ebsVolumeCount int
- ebsVolumeIops int
- ebsVolumeSize int
- ebsVolumeThroughput int
- ebsVolumeType string
- firstOnDemand int
- instanceProfileArn string
- spotBidPricePercent int
- zoneId string
GetClusterClusterInfoSpecAzureAttributes
- availability string
- firstOnDemand int
- logAnalyticsInfo GetClusterClusterInfoSpecAzureAttributesLogAnalyticsInfo
- spotBidMaxPrice number
GetClusterClusterInfoSpecAzureAttributesLogAnalyticsInfo
- logAnalyticsPrimaryKey string
- logAnalyticsWorkspaceId string
GetClusterClusterInfoSpecClusterLogConf
GetClusterClusterInfoSpecClusterLogConfDbfs
- destination string
GetClusterClusterInfoSpecClusterLogConfS3
- destination string
- cannedAcl string
- enableEncryption bool
- encryptionType string
- endpoint string
- kmsKey string
- region string
GetClusterClusterInfoSpecClusterLogConfVolumes
- destination string
GetClusterClusterInfoSpecClusterMountInfo
GetClusterClusterInfoSpecClusterMountInfoNetworkFilesystemInfo
- serverAddress string
- mountOptions string
GetClusterClusterInfoSpecDockerImage
- url string
- basicAuth GetClusterClusterInfoSpecDockerImageBasicAuth
GetClusterClusterInfoSpecDockerImageBasicAuth
GetClusterClusterInfoSpecGcpAttributes
- availability string
- bootDiskSize int
- firstOnDemand int
- googleServiceAccount string
- localSsdCount int
- usePreemptibleExecutors bool
- zoneId string
GetClusterClusterInfoSpecInitScript
- abfss GetClusterClusterInfoSpecInitScriptAbfss
- dbfs GetClusterClusterInfoSpecInitScriptDbfs
- file GetClusterClusterInfoSpecInitScriptFile
- gcs GetClusterClusterInfoSpecInitScriptGcs
- s3 GetClusterClusterInfoSpecInitScriptS3
- volumes GetClusterClusterInfoSpecInitScriptVolumes
- workspace GetClusterClusterInfoSpecInitScriptWorkspace
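An init-script entry selects one of the storage blocks above, each wrapping a destination. As a hypothetical helper (the prefix-to-block mapping is an assumption for illustration, not part of the provider's API), the block can be inferred from the destination's URI scheme:

```python
def init_script_block(destination: str) -> dict:
    """Wrap a destination in the init-script block matching its URI scheme."""
    prefixes = {
        "abfss://": "abfss",
        "dbfs:/": "dbfs",
        "file:/": "file",
        "gs://": "gcs",
        "s3://": "s3",
        "/Volumes/": "volumes",
        "/Workspace/": "workspace",
    }
    for prefix, block in prefixes.items():
        if destination.startswith(prefix):
            return {block: {"destination": destination}}
    raise ValueError(f"unrecognized init script destination: {destination!r}")

init_script_block("s3://bucket/init.sh")  # → {"s3": {"destination": "s3://bucket/init.sh"}}
```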
GetClusterClusterInfoSpecInitScriptAbfss
- destination string
GetClusterClusterInfoSpecInitScriptDbfs
- destination string
GetClusterClusterInfoSpecInitScriptFile
- destination string
GetClusterClusterInfoSpecInitScriptGcs
- destination string
GetClusterClusterInfoSpecInitScriptS3
- Destination string
- CannedAcl string
- EnableEncryption bool
- EncryptionType string
- Endpoint string
- KmsKey string
- Region string
- Destination string
- CannedAcl string
- EnableEncryption bool
- EncryptionType string
- Endpoint string
- KmsKey string
- Region string
- destination String
- cannedAcl String
- enableEncryption Boolean
- encryptionType String
- endpoint String
- kmsKey String
- region String
- destination string
- cannedAcl string
- enableEncryption boolean
- encryptionType string
- endpoint string
- kmsKey string
- region string
- destination str
- canned_acl str
- enable_encryption bool
- encryption_type str
- endpoint str
- kms_key str
- region str
- destination String
- cannedAcl String
- enableEncryption Boolean
- encryptionType String
- endpoint String
- kmsKey String
- region String
GetClusterClusterInfoSpecInitScriptVolumes
- Destination string
- Destination string
- destination String
- destination string
- destination str
- destination String
GetClusterClusterInfoSpecInitScriptWorkspace
- Destination string
- Destination string
- destination String
- destination string
- destination str
- destination String
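For orientation, each init-script entry returned by this data source carries exactly one of the destination blocks documented above (abfss, dbfs, file, gcs, s3, volumes, or workspace), and every block exposes a destination field. A minimal sketch, using plain Python dicts in place of the SDK result objects; the `init_script_destinations` helper and the sample values are illustrative, not part of the SDK:

```python
# Sketch: collect every init-script destination from a cluster-info-like dict.
# The dict shape mirrors the init_script types documented above; the helper
# function is hypothetical, not a pulumi_databricks API.

def init_script_destinations(cluster_info: dict) -> list[str]:
    """Return the destination URI of each init_scripts entry."""
    destinations = []
    for script in cluster_info.get("init_scripts", []):
        # Each entry holds exactly one storage block; all of them
        # carry a `destination` field.
        for storage in ("abfss", "dbfs", "file", "gcs", "s3",
                        "volumes", "workspace"):
            block = script.get(storage)
            if block:
                destinations.append(block["destination"])
    return destinations

# Hypothetical sample data shaped like the documented types.
cluster_info = {
    "init_scripts": [
        {"s3": {"destination": "s3://bucket/init.sh", "region": "us-east-1"}},
        {"volumes": {"destination": "/Volumes/main/default/scripts/init.sh"}},
    ],
}

print(init_script_destinations(cluster_info))
```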
GetClusterClusterInfoSpecLibrary
- Cran GetClusterClusterInfoSpecLibraryCran
- Egg string
- Jar string
- Maven GetClusterClusterInfoSpecLibraryMaven
- ProviderConfig GetClusterClusterInfoSpecLibraryProviderConfig - Configure the provider for management through the account provider. This block consists of the following fields:
- Pypi GetClusterClusterInfoSpecLibraryPypi
- Requirements string
- Whl string
- Cran GetClusterClusterInfoSpecLibraryCran
- Egg string
- Jar string
- Maven GetClusterClusterInfoSpecLibraryMaven
- ProviderConfig GetClusterClusterInfoSpecLibraryProviderConfig - Configure the provider for management through the account provider. This block consists of the following fields:
- Pypi GetClusterClusterInfoSpecLibraryPypi
- Requirements string
- Whl string
- cran GetClusterClusterInfoSpecLibraryCran
- egg String
- jar String
- maven GetClusterClusterInfoSpecLibraryMaven
- providerConfig GetClusterClusterInfoSpecLibraryProviderConfig - Configure the provider for management through the account provider. This block consists of the following fields:
- pypi GetClusterClusterInfoSpecLibraryPypi
- requirements String
- whl String
- cran GetClusterClusterInfoSpecLibraryCran
- egg string
- jar string
- maven GetClusterClusterInfoSpecLibraryMaven
- providerConfig GetClusterClusterInfoSpecLibraryProviderConfig - Configure the provider for management through the account provider. This block consists of the following fields:
- pypi GetClusterClusterInfoSpecLibraryPypi
- requirements string
- whl string
- cran GetClusterClusterInfoSpecLibraryCran
- egg str
- jar str
- maven GetClusterClusterInfoSpecLibraryMaven
- provider_config GetClusterClusterInfoSpecLibraryProviderConfig - Configure the provider for management through the account provider. This block consists of the following fields:
- pypi GetClusterClusterInfoSpecLibraryPypi
- requirements str
- whl str
- cran Property Map
- egg String
- jar String
- maven Property Map
- providerConfig Property Map - Configure the provider for management through the account provider. This block consists of the following fields:
- pypi Property Map
- requirements String
- whl String
GetClusterClusterInfoSpecLibraryCran
GetClusterClusterInfoSpecLibraryMaven
- Coordinates string
- Exclusions List<string>
- Repo string
- Coordinates string
- Exclusions []string
- Repo string
- coordinates String
- exclusions List<String>
- repo String
- coordinates string
- exclusions string[]
- repo string
- coordinates str
- exclusions Sequence[str]
- repo str
- coordinates String
- exclusions List<String>
- repo String
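A Maven library block pairs Gradle-style coordinates with optional exclusions and a repository URL. A minimal sketch, using a plain dict shaped like the table above; the `parse_coordinates` helper and the sample values are illustrative, not part of the SDK:

```python
# Sketch: a Maven library block as documented above, with a hypothetical
# helper that splits Gradle-style "group:artifact:version" coordinates.

def parse_coordinates(coordinates: str) -> tuple[str, str, str]:
    """Split 'group:artifact:version' coordinates into their parts."""
    group, artifact, version = coordinates.split(":")
    return group, artifact, version

# Hypothetical sample data shaped like the documented type.
maven_library = {
    "coordinates": "com.databricks:spark-xml_2.12:0.18.0",
    "exclusions": ["org.scala-lang:scala-library"],
    "repo": "https://repo1.maven.org/maven2/",
}

print(parse_coordinates(maven_library["coordinates"]))
```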
GetClusterClusterInfoSpecLibraryProviderConfig
- WorkspaceId string - Workspace ID to which the resource belongs. This workspace must be part of the account with which the provider is configured.
- WorkspaceId string - Workspace ID to which the resource belongs. This workspace must be part of the account with which the provider is configured.
- workspaceId String - Workspace ID to which the resource belongs. This workspace must be part of the account with which the provider is configured.
- workspaceId string - Workspace ID to which the resource belongs. This workspace must be part of the account with which the provider is configured.
- workspace_id str - Workspace ID to which the resource belongs. This workspace must be part of the account with which the provider is configured.
- workspaceId String - Workspace ID to which the resource belongs. This workspace must be part of the account with which the provider is configured.
GetClusterClusterInfoSpecLibraryPypi
GetClusterClusterInfoSpecProviderConfig
- WorkspaceId string - Workspace ID to which the resource belongs. This workspace must be part of the account with which the provider is configured.
- WorkspaceId string - Workspace ID to which the resource belongs. This workspace must be part of the account with which the provider is configured.
- workspaceId String - Workspace ID to which the resource belongs. This workspace must be part of the account with which the provider is configured.
- workspaceId string - Workspace ID to which the resource belongs. This workspace must be part of the account with which the provider is configured.
- workspace_id str - Workspace ID to which the resource belongs. This workspace must be part of the account with which the provider is configured.
- workspaceId String - Workspace ID to which the resource belongs. This workspace must be part of the account with which the provider is configured.
GetClusterClusterInfoSpecWorkloadType
GetClusterClusterInfoSpecWorkloadTypeClients
GetClusterClusterInfoTerminationReason
- Code string
- Parameters Dictionary<string, string>
- Type string
- Code string
- Parameters map[string]string
- Type string
- code String
- parameters Map<String,String>
- type String
- code string
- parameters {[key: string]: string}
- type string
- code str
- parameters Mapping[str, str]
- type str
- code String
- parameters Map<String>
- type String
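The termination_reason block tells you why a fetched cluster stopped: a machine-readable code, supporting parameters, and a type. A minimal sketch, using a plain dict shaped like the table above; the helper and the sample values are illustrative, not part of the SDK:

```python
# Sketch: inspecting a termination_reason block shaped like the type
# documented above. The field names (code, parameters, type) come from
# the table; the sample values and helper are hypothetical.

termination_reason = {
    "code": "INACTIVITY",
    "parameters": {"inactivity_duration_min": "120"},
    "type": "SUCCESS",
}

def was_auto_terminated(reason: dict) -> bool:
    """True when the cluster was stopped by the auto-termination timer."""
    return reason.get("code") == "INACTIVITY"

print(was_auto_terminated(termination_reason))
```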
GetClusterClusterInfoWorkloadType
GetClusterClusterInfoWorkloadTypeClients
GetClusterProviderConfig
- WorkspaceId string - Workspace ID to which the resource belongs. This workspace must be part of the account with which the provider is configured.
- WorkspaceId string - Workspace ID to which the resource belongs. This workspace must be part of the account with which the provider is configured.
- workspaceId String - Workspace ID to which the resource belongs. This workspace must be part of the account with which the provider is configured.
- workspaceId string - Workspace ID to which the resource belongs. This workspace must be part of the account with which the provider is configured.
- workspace_id str - Workspace ID to which the resource belongs. This workspace must be part of the account with which the provider is configured.
- workspaceId String - Workspace ID to which the resource belongs. This workspace must be part of the account with which the provider is configured.
Package Details
- Repository
- databricks pulumi/pulumi-databricks
- License
- Apache-2.0
- Notes
- This Pulumi package is based on the databricks Terraform Provider.
