# Databricks

Databricks is the lakehouse company, helping data teams solve the world's toughest problems with a unified analytics platform for big data and AI.

- **Category:** developer tools
- **Auth:** OAUTH2, API_KEY
- **Composio Managed App Available?** No
- **Tools:** 423
- **Triggers:** 0
- **Slug:** `DATABRICKS`
- **Version:** 20260312_00

## Tools

### Add Member to Security Group

**Slug:** `DATABRICKS_ADD_MEMBER_TO_SECURITY_GROUP`

Tool to add a user or group as a member to a Databricks security group. Use when you need to grant group membership for access control.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `user_name` | string | No | The email address of a user to add to the parent group. Either user_name or group_name must be provided (not both). |
| `group_name` | string | No | The name of a group to add to the parent group as a nested member. Either user_name or group_name must be provided (not both). |
| `parent_name` | string | Yes | Name of the parent group to which the new member will be added. |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error, if any, that occurred during execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
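
As the parameter table notes, exactly one of `user_name` or `group_name` may be supplied. A minimal sketch of assembling this tool's input; the helper name is illustrative, not part of any SDK:

```python
def build_add_member_input(parent_name, user_name=None, group_name=None):
    """Build the input payload for DATABRICKS_ADD_MEMBER_TO_SECURITY_GROUP.

    Exactly one of user_name or group_name must be provided, per the
    parameter table above.
    """
    if (user_name is None) == (group_name is None):
        raise ValueError("Provide exactly one of user_name or group_name")
    payload = {"parent_name": parent_name}
    if user_name is not None:
        payload["user_name"] = user_name
    else:
        payload["group_name"] = group_name
    return payload
```

Validating this client-side gives a clearer error than a round trip to the API.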

### Delete Custom LLM Agent

**Slug:** `DATABRICKS_AGENTBRICKS_AGENT_BRICKS_DELETE_CUSTOM_LLM`

Tool to delete a Custom LLM agent created through Agent Bricks. Use when you need to remove a custom LLM and all associated data. This operation is irreversible and deletes all data including temporary transformations, model checkpoints, and internal metadata.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `id` | string | Yes | The ID of the custom LLM agent to delete |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error, if any, that occurred during execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Create Databricks App

**Slug:** `DATABRICKS_APPS_APPS_CREATE`

Tool to create a new Databricks app with specified configuration. Use when you need to create apps hosted on Databricks serverless platform to deploy secure data and AI applications. The app name must be unique within the workspace, contain only lowercase alphanumeric characters and hyphens, and cannot be changed after creation.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `name` | string | Yes | The name of the app. Must contain only lowercase alphanumeric characters and hyphens. Must be unique within the workspace. Cannot be changed after creation. Do not include sensitive information in the app name |
| `resources` | array | No | App resources configuration including database connections, Genie Spaces, jobs, secrets, serving endpoints, SQL warehouses, and Unity Catalog securables |
| `description` | string | No | The description of the app providing additional context about its purpose |
| `permissions` | array | No | Access control list for app users specifying CAN_MANAGE or CAN_USE access levels applicable to users, groups, or service principals |
| `compute_size` | string ("MEDIUM" \| "LARGE") | No | Compute resource allocation for the app. MEDIUM for moderate workloads, LARGE for intensive workloads |
| `user_api_scopes` | array | No | API access permissions for the app |
| `budget_policy_id` | string | No | Associated budget policy identifier for cost management |
| `source_code_path` | string | No | Workspace file system path of the source code used to create the app deployment. Must be a valid workspace path in the format /Workspace/Users/{user-path}/{app-directory} |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error, if any, that occurred during execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
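
The naming rule (lowercase alphanumerics and hyphens, immutable after creation) can be checked before the call. A sketch of building the create-app input; the helper is illustrative and uniqueness can only be verified server-side:

```python
import re

# Lowercase alphanumerics and hyphens only, per the table above
APP_NAME_RE = re.compile(r"^[a-z0-9-]+$")

def build_create_app_input(name, description=None, compute_size=None):
    """Build the input for DATABRICKS_APPS_APPS_CREATE (illustrative helper)."""
    if not APP_NAME_RE.match(name):
        raise ValueError("App name must contain only lowercase alphanumerics and hyphens")
    if compute_size is not None and compute_size not in ("MEDIUM", "LARGE"):
        raise ValueError("compute_size must be MEDIUM or LARGE")
    payload = {"name": name}
    if description is not None:
        payload["description"] = description
    if compute_size is not None:
        payload["compute_size"] = compute_size
    return payload
```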

### Delete Databricks App

**Slug:** `DATABRICKS_APPS_APPS_DELETE`

Tool to delete a Databricks app from the workspace. Use when you need to remove an app and its associated service principal. When an app is deleted, Databricks automatically deletes the provisioned service principal.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `name` | string | Yes | The name of the app to delete. Must contain only lowercase alphanumeric characters and hyphens, and must be unique within the workspace |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error, if any, that occurred during execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Deploy Databricks App

**Slug:** `DATABRICKS_APPS_APPS_DEPLOY`

Tool to create a deployment for a Databricks app. Use when you need to deploy an app with source code from a workspace path. The deployment process provisions compute resources and uploads the source code. Deployments can be in states: IN_PROGRESS, SUCCEEDED, FAILED, or CANCELLED.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `mode` | string ("AUTO_SYNC" \| "SNAPSHOT") | No | The mode which determines how the deployment will manage the source code. AUTO_SYNC automatically synchronizes source code changes. SNAPSHOT creates a point-in-time snapshot of source code. If not specified, the API will use its default mode |
| `app_name` | string | Yes | The name of the app for which the deployment is created |
| `source_code_path` | string | Yes | Workspace file system path of the source code used to create the app deployment. Must be a valid workspace path in the format /Workspace/Users/{user-path}/{app-directory} |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error, if any, that occurred during execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
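
The `source_code_path` format and the `mode` enum can both be checked locally. A sketch of building the deploy input under those constraints (helper name is illustrative):

```python
def build_deploy_input(app_name, source_code_path, mode=None):
    """Build the input for DATABRICKS_APPS_APPS_DEPLOY (illustrative helper).

    source_code_path must be a workspace path of the form
    /Workspace/Users/{user-path}/{app-directory}, per the table above.
    """
    if not source_code_path.startswith("/Workspace/Users/"):
        raise ValueError("source_code_path must start with /Workspace/Users/")
    if mode is not None and mode not in ("AUTO_SYNC", "SNAPSHOT"):
        raise ValueError("mode must be AUTO_SYNC or SNAPSHOT")
    payload = {"app_name": app_name, "source_code_path": source_code_path}
    if mode is not None:
        payload["mode"] = mode
    return payload
```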

### Get Databricks App Details

**Slug:** `DATABRICKS_APPS_APPS_GET`

Tool to retrieve details about a specific Databricks app by name. Use when you need to get comprehensive information about an app including configuration, deployment status, compute resources, and metadata.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `name` | string | Yes | The name of the app. Must contain only lowercase alphanumeric characters and hyphens, and must be unique within the workspace |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error, if any, that occurred during execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get Databricks App Permission Levels

**Slug:** `DATABRICKS_APPS_APPS_GET_PERMISSION_LEVELS`

Tool to retrieve available permission levels for a Databricks app. Use when you need to understand what permission levels can be assigned to users or groups for a specific app. Returns permission levels like CAN_USE and CAN_MANAGE with their descriptions.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `id` | string | Yes | The unique identifier (UUID) of the Databricks app |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error, if any, that occurred during execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get Databricks App Permissions

**Slug:** `DATABRICKS_APPS_APPS_GET_PERMISSIONS`

Tool to retrieve permissions for a Databricks app. Use when you need to check who has access to an app and their permission levels. Returns the access control list including inherited permissions from parent or root objects.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `name` | string | Yes | The name of the app. Must contain only lowercase alphanumeric characters and hyphens, and must be unique within the workspace |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error, if any, that occurred during execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get App Update Status

**Slug:** `DATABRICKS_APPS_APPS_GET_UPDATE`

Retrieves the current update status of a Databricks app. This endpoint returns whether the app's most recent configuration update succeeded, failed, is in progress, or has never been updated. Use this to monitor the status of app configuration changes (such as description, compute size, or resource modifications) rather than deployment status.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `name` | string | Yes | The name of the app to retrieve update status for. Must contain only lowercase alphanumeric characters and hyphens |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error, if any, that occurred during execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Set Databricks App Permissions

**Slug:** `DATABRICKS_APPS_APPS_SET_PERMISSIONS`

Tool to set permissions for a Databricks app, replacing all existing permissions. Use when you need to configure access control for an app. This operation replaces ALL existing permissions; for incremental updates, use the update permissions endpoint instead. Admin permissions cannot be removed.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `name` | string | Yes | The name of the app. Must contain only lowercase alphanumeric characters and hyphens, and must be unique within the workspace |
| `access_control_list` | array | Yes | Array of access control entries. This replaces ALL existing permissions. For each entry, only one of user_name, group_name, or service_principal_name should be specified |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error, if any, that occurred during execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Start Databricks App

**Slug:** `DATABRICKS_APPS_APPS_START`

Tool to start the last active deployment of a Databricks app. Use when you need to start a stopped app. The start operation is asynchronous; the app transitions to the ACTIVE state once the operation completes.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `name` | string | Yes | The name of the app to start. Must contain only lowercase alphanumeric characters and hyphens |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error, if any, that occurred during execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Stop Databricks App

**Slug:** `DATABRICKS_APPS_APPS_STOP`

Tool to stop the active deployment of a Databricks app. Use when you need to stop a running app. The stop operation is asynchronous; the app transitions to the STOPPED state once the operation completes.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `name` | string | Yes | The name of the app to stop. Must contain only lowercase alphanumeric characters and hyphens |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error, if any, that occurred during execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
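
Because both start and stop return before the app reaches its target state, callers typically poll for the state change. A generic polling sketch; the poll interval, the state accessor, and the injectable `sleep` are assumptions for illustration, not part of the tool contract:

```python
import time

def wait_for_state(get_state, target, attempts=30, interval=2.0, sleep=time.sleep):
    """Poll get_state() until it returns `target` or attempts run out.

    get_state is any callable returning the app's current compute state,
    e.g. derived from DATABRICKS_APPS_APPS_GET output. Injecting `sleep`
    keeps the helper testable.
    """
    for _ in range(attempts):
        state = get_state()
        if state == target:
            return state
        sleep(interval)
    raise TimeoutError(f"app did not reach {target} after {attempts} polls")
```

The same helper works for ACTIVE after a start and STOPPED after a stop.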

### Update Databricks App

**Slug:** `DATABRICKS_APPS_APPS_UPDATE`

Tool to update an existing Databricks app configuration. Use when you need to modify app settings such as description, resources, compute size, budget policy, or API scopes. This is a partial update operation - only fields provided in the request will be updated, other fields retain their current values.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `name` | string | Yes | The name of the app to update. Must contain only lowercase alphanumeric characters and hyphens |
| `resources` | array | No | List of accessible system components and resources for the app. When provided, replaces the existing resources configuration |
| `description` | string | No | Application summary or description. When provided, replaces the existing description |
| `usage_policy_id` | string | No | Usage policy identifier for the application. When provided, replaces the existing usage policy |
| `user_api_scopes` | array | No | API access grants for application users. List of permission scopes. When provided, replaces the existing user API scopes |
| `budget_policy_id` | string | No | Associated budget policy identifier for cost management. When provided, replaces the existing budget policy |
| `default_source_code_path` | string | No | Workspace path tracking the last active deployment source code. When provided, replaces the existing source code path |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error, if any, that occurred during execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Update Databricks App Permissions

**Slug:** `DATABRICKS_APPS_APPS_UPDATE_PERMISSIONS`

Tool to incrementally update permissions for a Databricks app. Use when you need to modify specific permissions without replacing the entire permission set. This PATCH operation updates only the specified permissions, preserving existing permissions not included in the request. For replacing all permissions, use SetPermissions instead.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `name` | string | Yes | The unique identifier or name of the app. Must contain only lowercase alphanumeric characters and hyphens |
| `access_control_list` | array | Yes | List of access control entries to update. This operation updates only the specified permissions incrementally, preserving existing permissions not included in the request. For each entry, exactly one of user_name, group_name, or service_principal_name must be specified |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error, if any, that occurred during execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
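
Both permission tools take `access_control_list` entries in which exactly one principal field is set. A sketch of validating a single entry before sending it (the helper name and `permission_level` key are illustrative):

```python
def build_acl_entry(permission_level, user_name=None, group_name=None,
                    service_principal_name=None):
    """Build one access_control_list entry for the app permission tools.

    Exactly one of user_name, group_name, or service_principal_name
    must be set, per the parameter tables above.
    """
    principals = {
        "user_name": user_name,
        "group_name": group_name,
        "service_principal_name": service_principal_name,
    }
    set_fields = {k: v for k, v in principals.items() if v is not None}
    if len(set_fields) != 1:
        raise ValueError(
            "Set exactly one of user_name, group_name, service_principal_name")
    entry = dict(set_fields)
    entry["permission_level"] = permission_level
    return entry
```

Remember the operational difference: SetPermissions replaces the whole list, while UpdatePermissions patches only the entries you send.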

### Get Catalog Artifact Allowlist

**Slug:** `DATABRICKS_CATALOG_ARTIFACT_ALLOWLISTS_GET`

Tool to retrieve artifact allowlist configuration for a specified artifact type in Unity Catalog. Use when you need to check which artifacts are permitted for use in your Databricks environment. Requires metastore admin privileges or MANAGE ALLOWLIST privilege on the metastore.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `artifact_type` | string ("INIT_SCRIPT" \| "LIBRARY_JAR" \| "LIBRARY_MAVEN") | Yes | The artifact type of the allowlist to retrieve. Supported values are INIT_SCRIPT, LIBRARY_JAR, or LIBRARY_MAVEN |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error, if any, that occurred during execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Delete Catalog

**Slug:** `DATABRICKS_CATALOG_CATALOGS_DELETE`

Tool to delete a catalog from Unity Catalog metastore. Use when you need to permanently remove a catalog and optionally its contents. By default, the catalog must be empty (except for information_schema). Use force=true to delete non-empty catalogs. Do not delete the main catalog as it can break existing data operations.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `name` | string | Yes | The name of the catalog to delete. The caller must be a metastore admin or the owner of the catalog |
| `force` | boolean | No | Force deletion even if the catalog is not empty. When set to true, allows deletion of catalogs containing schemas and tables. When set to false or omitted, the catalog can only be deleted if it contains no schemas (except information_schema). Requires metastore owner/admin privileges and workspace admin privileges |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error, if any, that occurred during execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
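
The `force` flag is the dangerous part of this tool. A small guard sketch that follows the warning above and refuses to delete the `main` catalog; the helper is illustrative, not part of the API:

```python
def build_delete_catalog_input(name, force=False):
    """Build the input for DATABRICKS_CATALOG_CATALOGS_DELETE with a safety check.

    The description above warns that deleting the main catalog can break
    existing data operations, so this illustrative helper refuses it.
    """
    if name == "main":
        raise ValueError("Refusing to delete the main catalog")
    return {"name": name, "force": force}
```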

### Get Catalog Details

**Slug:** `DATABRICKS_CATALOG_CATALOGS_GET`

Tool to retrieve details of a specific catalog in Unity Catalog. Use when you need to get information about a catalog including its metadata, owner, properties, and configuration. Requires metastore admin privileges, catalog ownership, or USE_CATALOG privilege.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `name` | string | Yes | The name of the catalog to retrieve |
| `include_browse` | boolean | No | Whether to include catalogs in the response for which the principal can only access selective metadata through the BROWSE privilege |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error, if any, that occurred during execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Create Catalog Connection

**Slug:** `DATABRICKS_CATALOG_CONNECTIONS_CREATE`

Tool to create a new Unity Catalog connection to external data sources. Use when you need to establish connections to databases and services such as MySQL, PostgreSQL, Snowflake, etc. Requires metastore admin privileges or CREATE CONNECTION privilege on the metastore.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `name` | string | Yes | Name of the connection. This is the unique identifier for the connection within the metastore |
| `comment` | string | No | User-provided free-form text description of the connection |
| `options` | object | Yes | A map of key-value properties containing connection-type specific parameters needed to establish the connection. Property keys must be unique and are case-sensitive. Required options vary by connection type: HTTP requires 'host' and 'bearer_token'; MYSQL requires 'host', 'port', 'user', and 'password'; POSTGRESQL requires 'host', 'port', 'database', 'user', and 'password'; SNOWFLAKE requires 'host', 'user', and 'password'. Refer to Databricks documentation for complete requirements per connection type |
| `read_only` | boolean | No | If true, the connection is read-only and cannot be used for write operations |
| `properties` | object | No | A map of key-value properties attached to the securable object for custom metadata |
| `connection_type` | string ("BIGQUERY" \| "DATABRICKS" \| "GA4_RAW_DATA" \| "GLUE" \| "HIVE_METASTORE" \| "HTTP" \| "MYSQL" \| "ORACLE" \| "POSTGRESQL" \| "POWER_BI" \| "REDSHIFT" \| "SALESFORCE" \| "SALESFORCE_DATA_CLOUD" \| "SERVICENOW" \| "SNOWFLAKE" \| "SQLDW" \| "SQLSERVER" \| "TERADATA" \| "WORKDAY_RAAS") | Yes | The type of connection to create. Must be one of the supported connection types |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error, if any, that occurred during execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
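
The required `options` keys vary by connection type, as the table lists for HTTP, MYSQL, POSTGRESQL, and SNOWFLAKE. A sketch of a local pre-flight check covering just those four; other types fall through to server-side validation:

```python
# Required option keys per connection type, taken from the table above.
# Not exhaustive -- see the Databricks docs for the remaining types.
REQUIRED_OPTIONS = {
    "HTTP": {"host", "bearer_token"},
    "MYSQL": {"host", "port", "user", "password"},
    "POSTGRESQL": {"host", "port", "database", "user", "password"},
    "SNOWFLAKE": {"host", "user", "password"},
}

def validate_connection_options(connection_type, options):
    """Raise if `options` is missing keys the table above requires."""
    required = REQUIRED_OPTIONS.get(connection_type)
    if required is None:
        return  # type not covered by this sketch; rely on server validation
    missing = required - set(options)
    if missing:
        raise ValueError(
            f"{connection_type} connection missing options: {sorted(missing)}")
```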

### Delete Catalog Connection

**Slug:** `DATABRICKS_CATALOG_CONNECTIONS_DELETE`

Tool to delete a Unity Catalog connection to external data sources. Use when you need to remove connections to databases and services. Deleting a connection removes the abstraction used to connect from Databricks Compute to external data sources.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `name` | string | Yes | The name of the connection to delete. This is the unique identifier for the connection resource |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error, if any, that occurred during execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get Catalog Connection

**Slug:** `DATABRICKS_CATALOG_CONNECTIONS_GET`

Tool to retrieve detailed information about a specific Unity Catalog connection. Use when you need to get connection metadata, configuration, and properties for external data source connections.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `name` | string | Yes | Name of the connection to retrieve. This is the unique identifier for the connection within the metastore |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error, if any, that occurred during execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Update Catalog Connection

**Slug:** `DATABRICKS_CATALOG_CONNECTIONS_UPDATE`

Tool to update an existing Unity Catalog connection configuration. Use when you need to modify connection properties, credentials, ownership, or metadata for external data sources.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `owner` | string | No | Username of the new owner of the connection |
| `comment` | string | No | User-provided free-form text description of the connection |
| `options` | object | Yes | A map of key-value properties containing connection-specific configuration. The exact keys depend on the connection type (e.g., for MySQL connections: host, port, user, password) |
| `new_name` | string | No | New name for the connection if renaming is desired |
| `name_or_id` | string | Yes | Name or unique identifier of the connection to update |
| `properties` | object | No | Additional key-value properties attached to the connection |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error, if any, that occurred during execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Create Catalog Credential

**Slug:** `DATABRICKS_CATALOG_CREDENTIALS_CREATE_CREDENTIAL`

Tool to create a new credential for Unity Catalog access to cloud services. Use when you need to establish authentication for STORAGE (cloud storage) or SERVICE (external services like AWS Glue) purposes. Requires metastore admin or CREATE_STORAGE_CREDENTIAL/CREATE_SERVICE_CREDENTIAL privileges. Exactly one cloud credential type must be provided.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `name` | string | Yes | The credential name. Must be unique among storage and service credentials within the metastore |
| `comment` | string | No | Comment associated with the credential |
| `purpose` | string ("SERVICE" \| "STORAGE") | Yes | Indicates the purpose of the credential. Valid values are SERVICE (for accessing external cloud services) or STORAGE (for accessing cloud storage locations) |
| `read_only` | boolean | No | Whether the credential is usable only for read operations. Only applicable when purpose is STORAGE |
| `aws_iam_role` | object | No | AWS IAM role configuration for credential. |
| `skip_validation` | boolean | No | Supplying true to this argument skips validation of the created credential |
| `azure_managed_identity` | object | No | Azure managed identity configuration for credential. |
| `azure_service_principal` | object | No | Azure service principal configuration for credential. |
| `databricks_gcp_service_account` | object | No | GCP service account configuration for credential. |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error, if any, that occurred during execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
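
The description above requires exactly one cloud credential type per request. A sketch of enforcing that constraint locally (helper name is illustrative):

```python
# The four mutually exclusive cloud credential fields from the table above
CLOUD_FIELDS = ("aws_iam_role", "azure_managed_identity",
                "azure_service_principal", "databricks_gcp_service_account")

def check_one_cloud_credential(payload):
    """Enforce the rule above: exactly one cloud credential type per request.

    Returns the name of the field that is set, or raises ValueError.
    """
    provided = [f for f in CLOUD_FIELDS if payload.get(f) is not None]
    if len(provided) != 1:
        raise ValueError(
            f"Exactly one cloud credential type required, got {provided}")
    return provided[0]
```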

### Delete Catalog Credential

**Slug:** `DATABRICKS_CATALOG_CREDENTIALS_DELETE_CREDENTIAL`

Tool to delete a Unity Catalog credential for cloud storage or service access. Use when you need to remove credentials that authenticate access to cloud resources. By default, deletion will fail if the credential has dependent resources. Use force=true to delete credentials with dependencies.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `force` | boolean | No | Force deletion even if there are dependent external locations or external tables (when purpose is STORAGE) or dependent services (when purpose is SERVICE). Default is false |
| `credential_name_or_id` | string | Yes | The name or ID of the credential to delete. This is the unique identifier for the credential resource |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error, if any, that occurred during execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get Catalog Credential

**Slug:** `DATABRICKS_CATALOG_CREDENTIALS_GET_CREDENTIAL`

Tool to retrieve detailed information about a specific Unity Catalog credential. Use when you need to get credential metadata, configuration, and cloud provider details for storage or service credentials.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `name` | string | Yes | Name of the credential to retrieve. This is the unique identifier for the credential within the metastore |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error, if any, that occurred during execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Update Catalog Credential

**Slug:** `DATABRICKS_CATALOG_CREDENTIALS_UPDATE_CREDENTIAL`

Tool to update an existing Unity Catalog credential with new properties. Use when you need to modify credential configuration, ownership, or cloud provider settings. The caller must be the owner of the credential, a metastore admin, or have MANAGE permission on the credential. If the caller is a metastore admin, only the owner field can be changed.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `force` | boolean | No | Force update even if there are dependent external locations or external tables |
| `owner` | string | No | Username of the new owner of the credential. If the caller is a metastore admin, only this field can be changed |
| `comment` | string | No | Comment associated with the credential |
| `new_name` | string | No | New name for the credential |
| `read_only` | boolean | No | Whether the storage credential is only usable for read operations. Only applicable when purpose is STORAGE |
| `name_or_id` | string | Yes | The name or ID of the credential to update. This is the unique identifier for the credential resource |
| `aws_iam_role` | object | No | AWS IAM role configuration for credential. |
| `isolation_mode` | string ("ISOLATION_MODE_ISOLATED" \| "ISOLATION_MODE_OPEN") | No | Whether the credential is accessible from all workspaces or a specific set of workspaces |
| `skip_validation` | boolean | No | Supplying true skips validation of the updated credential |
| `cloudflare_api_token` | object | No | Cloudflare API token configuration for credential. |
| `azure_managed_identity` | object | No | Azure managed identity configuration for credential. |
| `azure_service_principal` | object | No | Azure service principal configuration for credential. |
| `databricks_gcp_service_account` | object | No | GCP service account configuration for credential. |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error, if any, that occurred during execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Generate Temporary Service Credential

**Slug:** `DATABRICKS_CATALOG_CREDS_GENERATE_TEMP_SERVICE_CRED`

Tool to generate temporary credentials from a service credential with admin access. Use when you need short-lived, scoped credentials for accessing cloud resources. The caller must be a metastore admin or have the ACCESS privilege on the service credential.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `gcp_options` | object | No | GCP-specific configuration options for temporary credential generation. |
| `azure_options` | object | No | Azure-specific configuration options for temporary credential generation. |
| `credential_name` | string | Yes | The name of the service credential used to generate a temporary credential. The caller must be a metastore admin or have the metastore privilege ACCESS on the service credential |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
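
A sketch of the request this tool sends, assuming the Unity Catalog temporary-credentials endpoint; the cloud-option payloads shown are illustrative, not part of this tool's contract:

```python
# Minimal sketch: body for minting a short-lived credential from a service
# credential. Path and body shape are assumptions based on the Unity
# Catalog REST API.
def build_temp_service_cred_request(credential_name, azure_options=None,
                                    gcp_options=None):
    body = {"credential_name": credential_name}
    # Cloud-specific options (e.g. Azure resource scopes) are optional.
    if azure_options is not None:
        body["azure_options"] = azure_options
    if gcp_options is not None:
        body["gcp_options"] = gcp_options
    return "POST", "/api/2.1/unity-catalog/temporary-service-credentials", body

method, path, body = build_temp_service_cred_request(
    "prod_service_cred",
    azure_options={"resources": ["https://vault.azure.net"]})
```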

### Validate Catalog Credential

**Slug:** `DATABRICKS_CATALOG_CREDS_VALIDATE_CRED`

Tool to validate a Unity Catalog credential for external access. Use when you need to verify that a credential can successfully perform its intended operations. For SERVICE credentials, validates cloud service access. For STORAGE credentials, tests READ, WRITE, DELETE, LIST operations on the specified location.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `url` | string | No | The external location URL to validate. Only applicable when purpose is STORAGE. If both url and external_location_name are provided, url takes precedence |
| `purpose` | string ("SERVICE" \| "STORAGE") | No | The purpose of the credential. Use SERVICE for accessing external cloud services or STORAGE for cloud storage locations. Determines which validation logic is applied |
| `read_only` | boolean | No | Whether the credential should be validated for read-only access. Only applicable when purpose is STORAGE |
| `aws_iam_role` | object | No | AWS IAM role configuration for credential validation. |
| `credential_name` | string | No | The name of an existing credential to validate. Either this or a cloud-specific credential must be provided |
| `cloudflare_api_token` | object | No | Cloudflare API token configuration for credential validation. |
| `azure_managed_identity` | object | No | Azure managed identity configuration for credential validation. |
| `external_location_name` | string | No | The name of an existing external location to validate. Only applicable when purpose is STORAGE. Either external_location_name or url must be provided for storage credentials |
| `azure_service_principal` | object | No | Azure service principal configuration for credential validation. |
| `databricks_gcp_service_account` | object | No | GCP service account configuration for credential validation. |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
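
Two rules from the parameter table are easy to get wrong: STORAGE validation needs a location, and `url` takes precedence over `external_location_name` when both are given. A minimal sketch encoding both (the endpoint path is an assumption):

```python
# Minimal sketch of the validation request body.
def build_validate_request(credential_name=None, purpose=None, url=None,
                           external_location_name=None, read_only=None):
    if purpose == "STORAGE" and not (url or external_location_name):
        raise ValueError("STORAGE validation needs url or external_location_name")
    body = {}
    if credential_name:
        body["credential_name"] = credential_name
    if purpose:
        body["purpose"] = purpose
    if url:                              # url wins when both are provided
        body["url"] = url
    elif external_location_name:
        body["external_location_name"] = external_location_name
    if read_only is not None:
        body["read_only"] = read_only
    return "POST", "/api/2.1/unity-catalog/validate-credentials", body

_, _, body = build_validate_request(
    credential_name="my_storage_cred", purpose="STORAGE",
    url="s3://bucket/path", external_location_name="ignored_loc")
```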

### Get Entity Tag Assignment

**Slug:** `DATABRICKS_CATALOG_ENTITY_TAG_ASSIGNMENTS_GET`

Tool to retrieve a specific tag assignment for a Unity Catalog entity by tag key. Use when you need to get details about a tag assigned to catalogs, schemas, tables, columns, or volumes. Requires USE CATALOG and USE SCHEMA permissions on parent resources, and ASSIGN or MANAGE permissions on the tag policy for governed tags.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `tag_key` | string | Yes | The key of the tag to retrieve |
| `entity_name` | string | Yes | The fully qualified name of the entity to which the tag is assigned (e.g., 'samples.nyctaxi.trips' for a table) |
| `entity_type` | string ("catalogs" \| "schemas" \| "tables" \| "columns" \| "volumes") | Yes | The type of the entity to which the tag is assigned. Must be one of: catalogs, schemas, tables, columns, volumes |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
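
How the three inputs combine into a lookup can be sketched as a path builder; the exact REST path is an assumption, and the point is only the `entity_type`/`entity_name`/`tag_key` composition plus the enum check from the table above:

```python
# Minimal sketch: compose the tag-assignment lookup from the three inputs.
from urllib.parse import quote

VALID_ENTITY_TYPES = {"catalogs", "schemas", "tables", "columns", "volumes"}

def tag_assignment_path(entity_type, entity_name, tag_key):
    if entity_type not in VALID_ENTITY_TYPES:
        raise ValueError(f"entity_type must be one of {sorted(VALID_ENTITY_TYPES)}")
    # entity_name is fully qualified, e.g. catalog.schema.table
    return ("/api/2.1/unity-catalog/entity-tag-assignments/"
            f"{entity_type}/{quote(entity_name)}/tags/{quote(tag_key)}")

path = tag_assignment_path("tables", "samples.nyctaxi.trips", "pii")
```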

### Create External Location

**Slug:** `DATABRICKS_CATALOG_EXTERNAL_LOCATIONS_CREATE`

Tool to create a new Unity Catalog external location combining a cloud storage path with a storage credential. Use when you need to establish access to cloud storage in Azure Data Lake Storage, AWS S3, or Cloudflare R2. Requires metastore admin or CREATE_EXTERNAL_LOCATION privilege on both the metastore and the associated storage credential.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `url` | string | Yes | Path URL of the external location (e.g., s3://bucket-name/path, abfss://container@storage.dfs.core.windows.net/path, gs://bucket-name/path) |
| `name` | string | Yes | Name of the external location. This is the identifier used to reference the location |
| `comment` | string | No | User-provided free-form text description of the external location |
| `fallback` | boolean | No | Allows fallback to cluster credentials when Unity Catalog credentials are insufficient. Enables support for legacy workloads |
| `read_only` | boolean | No | Indicates whether the external location is read-only. When true, restricts the location to read-only access. Default is false |
| `credential_name` | string | Yes | Name of the storage credential to use with this location. The credential authorizes access to the cloud storage path |
| `skip_validation` | boolean | No | Skips validation of the storage credential associated with the external location. Default is false |
| `file_event_queue` | object | No | File event queue configuration. |
| `enable_file_events` | boolean | No | Whether to enable file event tracking at this location. If enabled, file_event_queue must be provided |
| `encryption_details` | object | No | Server-side encryption configuration for AWS S3 communication. |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
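
A sketch of the create payload, including the dependency noted in the table (`enable_file_events` requires `file_event_queue`). The endpoint path is an assumption based on the Unity Catalog REST API; bucket and credential names are illustrative:

```python
# Minimal sketch: body for creating an external location.
def build_create_external_location(name, url, credential_name,
                                   enable_file_events=False,
                                   file_event_queue=None, **optional):
    if enable_file_events and file_event_queue is None:
        raise ValueError("enable_file_events requires file_event_queue")
    body = {"name": name, "url": url, "credential_name": credential_name}
    if enable_file_events:
        body["enable_file_events"] = True
        body["file_event_queue"] = file_event_queue
    # Optional extras: comment, read_only, fallback, skip_validation, ...
    body.update({k: v for k, v in optional.items() if v is not None})
    return "POST", "/api/2.1/unity-catalog/external-locations", body

_, _, body = build_create_external_location(
    "landing_zone", "s3://my-bucket/landing", "my_storage_cred",
    read_only=True, comment="raw ingest area")
```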

### Delete External Location

**Slug:** `DATABRICKS_CATALOG_EXTERNAL_LOCATIONS_DELETE`

Tool to delete an external location from Unity Catalog metastore. Use when you need to remove an external location that combines a cloud storage path with a storage credential. The caller must be the owner of the external location. Use force=true to delete even if there are dependent external tables or mounts.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `name` | string | Yes | The name of the external location to delete. External location names are unqualified and must be unique within the metastore. The caller must be the owner of the external location to perform this operation |
| `force` | boolean | No | Force deletion even if there are dependent external tables or mounts. When set to true, the operation will proceed despite any dependencies. When false or omitted, the deletion will fail if dependencies exist. Default: false |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
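
Deletion is a DELETE on the location name, with `force` as an optional flag; a sketch, again assuming the Unity Catalog REST path:

```python
# Minimal sketch: DELETE request for an external location, with the
# optional force flag from the table above.
from urllib.parse import quote

def delete_external_location_request(name, force=False):
    path = f"/api/2.1/unity-catalog/external-locations/{quote(name)}"
    if force:
        # Proceed even if dependent external tables or mounts exist.
        path += "?force=true"
    return "DELETE", path

method, path = delete_external_location_request("landing_zone", force=True)
```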

### Get External Location Details

**Slug:** `DATABRICKS_CATALOG_EXTERNAL_LOCATIONS_GET`

Tool to retrieve details of a specific Unity Catalog external location. Use when you need to get information about an external location including its URL, storage credential, and configuration. Requires metastore admin privileges, external location ownership, or appropriate privileges on the external location.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `name` | string | Yes | Name or identifier of the external location to retrieve |
| `include_browse` | boolean | No | Whether to include external locations in the response for which the principal can only access selective metadata through the BROWSE privilege |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Update External Location

**Slug:** `DATABRICKS_CATALOG_EXTERNAL_LOCATIONS_UPDATE`

Tool to update properties of an existing Unity Catalog external location. Use when you need to modify the cloud storage path, credentials, ownership, or configuration of an external location. The caller must be the owner of the external location or a metastore admin. Use the force parameter to update even when a URL change invalidates dependencies.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `url` | string | No | Cloud storage path URL (e.g., s3://bucket-name/path, abfss://container@storage.dfs.core.windows.net/path, gs://bucket-name/path) |
| `name` | string | Yes | Name or identifier of the external location to update |
| `force` | boolean | No | Force update even if changing URL invalidates dependent external tables or mounts. Use with caution as this may break existing dependencies |
| `owner` | string | No | Username, group name, or service principal name/ID for ownership assignment |
| `comment` | string | No | User-provided free-form text description |
| `fallback` | boolean | No | Enable fallback to cluster credentials for access when Unity Catalog credentials are insufficient |
| `new_name` | string | No | New name for the external location if renaming is desired |
| `read_only` | boolean | No | Whether the external location is read-only. When true, restricts write access to the location |
| `isolation_mode` | string | No | Isolation configuration. Use 'ISOLATION_MODE_ISOLATED' for isolated mode or 'ISOLATION_MODE_OPEN' for open mode |
| `credential_name` | string | No | Name of the storage credential to use with this location |
| `skip_validation` | boolean | No | Suppress validation errors and force save the external location. Bypasses storage credential validation during update |
| `file_event_queue` | object | No | File event queue configuration. |
| `enable_file_events` | boolean | No | Activate managed file events for this location. If enabled, file_event_queue must be provided |
| `encryption_details` | object | No | Server-side encryption configuration for AWS S3 communication. |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Update External Metadata

**Slug:** `DATABRICKS_CATALOG_EXTERNAL_METADATA_UPDATE_EXTERNAL`

Tool to update an external metadata object in Unity Catalog. Use when you need to modify metadata about external systems registered within Unity Catalog. The user must have metastore admin status, own the object, or possess the MODIFY privilege. Note that changing ownership requires the MANAGE privilege, and callers cannot update both the owner and other metadata in a single request.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `id` | string | Yes | Unique identifier of the external metadata object to be modified |
| `url` | string | No | URL associated with the external metadata object |
| `name` | string | No | Name of the external metadata object |
| `owner` | string | No | Owner of the external metadata object. Cannot be updated with other fields in the same request |
| `columns` | array | No | List of columns associated with the external metadata object |
| `properties` | object | No | Key-value properties attached to the external metadata object |
| `description` | string | No | User-provided free-form text description |
| `entity_type` | string | No | Type of entity within the external system |
| `system_type` | string | No | Type of external system (e.g., DATABRICKS, SNOWFLAKE, BIGQUERY) |
| `update_mask` | string | Yes | A comma-separated list of field names to update (e.g., 'name', 'description', 'properties'). The field path is relative to the resource object, using a dot (.) to navigate sub-fields (e.g., author.given_name). A wildcard (*) signals complete replacement, though explicit field listing is recommended to prevent unintended changes if the API evolves |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
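
The `update_mask` pattern and the owner restriction both deserve a concrete shape. This sketch encodes them (only fields named in the mask change; `owner` cannot be combined with other fields); the REST path and query-parameter placement of the mask are assumptions:

```python
# Minimal sketch of an update_mask-style PATCH request.
def build_external_metadata_update(object_id, changes):
    if "owner" in changes and len(changes) > 1:
        raise ValueError("owner must be updated in its own request")
    # Name exactly the fields being changed rather than using a wildcard.
    update_mask = ",".join(sorted(changes))
    path = (f"/api/2.1/unity-catalog/external-metadata/{object_id}"
            f"?update_mask={update_mask}")
    return "PATCH", path, dict(changes)

method, path, body = build_external_metadata_update(
    "emd-123", {"name": "warehouse_orders", "description": "nightly sync"})
```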

### Update Catalog Function

**Slug:** `DATABRICKS_CATALOG_FUNCTIONS_UPDATE`

Tool to update function owner in Unity Catalog. Use when you need to change the ownership of a catalog function. Only the owner of the function can be updated via this endpoint. The caller must be a metastore admin, the owner of the function's parent catalog, the owner of the parent schema with USE_CATALOG privilege, or the owner of the function with both USE_CATALOG and USE_SCHEMA privileges.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `name` | string | Yes | The fully-qualified name of the function in the form catalog_name.schema_name.function_name |
| `owner` | string | No | Username of the new owner of the function. This is the only field that can be updated via this endpoint |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get Catalog Grants

**Slug:** `DATABRICKS_CATALOG_GRANTS_GET`

Tool to get permissions (grants) for a securable in Unity Catalog without inherited permissions. Use when you need to see direct privilege assignments on a catalog or other securable object. Returns only privileges directly assigned to principals, excluding inherited permissions from parent securables. For inherited permissions, use the get-effective endpoint instead.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `full_name` | string | Yes | Full name of the securable. For catalog: 'my_catalog', for table: 'catalog.schema.table' |
| `principal` | string | No | If provided, only the permissions for the specified principal (user email, group name, or service principal) are returned |
| `page_token` | string | No | Opaque token for retrieving the next page of results |
| `max_results` | integer | No | Maximum number of privileges to return per page. Must be 0 or >= 150. Values between 1-149 are invalid. Default: 0 (server-determined page size). Using pagination is recommended as unpaginated calls will be deprecated |
| `securable_type` | string | Yes | Type of securable object. For catalogs, use 'catalog'. Valid values: catalog, schema, table, metastore, volume, function, model, connection, share, recipient, provider, storage_credential, external_location |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
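
The `max_results` rule (0 or >= 150, nothing in between) is unusual enough to be worth a guard. A sketch of the query construction, with the permissions path assumed from the Unity Catalog REST API:

```python
# Minimal sketch: query parameters for the grants lookup, enforcing the
# max_results rule from the table above.
def grants_request(securable_type, full_name, principal=None,
                   max_results=0, page_token=None):
    if 1 <= max_results <= 149:
        raise ValueError("max_results must be 0 (server default) or >= 150")
    params = {"max_results": max_results}
    if principal:
        params["principal"] = principal
    if page_token:
        params["page_token"] = page_token
    return (f"/api/2.1/unity-catalog/permissions/{securable_type}/{full_name}",
            params)

path, params = grants_request("catalog", "my_catalog",
                              principal="eng-team", max_results=150)
```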

### Get Effective Catalog Permissions

**Slug:** `DATABRICKS_CATALOG_GRANTS_GET_EFFECTIVE`

Tool to get effective permissions for a securable in Unity Catalog, including inherited permissions from parent securables. Use when you need to understand what privileges are granted to principals through direct assignments or inheritance. Returns privileges conveyed to each principal through the Unity Catalog hierarchy (metastore → catalog → schema → table/view/volume).

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `full_name` | string | Yes | Full name of the securable. For catalog: 'my_catalog', for table: 'catalog.schema.table' |
| `principal` | string | No | If provided, only the effective permissions for the specified principal (user email or group name) are returned |
| `page_token` | string | No | Opaque token for retrieving the next page of results |
| `max_results` | integer | No | Maximum number of privileges to return per page. Set to 0 to use server default. Valid values >= 150. Using pagination is recommended as unpaginated calls will be deprecated |
| `securable_type` | string | Yes | Type of securable object. Valid values: CATALOG, SCHEMA, TABLE, METASTORE, VOLUME, FUNCTION, MODEL, CONNECTION, SHARE, RECIPIENT, PROVIDER, STORAGE_CREDENTIAL, EXTERNAL_LOCATION |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Update Catalog Grants

**Slug:** `DATABRICKS_CATALOG_GRANTS_UPDATE`

Tool to update permissions for Unity Catalog securables by adding or removing privileges for principals. Use when you need to grant or revoke permissions on catalogs, schemas, tables, or other Unity Catalog objects. Only metastore admins, object owners, users with MANAGE privilege, or parent catalog/schema owners can update permissions.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `changes` | array | Yes | Array of permission changes to apply. Each change specifies a principal and which privileges to add or remove for that principal |
| `full_name` | string | Yes | Full name of the securable. For catalog: 'my_catalog', for schema: 'catalog.schema', for table: 'catalog.schema.table' |
| `securable_type` | string | Yes | Type of securable object. Valid values: CATALOG, SCHEMA, TABLE, VIEW, MATERIALIZED_VIEW, VOLUME, FUNCTION, PROCEDURE, METASTORE, CONNECTION, CLEAN_ROOM, EXTERNAL_LOCATION, EXTERNAL_METADATA, STORAGE_CREDENTIAL, SERVICE_CREDENTIAL, SHARE |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
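
The `changes` array is the core of this tool: each entry names a principal plus privileges to add and/or remove. A sketch of a well-formed payload; the principals and privilege names here are illustrative, not from the source:

```python
# Minimal sketch: validate and assemble the grants-update payload.
def validate_changes(changes):
    for change in changes:
        if "principal" not in change:
            raise ValueError("each change needs a principal")
        if not (change.get("add") or change.get("remove")):
            raise ValueError("each change needs privileges to add or remove")
    return {"changes": changes}

payload = validate_changes([
    {"principal": "data-engineers",          # a group
     "add": ["USE_CATALOG", "USE_SCHEMA", "SELECT"]},
    {"principal": "intern@example.com",      # a user
     "remove": ["MODIFY"]},
])
```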

### Assign Metastore to Workspace

**Slug:** `DATABRICKS_CATALOG_METASTORES_ASSIGN`

Tool to assign a Unity Catalog metastore to a workspace. Use when you need to link a workspace to a Unity Catalog metastore, enabling shared data access with consistent governance policies. Requires account admin privileges. If an assignment for the same workspace_id exists, it will be overwritten by the new metastore_id and default_catalog_name.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `metastore_id` | string | Yes | The unique ID of the metastore to assign. Unique identifier of the parent metastore to assign to the workspace |
| `workspace_id` | integer | Yes | The unique workspace identifier. The ID of the workspace to assign the metastore to |
| `default_catalog_name` | string | No | The name of the default catalog in the metastore. DEPRECATED: Use the Default Namespace API (databricks.DefaultNamespaceSetting) to configure the default catalog for a Databricks workspace instead |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
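
Because an assignment is keyed by workspace, the workspace ID addresses the resource and the metastore details form the body; overwriting an existing assignment is just issuing the call again. The path and body shape below are assumptions based on the Unity Catalog workspace-assignment API, with illustrative IDs:

```python
# Minimal sketch: PUT request assigning a metastore to a workspace.
def build_assign_metastore(workspace_id, metastore_id,
                           default_catalog_name=None):
    assignment = {"metastore_id": metastore_id}
    if default_catalog_name is not None:
        # Deprecated: prefer the Default Namespace API for this setting.
        assignment["default_catalog_name"] = default_catalog_name
    return ("PUT",
            f"/api/2.1/unity-catalog/workspaces/{int(workspace_id)}/metastore",
            {"metastore_assignment": assignment})

method, path, body = build_assign_metastore(
    6051921418418893, "11111111-2222-3333-4444-555555555555")
```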

### Create Metastore

**Slug:** `DATABRICKS_CATALOG_METASTORES_CREATE`

Tool to create a new Unity Catalog metastore. Use when you need to establish a top-level container for data in Unity Catalog, registering metadata about securable objects (tables, volumes, external locations, shares) and access permissions. Requires account admin privileges. By default, the owner is the user calling the API; setting owner to empty string assigns ownership to System User.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `name` | string | Yes | The user-specified name of the metastore. Must be unique within the region |
| `cloud` | string | No | Cloud provider identifier |
| `owner` | string | No | Username/groupname/service principal application_id of the metastore owner. Empty string assigns to System User |
| `region` | string | No | Cloud region which the metastore serves (e.g., us-west-2, westus) |
| `storage_root` | string | No | The storage root URL for metastore (supports S3, Azure Blob, GCS paths) |
| `delta_sharing_scope` | string ("INTERNAL" \| "INTERNAL_AND_EXTERNAL") | No | Scope of Delta Sharing enabled for the metastore |
| `storage_root_credential_id` | string | No | UUID of the storage credential used to access the metastore storage root |
| `default_data_access_config_id` | string | No | Default data access configuration ID |
| `delta_sharing_organization_name` | string | No | Organization name of a Delta Sharing entity for Databricks-to-Databricks Delta Sharing. Cannot be removed once set |
| `delta_sharing_recipient_token_lifetime_in_seconds` | integer | No | Lifetime of delta sharing recipient token in seconds. Set to 0 for unlimited |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get Current Metastore Assignment

**Slug:** `DATABRICKS_CATALOG_METASTORES_CURRENT`

Tool to retrieve the current metastore assignment for the workspace being accessed. Use when you need to determine which metastore is assigned to the current workspace context.

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Delete Metastore

**Slug:** `DATABRICKS_CATALOG_METASTORES_DELETE`

Tool to delete a Unity Catalog metastore. Use when you need to permanently remove a metastore and its managed data. Before deletion, you must delete or unlink any workspaces using the metastore. All objects managed by the metastore will become inaccessible. Requires metastore admin privileges.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `id` | string | Yes | Unique identifier of the metastore to delete. The caller must be a metastore admin |
| `force` | boolean | No | Force deletion even if the metastore is not empty. When set to true, allows deletion of metastores containing data assets like tables and views, bypassing the default safety check. Default is false |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get Metastore Details

**Slug:** `DATABRICKS_CATALOG_METASTORES_GET`

Retrieves comprehensive details about a Unity Catalog metastore by its unique ID. Returns metastore configuration including name, cloud provider, region, owner, storage settings, Delta Sharing configuration, privilege model version, and audit metadata (creation/update timestamps and users). Use this to inspect metastore properties, verify configurations, or gather information for metastore management operations. Note: Requires appropriate metastore access permissions.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `id` | string | Yes | The unique UUID of the metastore to retrieve. Use DATABRICKS_CATALOG_METASTORES_CURRENT to get the current workspace's metastore ID if needed. |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get Metastore Summary

**Slug:** `DATABRICKS_CATALOG_METASTORES_SUMMARY`

Tool to retrieve summary information about the metastore associated with the current workspace. Use when you need metastore configuration overview including cloud vendor, region, storage, and Delta Sharing details.

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Unassign Metastore from Workspace

**Slug:** `DATABRICKS_CATALOG_METASTORES_UNASSIGN`

Tool to unassign a Unity Catalog metastore from a workspace. Use when you need to remove the association between a workspace and its assigned metastore, leaving the workspace with no metastore. The metastore itself is not deleted, only the workspace assignment is removed. Requires account admin privileges.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `metastore_id` | string | Yes | The unique identifier of the Unity Catalog metastore to unassign from the workspace. This identifies which metastore assignment to remove |
| `workspace_id` | integer | Yes | The unique identifier of the workspace from which to unassign the metastore. This is the ID of the workspace to remove the metastore assignment from |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Update Metastore

**Slug:** `DATABRICKS_CATALOG_METASTORES_UPDATE`

Tool to update configuration settings for an existing Unity Catalog metastore. Use when you need to modify metastore properties like name, owner, Delta Sharing settings, or storage credentials. Requires metastore admin permissions.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `id` | string | Yes | Unique identifier of the metastore to update |
| `owner` | string | No | The owner of the metastore. If set to empty string, ownership is updated to the System User |
| `new_name` | string | No | New name for the metastore |
| `delta_sharing_scope` | string ("INTERNAL" \| "INTERNAL_AND_EXTERNAL") | No | The scope of Delta Sharing enabled for the metastore. INTERNAL allows sharing within the same organization only, INTERNAL_AND_EXTERNAL allows sharing with external organizations |
| `external_access_enabled` | boolean | No | Whether to enable external data access on the metastore. When true, allows the metastore to access external data sources |
| `privilege_model_version` | string | No | Privilege model version of the metastore, of the form major.minor (e.g., 1.0) |
| `storage_root_credential_id` | string | No | UUID of storage credential to access the metastore storage_root |
| `delta_sharing_organization_name` | string | No | The organization name of a Delta Sharing entity, to be used in Databricks-to-Databricks Delta Sharing as the official name |
| `delta_sharing_recipient_token_lifetime_in_seconds` | integer | No | The lifetime of delta sharing recipient token in seconds |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Update Metastore Assignment

**Slug:** `DATABRICKS_CATALOG_METASTORES_UPDATE_ASSIGNMENT`

Tool to update a metastore assignment for a workspace. Use when you need to update the metastore_id or default_catalog_name for a workspace that already has a metastore assigned. Account admin privileges are required to update metastore_id, while workspace admin can update default_catalog_name.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `metastore_id` | string | No | The unique ID of the metastore to assign. Updates the metastore assigned to the workspace. The caller must be an account admin to update this field |
| `workspace_id` | integer | Yes | The unique workspace identifier. A workspace ID to update the metastore assignment for |
| `default_catalog_name` | string | No | The name of the default catalog in the metastore. DEPRECATED: Use the Default Namespace API to configure the default catalog for a Databricks workspace instead |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get Model Version

**Slug:** `DATABRICKS_CATALOG_MODEL_VERSIONS_GET`

Tool to retrieve detailed information about a specific version of a registered model in Unity Catalog. Use when you need to get metadata, status, source location, and configuration of a model version. Requires metastore admin privileges, model ownership, or EXECUTE privilege on the registered model with appropriate catalog and schema privileges.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `version` | integer | Yes | The integer version number of the model version to retrieve |
| `full_name` | string | Yes | The three-level (fully qualified) name of the model (format: catalog.schema.model_name) |
| `include_browse` | boolean | No | Whether to include model versions for which the principal can only access selective metadata |
| `include_aliases` | boolean | No | Whether to include aliases associated with the model version |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
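
The `full_name` parameter must be a three-level name. A small, hypothetical helper to check the format before invoking the tool:

```python
# Hypothetical helper: validate the catalog.schema.model_name format expected
# by DATABRICKS_CATALOG_MODEL_VERSIONS_GET before calling the tool.
def split_model_full_name(full_name: str) -> tuple:
    """Return (catalog, schema, model_name), or raise if the name is malformed."""
    parts = full_name.split(".")
    if len(parts) != 3 or not all(parts):
        raise ValueError(f"expected catalog.schema.model_name, got {full_name!r}")
    return tuple(parts)

catalog, schema, model = split_model_full_name("main.ml.churn_model")
```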

### Update Model Version

**Slug:** `DATABRICKS_CATALOG_MODEL_VERSIONS_UPDATE`

Tool to update a Unity Catalog model version. Use when you need to modify the comment of a specific model version. Currently only the comment field can be updated. The caller must be a metastore admin or owner of the parent registered model with appropriate catalog and schema privileges.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `comment` | string | Yes | The comment to attach to the model version. This is currently the only field that can be updated. Provide the new comment value you want to set |
| `version` | integer | Yes | The integer version number of the model version to update |
| `full_name` | string | Yes | The three-level (fully qualified) name of the model (e.g., 'catalog.schema.model_name') |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Delete Online Table

**Slug:** `DATABRICKS_CATALOG_ONLINE_TABLES_DELETE`

Tool to delete an online table by name. Use when you need to permanently remove an online table and stop data synchronization. This operation deletes all data in the online table permanently and releases all resources. Note: online tables are deprecated and will not be accessible after January 15, 2026.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `name` | string | Yes | Full three-part (catalog, schema, table) name of the online table to delete. Format: 'catalog.schema.table_name'. The user must be the owner of both the offline table and online table to perform this operation |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get Quality Monitor

**Slug:** `DATABRICKS_CATALOG_QUALITY_MONITORS_GET`

Tool to retrieve quality monitor configuration for a Unity Catalog table. Use when you need to get monitor status, metrics tables, custom metrics, notifications, scheduling, and monitoring configuration details. Requires catalog and schema privileges plus SELECT on the table.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `table_name` | string | Yes | Unity Catalog table name in format 'catalog.schema.table_name'. Case insensitive, spaces disallowed |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### List Quality Monitor Refreshes

**Slug:** `DATABRICKS_CATALOG_QUALITY_MONITORS_LIST_REFRESHES`

Tool to retrieve the refresh history for a quality monitor on a Unity Catalog table. Use when you need to check the status and history of monitor refresh operations. Returns up to 25 most recent refreshes including their state, timing, and status messages.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `table_name` | string | Yes | Full name of the table in Unity Catalog format (catalog.schema.table_name). This is the table that has a quality monitor attached. Example: 'samples.nyctaxi.trips' |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get Registered Model

**Slug:** `DATABRICKS_CATALOG_REGISTERED_MODELS_GET`

Tool to retrieve detailed information about a registered model in Unity Catalog. Use when you need to get metadata, owner, storage location, and configuration of a registered model. Requires metastore admin privileges, model ownership, or EXECUTE privilege on the registered model with appropriate catalog and schema privileges.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `full_name` | string | Yes | The three-level (fully qualified) name of the registered model in the format catalog.schema.model_name |
| `include_browse` | boolean | No | Whether to include models where the principal has selective metadata access through the BROWSE privilege |
| `include_aliases` | boolean | No | Whether to include registered model aliases in the response |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get Resource Quota Information

**Slug:** `DATABRICKS_CATALOG_RESOURCE_QUOTAS_GET_QUOTA`

Tool to retrieve usage information for a Unity Catalog resource quota defined by a child-parent pair. Use when you need to check quota usage for a specific resource type (tables per metastore, schemas per catalog, etc.). The API also triggers an asynchronous refresh if the count is out of date. Requires account admin authentication with OAuth.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `quota_name` | string | Yes | Name of the quota: the child object type (table, schema, share, etc.) with '-quota' appended as a suffix. |
| `parent_full_name` | string | Yes | Full name of the parent resource. For example, the 'main.default' schema or 'samples' catalog. If the parent is a metastore, use the metastore ID. |
| `parent_securable_type` | string ("CATALOG" \| "SCHEMA" \| "METASTORE") | Yes | Securable type of the quota parent. Valid values: CATALOG, SCHEMA, METASTORE. |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
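
Following the '-quota' naming pattern described above, the input parameters might be assembled like this; the catalog name and quota name are illustrative, not verified against a live metastore.

```python
# Hypothetical helper for DATABRICKS_CATALOG_RESOURCE_QUOTAS_GET_QUOTA: the
# quota_name follows the '<child-type>-quota' pattern described above.
def quota_name_for(child_type: str) -> str:
    return f"{child_type}-quota"

params = {
    "parent_securable_type": "CATALOG",
    "parent_full_name": "samples",           # example catalog name
    "quota_name": quota_name_for("schema"),  # quota on schemas per catalog
}
```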

### Batch Create Access Requests

**Slug:** `DATABRICKS_CATALOG_RFA_BATCH_CREATE_ACCESS_REQUESTS`

Tool to batch create access requests for Unity Catalog permissions. Use when you need to request access to catalogs, schemas, tables, or other Unity Catalog securables. Maximum 30 requests per API call, and maximum 30 securables per principal per call.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `requests` | array | Yes | List of individual access requests. Maximum 30 requests per API call |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
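
Given the 30-requests-per-call limit, a larger request list has to be split across multiple calls. A sketch of that batching; the request dict shape is illustrative, not the tool's exact schema:

```python
# Split a large list of access requests into batches of at most 30, the
# per-call limit of DATABRICKS_CATALOG_RFA_BATCH_CREATE_ACCESS_REQUESTS.
MAX_REQUESTS_PER_CALL = 30

def batch_requests(requests: list) -> list:
    return [requests[i:i + MAX_REQUESTS_PER_CALL]
            for i in range(0, len(requests), MAX_REQUESTS_PER_CALL)]

batches = batch_requests([{"securable": f"main.sales.t{i}"} for i in range(65)])
```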

### Get Access Request Destinations

**Slug:** `DATABRICKS_CATALOG_RFA_GET_ACCESS_REQUEST_DESTS`

Tool to retrieve access request destinations for a Unity Catalog securable. Use when you need to find where notifications are sent when users request access to catalogs, schemas, tables, or other securables. Any caller can see URL destinations or destinations on the metastore. For other securables, only those with BROWSE permissions can see destinations.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `full_name` | string | Yes | The complete identifier path of the securable resource. For a catalog, this is the catalog name (e.g., 'samples'). For a schema, it's 'catalog_name.schema_name'. For a table, it's 'catalog_name.schema_name.table_name' |
| `securable_type` | string ("metastore" \| "catalog" \| "schema" \| "table" \| "external_location" \| "connection" \| "credential" \| "function" \| "registered_model" \| "volume") | Yes | The type of the securable object. Supported values: metastore, catalog, schema, table, external_location, connection, credential, function, registered_model, volume |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
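
The `full_name` is the dot-joined identifier path whose depth depends on the securable type, as described above. A small illustrative builder (the securable names are examples):

```python
# Illustrative parameter builder for DATABRICKS_CATALOG_RFA_GET_ACCESS_REQUEST_DESTS.
def dests_params(securable_type: str, *name_parts: str) -> dict:
    """Join the identifier path segments into the full_name the tool expects."""
    return {"securable_type": securable_type, "full_name": ".".join(name_parts)}

catalog_req = dests_params("catalog", "samples")
table_req = dests_params("table", "samples", "nyctaxi", "trips")
```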

### Update Access Request Destinations

**Slug:** `DATABRICKS_CATALOG_RFA_UPDATE_ACCESS_REQUEST_DESTS`

Tool to update access request notification destinations for Unity Catalog securables. Use when you need to configure where access request notifications are sent for catalogs, schemas, external locations, connections, or credentials. Requires metastore admin, owner privileges, or MANAGE permission on the securable. Maximum 5 emails and 5 external destinations allowed per securable. Note: Destinations cannot be updated for securables underneath schemas (tables, volumes, functions, models) as they inherit from parent securables.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `securable` | object | Yes | The securable object for which to update access request destinations |
| `update_mask` | string | No | Field mask indicating which fields to update. Use 'destinations' to update the destinations list |
| `destinations` | array | Yes | The list of notification destinations for access requests. Maximum 5 emails and 5 external notification destinations. If a URL destination is assigned, no other destinations can be set |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get Catalog Schema

**Slug:** `DATABRICKS_CATALOG_SCHEMAS_GET`

Tool to retrieve details of a specific schema from Unity Catalog metastore. Use when you need to get schema metadata, ownership, storage configuration, and properties. Requires metastore admin privileges, schema ownership, or USE_SCHEMA privilege.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `full_name` | string | Yes | Full name of the schema in the format 'catalog_name.schema_name' (e.g., 'main.analytics' or 'samples.nyctaxi') |
| `include_browse` | boolean | No | Whether to include schemas in the response for which the principal can only access selective metadata via BROWSE privilege |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Create Storage Credential

**Slug:** `DATABRICKS_CATALOG_STORAGE_CREDENTIALS_CREATE`

Tool to create a new storage credential in Unity Catalog for cloud data access. Use when you need to establish authentication for accessing cloud storage paths. Requires metastore admin or CREATE_STORAGE_CREDENTIAL privilege on the metastore. Exactly one cloud credential type must be provided.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `name` | string | Yes | The credential name. Must be unique among storage and service credentials within the metastore |
| `comment` | string | No | Comment associated with the credential |
| `read_only` | boolean | No | Whether the credential is usable only for read operations. Only applicable when purpose is STORAGE |
| `aws_iam_role` | object | No | AWS IAM role configuration for storage credential. |
| `skip_validation` | boolean | No | Supplying true skips validation of the created credential |
| `cloudflare_api_token` | object | No | Cloudflare API token configuration for storage credential. |
| `azure_managed_identity` | object | No | Azure managed identity configuration for storage credential. |
| `azure_service_principal` | object | No | Azure service principal configuration for storage credential. |
| `databricks_gcp_service_account` | object | No | GCP service account configuration for storage credential. |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Delete Storage Credential

**Slug:** `DATABRICKS_CATALOG_STORAGE_CREDENTIALS_DELETE`

Tool to delete a storage credential from the Unity Catalog metastore. Use when you need to remove storage credentials that provide authentication to cloud storage. The caller must be the owner of the storage credential. Use force=true to delete even if there are dependent external locations, tables, or services.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `name` | string | Yes | The name of the storage credential to delete. The caller must be the owner of the storage credential to perform this operation |
| `force` | boolean | No | Force deletion even if there are dependent external locations or external tables (when purpose is STORAGE) or dependent services (when purpose is SERVICE). When set to true, the operation will proceed despite any dependencies. When false or omitted, the deletion will fail if dependencies exist. Default: false |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get Storage Credential

**Slug:** `DATABRICKS_CATALOG_STORAGE_CREDENTIALS_GET`

Tool to retrieve storage credential details from Unity Catalog metastore by name. Use when you need to get information about a storage credential's configuration and properties. Requires metastore admin privileges, credential ownership, or appropriate permissions on the storage credential.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `name` | string | Yes | Name of the storage credential to retrieve. Must be unique within the metastore |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Update Storage Credential

**Slug:** `DATABRICKS_CATALOG_STORAGE_CREDENTIALS_UPDATE`

Tool to update an existing storage credential in Unity Catalog. Use when you need to modify credential properties, cloud provider configuration, or ownership. The caller must be the owner of the storage credential or a metastore admin. Metastore admins can only modify the owner field.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `name` | string | Yes | The name of the storage credential to update. This is used in the path parameter to identify the credential |
| `force` | boolean | No | Force update even if there are dependent external locations, external tables (when purpose is STORAGE), or dependent services (when purpose is SERVICE) |
| `owner` | string | No | Username, group name, or service principal application_id of the storage credential owner. Metastore admins can only change this field |
| `comment` | string | No | Comment associated with the credential |
| `new_name` | string | No | New name for the storage credential if renaming is desired |
| `read_only` | boolean | No | Whether the credential is usable only for read operations. Only applicable when purpose is STORAGE |
| `aws_iam_role` | object | No | AWS IAM role configuration for storage credential. |
| `isolation_mode` | string | No | Whether the current securable is accessible from all workspaces or a specific set. Use 'ISOLATION_MODE_ISOLATED' or 'ISOLATION_MODE_OPEN' |
| `skip_validation` | boolean | No | Skip validation of the updated credential |
| `cloudflare_api_token` | object | No | Cloudflare API token configuration for storage credential. |
| `azure_managed_identity` | object | No | Azure managed identity configuration for storage credential. |
| `azure_service_principal` | object | No | Azure service principal configuration for storage credential. |
| `databricks_gcp_service_account` | object | No | GCP service account configuration for storage credential. |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Validate Storage Credential

**Slug:** `DATABRICKS_CATALOG_STORAGE_CREDENTIALS_VALIDATE`

Tool to validate a storage credential configuration for Unity Catalog. Use when you need to verify that a storage credential can successfully access a cloud storage location. Requires metastore admin, storage credential owner, or CREATE_EXTERNAL_LOCATION privilege.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `url` | string | No | The external location URL to validate. At least one of external_location_name or url must be provided. If both are provided, url will be used for validation |
| `read_only` | boolean | No | Whether the storage credential is usable only for read operations |
| `aws_iam_role` | object | No | AWS IAM role configuration for storage credential validation. |
| `cloudflare_api_token` | object | No | Cloudflare API token configuration for storage credential validation. |
| `azure_managed_identity` | object | No | Azure managed identity configuration for storage credential validation. |
| `external_location_name` | string | No | Name of an existing external location to validate. At least one of external_location_name or url must be provided |
| `azure_service_principal` | object | No | Azure service principal configuration for storage credential validation. |
| `storage_credential_name` | string | No | Name of an existing storage credential to validate. Either this or a cloud-specific credential must be provided |
| `databricks_gcp_service_account` | object | No | GCP service account configuration for storage credential validation. |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Disable System Schema

**Slug:** `DATABRICKS_CATALOG_SYSTEM_SCHEMAS_DISABLE`

Tool to disable a system schema in Unity Catalog metastore. Use when you need to remove a system schema from the system catalog. System schemas store information about customer usage patterns such as audit logs, billing information, and lineage data. Requires account admin or metastore admin privileges.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `schema_name` | string | Yes | Full name of the system schema to disable. Valid system schema names include: access, billing, compute, storage, lineage, marketplace |
| `metastore_id` | string | Yes | The metastore ID under which the system schema lives. This is the unique identifier for the Unity Catalog metastore. The caller must be an account admin or metastore admin |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Enable System Schema

**Slug:** `DATABRICKS_CATALOG_SYSTEM_SCHEMAS_ENABLE`

Tool to enable a system schema in Unity Catalog metastore. Use when you need to activate a system schema to track customer usage patterns. System schemas store information about audit logs, billing, compute usage, storage, lineage, and marketplace data. Requires account admin or metastore admin privileges.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `schema_name` | string | Yes | Full name of the system schema to enable. Valid system schema names include: access, billing, compute, storage, lineage, marketplace, information_schema, operational_data |
| `catalog_name` | string | No | The catalog in which the system schema is to be enabled. Optional; use to target a specific catalog |
| `metastore_id` | string | Yes | The metastore ID under which the system schema lives. This is the unique identifier for the Unity Catalog metastore. The caller must be an account admin or metastore admin |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
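
A guard restricting `schema_name` to the values listed above can catch typos before the call; the metastore ID below is a placeholder.

```python
# Illustrative guard for DATABRICKS_CATALOG_SYSTEM_SCHEMAS_ENABLE.
VALID_SYSTEM_SCHEMAS = {
    "access", "billing", "compute", "storage",
    "lineage", "marketplace", "information_schema", "operational_data",
}

def enable_schema_payload(metastore_id: str, schema_name: str) -> dict:
    if schema_name not in VALID_SYSTEM_SCHEMAS:
        raise ValueError(f"unknown system schema: {schema_name}")
    return {"metastore_id": metastore_id, "schema_name": schema_name}

payload = enable_schema_payload("11111111-2222-3333-4444-555555555555", "billing")
```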

### Delete Catalog Table

**Slug:** `DATABRICKS_CATALOG_TABLES_DELETE`

Tool to delete a table from Unity Catalog. Use when you need to permanently remove a table from its parent catalog and schema. The operation requires appropriate permissions on the parent catalog, schema, and table.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `full_name` | string | Yes | Full three-level namespace name of the table to delete. Must follow the format: catalog_name.schema_name.table_name. The caller must be the parent catalog owner, have USE_CATALOG privilege on parent catalog AND be parent schema owner, or be the table owner AND have USE_CATALOG on parent catalog AND USE_SCHEMA on parent schema |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
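
The authorization rule in the `full_name` description is an OR of three permission combinations; expressed as a boolean check (the flags are hypothetical inputs, not part of the tool's schema):

```python
# The delete-table authorization rule described above, as a boolean check.
def may_delete_table(is_catalog_owner: bool, has_use_catalog: bool,
                     is_schema_owner: bool, is_table_owner: bool,
                     has_use_schema: bool) -> bool:
    return (is_catalog_owner
            or (has_use_catalog and is_schema_owner)
            or (is_table_owner and has_use_catalog and has_use_schema))
```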

### Check Table Exists

**Slug:** `DATABRICKS_CATALOG_TABLES_EXISTS`

Tool to check if a table exists in Unity Catalog metastore. Use when you need to verify table existence before performing operations. Requires metastore admin privileges, table ownership with SELECT privilege, or USE_CATALOG and USE_SCHEMA privileges on parent objects.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `full_name` | string | Yes | The complete three-level name of the table in the format 'catalog_name.schema_name.table_name' (e.g., 'samples.nyctaxi.trips'). The caller must be a metastore admin, table owner with SELECT privilege, or have USE_CATALOG and USE_SCHEMA privileges |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get Catalog Table Details

**Slug:** `DATABRICKS_CATALOG_TABLES_GET`

Tool to retrieve comprehensive metadata about a table from Unity Catalog metastore. Use when you need detailed table information including columns, type, storage, constraints, and governance metadata. Requires metastore admin privileges, table ownership, or SELECT privilege on the table, plus USE_CATALOG and USE_SCHEMA privileges on parent objects.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `full_name` | string | Yes | The three-level (fully qualified) name of the table in the format 'catalog_name.schema_name.table_name' (e.g., 'samples.nyctaxi.trips'). The caller must be a metastore admin, table owner, or have SELECT privilege on the table, plus USE_CATALOG and USE_SCHEMA privileges |
| `include_browse` | boolean | No | Whether to include tables in the response for which the principal can only access selective metadata through the BROWSE privilege |
| `include_delta_metadata` | boolean | No | Whether delta metadata should be included in the response. Delta metadata includes information about table version, data files, and statistics |
| `include_manifest_capabilities` | boolean | No | Whether to include a manifest containing table capabilities in the response. Capabilities describe what operations are supported on the table |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Update Catalog Table

**Slug:** `DATABRICKS_CATALOG_TABLES_UPDATE`

Tool to update Unity Catalog table properties. Use when you need to change the owner or comment of a table. The caller must be the owner of the parent catalog, have the USE_CATALOG privilege on the parent catalog and be the owner of the parent schema, or be the owner of the table and have the USE_CATALOG privilege on the parent catalog and the USE_SCHEMA privilege on the parent schema.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `owner` | string | No | Username of the new owner of the table. This is the primary field that can be updated via this endpoint |
| `comment` | string | No | User-provided free-form text description for the table |
| `full_name` | string | Yes | The three-level (fully qualified) name of the table in the format 'catalog_name.schema_name.table_name' (e.g., 'samples.nyctaxi.trips') |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Generate Temporary Path Credentials

**Slug:** `DATABRICKS_CATALOG_TEMP_PATH_CREDS_GENERATE_TEMP_PATH_CREDS`

Tool to generate short-lived, scoped temporary credentials for accessing external storage locations registered in Unity Catalog. Use when you need temporary access to cloud storage paths with specific read/write permissions. The credentials inherit the privileges of the requesting principal and are valid for a limited time. The requesting principal must have EXTERNAL USE LOCATION privilege on the external location.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `url` | string | Yes | The cloud storage path URL for which temporary credentials are requested. Supports s3:// for AWS S3, abfss:// for Azure Data Lake Storage, gs:// for GCP Cloud Storage, and r2:// for Cloudflare R2 |
| `dry_run` | boolean | No | When set to true, the service will not validate that the generated credentials can perform write operations. Default is false |
| `operation` | string ("PATH_READ" \| "PATH_READ_WRITE" \| "PATH_CREATE_TABLE") | Yes | The type of operation to be performed on the path. PATH_READ provides read-only access, PATH_READ_WRITE provides read and write access, and PATH_CREATE_TABLE provides permission to create tables at the specified path |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
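
Checking the URL scheme and operation against the values listed above before calling the tool can be sketched as follows; the bucket path is a placeholder.

```python
# Sketch of client-side validation for
# DATABRICKS_CATALOG_TEMP_PATH_CREDS_GENERATE_TEMP_PATH_CREDS.
SUPPORTED_SCHEMES = ("s3://", "abfss://", "gs://", "r2://")
VALID_OPERATIONS = ("PATH_READ", "PATH_READ_WRITE", "PATH_CREATE_TABLE")

def temp_creds_payload(url: str, operation: str = "PATH_READ") -> dict:
    if not url.startswith(SUPPORTED_SCHEMES):
        raise ValueError(f"unsupported storage scheme in {url!r}")
    if operation not in VALID_OPERATIONS:
        raise ValueError(f"invalid operation: {operation}")
    return {"url": url, "operation": operation}

payload = temp_creds_payload("s3://demo-bucket/raw/", "PATH_READ_WRITE")
```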

### Get Catalog Volume Details

**Slug:** `DATABRICKS_CATALOG_VOLUMES_READ`

Tool to retrieve detailed information about a specific Unity Catalog volume. Use when you need to get volume metadata including type, storage location, owner, and timestamps. Requires metastore admin privileges or volume ownership with appropriate USE_CATALOG and USE_SCHEMA privileges on parent objects.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `name` | string | Yes | The three-level (fully qualified) name of the volume in the format 'catalog_name.schema_name.volume_name' (e.g., 'samples.nyctaxi.test_volume'). The caller must be a metastore admin or own the volume (and have USE_CATALOG privilege on the parent catalog and USE_SCHEMA privilege on the parent schema) |
| `include_browse` | boolean | No | Whether to include volumes in the response for which the principal can only access selective metadata through the BROWSE privilege |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
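
A quick client-side check of the three-level `name` format can catch malformed input before calling the tool. This is a minimal illustrative sketch, not part of the API:

```python
import re

# A fully qualified volume name has exactly three dot-separated, non-empty levels:
# catalog_name.schema_name.volume_name
_VOLUME_NAME = re.compile(r"^[^.\s]+\.[^.\s]+\.[^.\s]+$")

def is_valid_volume_name(name: str) -> bool:
    """Return True only for names of the form 'catalog.schema.volume'."""
    return bool(_VOLUME_NAME.match(name))
```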

### Update Catalog Workspace Bindings

**Slug:** `DATABRICKS_CATALOG_WORKSPACE_BINDINGS_UPDATE_BINDINGS`

Tool to update workspace bindings for a Unity Catalog securable (catalog). Use when you need to control which workspaces can access a catalog. Allows adding or removing workspace bindings with read-write or read-only access. Caller must be a metastore admin or owner of the catalog.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `add` | array | No | List of workspace bindings to add or update. Each binding specifies a workspace ID and the access level (read-write or read-only) |
| `name` | string | Yes | The name of the catalog to update workspace bindings for |
| `remove` | array | No | List of workspace bindings to remove. Each binding specifies a workspace ID and the access level to remove |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
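
The `add` and `remove` arrays each hold binding objects pairing a workspace ID with an access level. The sketch below assembles such a payload; the `BINDING_TYPE_READ_ONLY` / `BINDING_TYPE_READ_WRITE` values are assumptions based on the Unity Catalog workspace-bindings API, and the catalog name and workspace IDs are hypothetical:

```python
def build_bindings_update(name: str, add=None, remove=None) -> dict:
    """Assemble the workspace-bindings update body, omitting empty lists."""
    body = {"name": name}
    if add:
        body["add"] = add
    if remove:
        body["remove"] = remove
    return body

# Grant workspace 1234 read-only access and revoke read-write access from 5678.
payload = build_bindings_update(
    "main",
    add=[{"workspace_id": 1234, "binding_type": "BINDING_TYPE_READ_ONLY"}],
    remove=[{"workspace_id": 5678, "binding_type": "BINDING_TYPE_READ_WRITE"}],
)
```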

### Get Clean Room Asset

**Slug:** `DATABRICKS_CLEANROOMS_CLEAN_ROOM_ASSETS_GET`

Tool to retrieve detailed information about a specific asset within a Databricks Clean Room. Use when you need to get metadata and configuration for clean room assets such as tables, views, notebooks, volumes, or foreign tables.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `asset_name` | string | Yes | The fully qualified name of the asset. For Unity Catalog assets, format is: shared_catalog.shared_schema.asset_name |
| `asset_type` | string ("FOREIGN_TABLE" | "NOTEBOOK_FILE" | "TABLE" | "VIEW" | "VOLUME") | Yes | The type of the asset. Supported values: FOREIGN_TABLE, NOTEBOOK_FILE, TABLE, VIEW, VOLUME |
| `clean_room_name` | string | Yes | Name of the clean room |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Create Clean Room Auto-Approval Rule

**Slug:** `DATABRICKS_CLEANROOMS_CLEAN_ROOM_AUTO_APPROVAL_RULES_CREATE`

Tool to create a new auto-approval rule for a Databricks Clean Room. Use when you need to automatically approve notebooks shared by other collaborators that meet specific criteria. In 2-person clean rooms, auto-approve notebooks from the other collaborator using author_collaborator_alias. In multi-collaborator clean rooms, use author_scope=ANY_AUTHOR to auto-approve from any author.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `author_scope` | string | No | Scope of authors whose notebooks will be auto-approved. Valid value: ANY_AUTHOR. Only one of author_collaborator_alias and author_scope can be set. Use this in clean rooms with more than two collaborators to auto-approve notebooks from any author |
| `clean_room_name` | string | Yes | The name of the clean room where the auto-approval rule will be created |
| `author_collaborator_alias` | string | No | Collaborator alias of the author whose notebooks will be auto-approved. Only one of author_collaborator_alias and author_scope can be set. Use this for 2-person clean rooms or to auto-approve notebooks from a specific collaborator |
| `runner_collaborator_alias` | string | No | Collaborator alias of the runner who will execute the auto-approved notebooks. This designates a single runner for the approved notebook |
| `rule_owner_collaborator_alias` | string | No | The owner of the rule to whom the rule applies. This identifies the collaborator who owns this auto-approval rule |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
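
Because `author_collaborator_alias` and `author_scope` are mutually exclusive, it is worth enforcing that constraint before calling the tool. The helper below is an illustrative sketch (the function name and the sample aliases are hypothetical); it rejects payloads that set both fields and rejects any `author_scope` other than the single documented value:

```python
def build_auto_approval_rule(clean_room_name, author_collaborator_alias=None,
                             author_scope=None, runner_collaborator_alias=None,
                             rule_owner_collaborator_alias=None):
    """Build the rule payload, enforcing the documented mutual exclusion."""
    if author_collaborator_alias is not None and author_scope is not None:
        raise ValueError("only one of author_collaborator_alias and author_scope can be set")
    if author_scope is not None and author_scope != "ANY_AUTHOR":
        raise ValueError("author_scope only accepts 'ANY_AUTHOR'")
    rule = {"clean_room_name": clean_room_name}
    optional = {
        "author_collaborator_alias": author_collaborator_alias,
        "author_scope": author_scope,
        "runner_collaborator_alias": runner_collaborator_alias,
        "rule_owner_collaborator_alias": rule_owner_collaborator_alias,
    }
    rule.update({k: v for k, v in optional.items() if v is not None})
    return rule
```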

### Create Clean Room

**Slug:** `DATABRICKS_CLEANROOMS_CLEAN_ROOMS_CREATE`

Tool to create a new Databricks Clean Room for secure data collaboration with specified collaborators. Use when you need to establish a collaborative environment for multi-party data analysis. This is an asynchronous operation; the clean room starts in PROVISIONING state and becomes ACTIVE when ready. Requires metastore admin privileges or CREATE_CLEAN_ROOM privilege on the metastore.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `name` | string | Yes | The name of the clean room. Cannot include spaces, periods, or forward slashes. Cannot be changed after creation |
| `comment` | string | No | User-provided free-form text description of the clean room |
| `remote_detailed_info` | object | Yes | Central clean room configuration including cloud provider, region, and collaborator details |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
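
Since the clean room `name` cannot include spaces, periods, or forward slashes and cannot be changed after creation, a pre-flight check is cheap insurance. A minimal sketch of that rule (the helper name is illustrative):

```python
# Characters the clean room name must not contain, per the constraint above.
_FORBIDDEN_CHARS = set(" ./")

def is_valid_clean_room_name(name: str) -> bool:
    """Non-empty and free of spaces, periods, and forward slashes."""
    return bool(name) and not (set(name) & _FORBIDDEN_CHARS)
```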

### Create Compute Cluster Policy

**Slug:** `DATABRICKS_COMPUTE_CLUSTER_POLICIES_CREATE`

Tool to create a new cluster policy with prescribed settings for controlling cluster creation. Use when you need to establish policies that govern cluster configurations. Only admin users can create cluster policies.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `name` | string | Yes | Cluster Policy name requested by the user. Must be unique within the workspace. Length must be between 1 and 100 characters |
| `libraries` | array | No | List of libraries to be automatically installed on clusters using this policy. Libraries are installed on next cluster restart. Maximum of 500 libraries allowed |
| `definition` | string | Yes | Policy definition document expressed in Databricks Cluster Policy Definition Language (JSON string). Cannot be used together with policy_family_id |
| `description` | string | No | Additional human-readable description of the cluster policy |
| `policy_family_id` | string | No | ID of the policy family. The cluster policy's definition inherits from the policy family's definition. Cannot be used together with definition parameter |
| `max_clusters_per_user` | integer | No | Maximum number of clusters per user that can be active using this policy. If not present, there is no maximum limit. If specified, value must be greater than zero |
| `policy_family_definition_overrides` | string | No | Policy definition JSON document expressed in Databricks Policy Definition Language. Allows customization of the policy definition inherited from the policy family. Policy rules specified here are merged into the inherited policy definition |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
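
Note that `definition` is a JSON *string*, not a nested object, so the policy document must be serialized before it is placed in the payload. The sketch below uses `fixed`, `range`, and `allowlist` rule types from the Databricks Cluster Policy Definition Language; the specific runtime version, node types, and limits are illustrative values, not recommendations:

```python
import json

# Illustrative policy: pin the runtime, cap auto-termination, restrict node types.
definition = {
    "spark_version": {"type": "fixed", "value": "13.3.x-scala2.12"},
    "autotermination_minutes": {"type": "range", "maxValue": 120, "defaultValue": 60},
    "node_type_id": {"type": "allowlist", "values": ["i3.xlarge", "i3.2xlarge"]},
}

payload = {
    "name": "cost-controlled-policy",
    "definition": json.dumps(definition),  # the tool expects a JSON string here
    "max_clusters_per_user": 3,
}
```

Because `definition` and `policy_family_id` are mutually exclusive, a payload should contain one or the other, never both.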

### Delete Compute Cluster Policy

**Slug:** `DATABRICKS_COMPUTE_CLUSTER_POLICIES_DELETE`

Tool to delete a cluster policy. Use when you need to remove a cluster policy from the workspace. Clusters governed by this policy can still run, but cannot be edited. Only workspace admin users can delete policies. This operation is permanent and cannot be undone.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `policy_id` | string | Yes | The ID of the policy to delete. Canonical unique identifier for the cluster policy |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Edit Compute Cluster Policy

**Slug:** `DATABRICKS_COMPUTE_CLUSTER_POLICIES_EDIT`

Tool to update an existing Databricks cluster policy. Use when you need to modify policy settings like name, definition, or restrictions. Note that this operation may make some clusters governed by the previous policy invalid.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `name` | string | Yes | Cluster Policy name requested by the user. Must be unique within the workspace. Length must be between 1 and 100 characters |
| `libraries` | array | No | A list of libraries to be installed on the next cluster restart that uses this policy. The maximum number of libraries is 500 |
| `policy_id` | string | Yes | The ID of the policy to update. This is a required parameter |
| `definition` | string | No | Policy definition document expressed in Databricks Cluster Policy Definition Language (JSON string). Required unless policy_family_id is provided. Cannot be used with policy_family_id. |
| `description` | string | No | Additional human-readable description of the cluster policy |
| `policy_family_id` | string | No | ID of the policy family. The cluster policy's policy definition inherits the policy family's policy definition. Cannot be used with definition |
| `max_clusters_per_user` | integer | No | Maximum number of clusters per user that can be active using this policy. If not present, there is no maximum limit. If specified, value must be greater than zero |
| `policy_family_definition_overrides` | string | No | Policy definition JSON document for customizing inherited policy family definitions |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get Compute Cluster Policy

**Slug:** `DATABRICKS_COMPUTE_CLUSTER_POLICIES_GET`

Tool to retrieve detailed information about a specific cluster policy by its ID. Use when you need to view the configuration and settings of an existing cluster policy.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `policy_id` | string | Yes | Canonical unique identifier for the cluster policy to retrieve |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get Compute Cluster Policy Permission Levels

**Slug:** `DATABRICKS_COMPUTE_CLUSTER_POLICIES_GET_PERM_LEVELS`

Tool to retrieve available permission levels for a Databricks cluster policy. Use when you need to understand what permission levels can be assigned to users, groups, or service principals for a specific cluster policy. Returns permission levels like CAN_USE with their descriptions.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `cluster_policy_id` | string | Yes | The unique identifier of the cluster policy for which to retrieve available permission levels |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get Compute Cluster Policy Permissions

**Slug:** `DATABRICKS_COMPUTE_CLUSTER_POLICIES_GET_PERMS`

Tool to retrieve permissions for a Databricks cluster policy. Use when you need to check who has access to a specific cluster policy and their permission levels. Returns the access control list with user, group, and service principal permissions, including inherited permissions from parent objects.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `cluster_policy_id` | string | Yes | The unique identifier of the cluster policy for which to retrieve permissions |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Set Compute Cluster Policy Permissions

**Slug:** `DATABRICKS_COMPUTE_CLUSTER_POLICIES_SET_PERMS`

Tool to set permissions for a Databricks cluster policy, replacing all existing permissions. Use when you need to configure access control for a cluster policy. This operation replaces ALL existing permissions; non-admin users must be granted permissions to access the policy. Workspace admins always have permissions on all policies.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `cluster_policy_id` | string | Yes | The unique identifier of the cluster policy for which to set permissions |
| `access_control_list` | array | Yes | Array of access control entries. This replaces ALL existing permissions. For each entry, only one of user_name, group_name, or service_principal_name should be specified |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
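
Each entry in `access_control_list` must name exactly one principal. The sketch below builds entries that satisfy that rule; the helper name, principal names, and the `CAN_USE` level (the level this catalog mentions for cluster policies) are illustrative:

```python
def acl_entry(permission_level, user_name=None, group_name=None,
              service_principal_name=None):
    """Build one ACL entry; exactly one principal field may be set."""
    principals = {
        "user_name": user_name,
        "group_name": group_name,
        "service_principal_name": service_principal_name,
    }
    provided = {k: v for k, v in principals.items() if v is not None}
    if len(provided) != 1:
        raise ValueError(
            "specify exactly one of user_name, group_name, service_principal_name")
    entry = dict(provided)
    entry["permission_level"] = permission_level
    return entry

# Remember: this call REPLACES all existing permissions on the policy.
access_control_list = [
    acl_entry("CAN_USE", group_name="data-engineers"),
    acl_entry("CAN_USE", user_name="jane@example.com"),
]
```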

### Update Cluster Policy Permissions

**Slug:** `DATABRICKS_COMPUTE_CLUSTER_POLICIES_UPDATE_PERMS`

Tool to incrementally update permissions on a Databricks cluster policy. Use when you need to modify permissions without replacing the entire permission set. This PATCH operation updates only the specified permissions, preserving existing permissions not included in the request.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `cluster_policy_id` | string | Yes | The cluster policy resource identifier for which permissions are being modified |
| `access_control_list` | array | Yes | Collection of access control entries defining permissions to assign. This operation updates permissions incrementally without replacing existing permissions. For each entry, exactly one of user_name, group_name, or service_principal_name must be specified |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get Compute Cluster Permission Levels

**Slug:** `DATABRICKS_COMPUTE_CLUSTERS_GET_PERMISSION_LEVELS`

Tool to retrieve available permission levels for a Databricks compute cluster. Use when you need to understand what permission levels can be assigned to users, groups, or service principals for a specific cluster. Returns permission levels like CAN_ATTACH_TO, CAN_RESTART, and CAN_MANAGE with their descriptions.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `cluster_id` | string | Yes | The unique identifier of the cluster for which to retrieve available permission levels |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### List Compute Cluster Node Types

**Slug:** `DATABRICKS_COMPUTE_CLUSTERS_LIST_NODE_TYPES`

Tool to list all supported Spark node types available for cluster launch in the workspace region. Use when you need to determine which instance types are available for creating or configuring clusters. Returns detailed specifications including compute resources, storage capabilities, and cloud-specific attributes for each node type.

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### List Compute Cluster Availability Zones

**Slug:** `DATABRICKS_COMPUTE_CLUSTERS_LIST_ZONES`

Tool to list availability zones where Databricks clusters can be created. Use when you need to determine available zones for cluster deployment or planning redundancy. Returns the default zone and a list of all zones available in the workspace's cloud region. This endpoint is available for AWS workspaces.

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Permanently Delete Compute Cluster

**Slug:** `DATABRICKS_COMPUTE_CLUSTERS_PERMANENT_DELETE`

Tool to permanently delete a Databricks compute cluster. Use when you need to irreversibly remove a cluster and its resources. After permanent deletion, the cluster will no longer appear in the cluster list and cannot be recovered.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `cluster_id` | string | Yes | The unique identifier of the cluster to permanently delete. This operation is irreversible and will asynchronously remove all cluster resources |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Pin Compute Cluster

**Slug:** `DATABRICKS_COMPUTE_CLUSTERS_PIN`

Tool to pin a Databricks compute cluster configuration. Use when you need to preserve a cluster's configuration beyond the standard 30-day retention period. This operation is idempotent - pinning an already-pinned cluster has no effect. Requires workspace administrator privileges.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `cluster_id` | string | Yes | The unique identifier of the cluster to pin. Pinning a cluster ensures its configuration is retained beyond the standard 30-day period for terminated clusters and is always returned by the List Clusters API |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### List Compute Cluster Spark Versions

**Slug:** `DATABRICKS_COMPUTE_CLUSTERS_SPARK_VERSIONS`

Tool to list all available Databricks Runtime Spark versions for cluster creation. Use when you need to determine which Spark versions are available for creating or configuring clusters. The 'key' field from the response should be used as the 'spark_version' parameter when creating clusters.

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Start Compute Cluster

**Slug:** `DATABRICKS_COMPUTE_CLUSTERS_START`

Tool to start a terminated Databricks compute cluster asynchronously. Use when you need to restart a stopped cluster. The cluster transitions through PENDING state before reaching RUNNING. Poll cluster status to verify when fully started.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `cluster_id` | string | Yes | The unique identifier of the cluster to start. This cluster must be in a TERMINATED state. The cluster will preserve its previous configuration and cluster ID when started. |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
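
Because start is asynchronous, callers typically poll cluster state until it leaves PENDING. The loop below is a generic polling sketch: the state getter is injected (here it would wrap a cluster-status call), and the state names follow the PENDING/RUNNING lifecycle described above — the tolerated intermediate states are an assumption:

```python
import time

def wait_for_running(get_state, timeout_s=600, poll_interval_s=5, sleep=time.sleep):
    """Poll an injected state getter until the cluster reports RUNNING."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        state = get_state()
        if state == "RUNNING":
            return state
        if state not in ("PENDING", "RESTARTING", "TERMINATED"):
            raise RuntimeError(f"unexpected cluster state: {state}")
        sleep(poll_interval_s)
    raise TimeoutError("cluster did not reach RUNNING in time")
```

Injecting `get_state` and `sleep` keeps the loop testable without a live workspace.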

### Unpin Compute Cluster

**Slug:** `DATABRICKS_COMPUTE_CLUSTERS_UNPIN`

Tool to unpin a Databricks compute cluster configuration. Use when you need to allow a cluster's configuration to be removed after termination. This operation is idempotent - unpinning an already-unpinned cluster has no effect. Requires workspace administrator privileges.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `cluster_id` | string | Yes | The unique identifier of the cluster to unpin. Unpinning a cluster allows it to be automatically removed from the List Clusters API after termination and eventual deletion after 30 days |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Update Compute Cluster

**Slug:** `DATABRICKS_COMPUTE_CLUSTERS_UPDATE`

Tool to partially update a Databricks compute cluster configuration using field masks. Use when you need to update specific cluster attributes without providing a full configuration. The update_mask specifies which fields to modify. Running clusters restart to apply changes; terminated clusters apply changes on next startup.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `autoscale` | object | No | Autoscaling configuration for cluster workers. |
| `policy_id` | string | No | References cluster policy for validation and defaults. Only applied if 'policy_id' is in update_mask. |
| `cluster_id` | string | Yes | The unique identifier of the cluster to update. The cluster must be in RUNNING or TERMINATED state. |
| `spark_conf` | object | No | Spark configuration key-value pairs. Only applied if 'spark_conf' is in update_mask. |
| `custom_tags` | object | No | Additional tags for cluster resources (max 45 tags). Only applied if 'custom_tags' is in update_mask. |
| `num_workers` | integer | No | Number of worker nodes. Use 0 for single-node clusters. Only applied if 'num_workers' is in update_mask. |
| `update_mask` | string | Yes | Comma-separated list of field paths to update (e.g., 'cluster_name', 'num_workers,autotermination_minutes'). Use dot notation for nested fields. Field paths are relative to the cluster resource. |
| `cluster_name` | string | No | Cluster name requested by the user. Only applied if 'cluster_name' is in update_mask. |
| `docker_image` | object | No | Docker container configuration. |
| `init_scripts` | array | No | Scripts executing during cluster startup. Only applied if 'init_scripts' is in update_mask. |
| `node_type_id` | string | No | Instance type for worker nodes. Only applied if 'node_type_id' is in update_mask. |
| `spark_version` | string | No | The Spark version of the cluster (e.g., '11.3.x-scala2.12', '12.2.x-scala2.12'). Only applied if 'spark_version' is in update_mask. |
| `workload_type` | object | No | Workload type restrictions for cluster usage. |
| `aws_attributes` | object | No | AWS-specific cluster configuration. |
| `gcp_attributes` | object | No | GCP-specific cluster configuration. |
| `runtime_engine` | string ("STANDARD" | "PHOTON") | No | Runtime engine type. Only applied if 'runtime_engine' is in update_mask. |
| `spark_env_vars` | object | No | Environment variables for Spark processes. Only applied if 'spark_env_vars' is in update_mask. |
| `ssh_public_keys` | array | No | List of SSH public keys (up to 10). Only applied if 'ssh_public_keys' is in update_mask. |
| `azure_attributes` | object | No | Azure-specific cluster configuration. |
| `cluster_log_conf` | object | No | Cluster log delivery configuration. |
| `instance_pool_id` | string | No | Instance pool identifier for worker nodes. Only applied if 'instance_pool_id' is in update_mask. |
| `single_user_name` | string | No | Required when using SINGLE_USER security mode. Only applied if 'single_user_name' is in update_mask. |
| `data_security_mode` | string ("SINGLE_USER" | "USER_ISOLATION" | "LEGACY_TABLE_ACL" | "LEGACY_PASSTHROUGH" | "LEGACY_SINGLE_USER") | No | Access control mode for the cluster. Only applied if 'data_security_mode' is in update_mask. |
| `driver_node_type_id` | string | No | The node type of the Spark driver. Only applied if 'driver_node_type_id' is in update_mask. |
| `enable_elastic_disk` | boolean | No | Enables autoscaling storage, monitors free disk space. Only applied if 'enable_elastic_disk' is in update_mask. |
| `autotermination_minutes` | integer | No | Automatically terminate the cluster after it is inactive for this time in minutes. Only applied if 'autotermination_minutes' is in update_mask. |
| `driver_instance_pool_id` | string | No | Instance pool identifier for driver node. Only applied if 'driver_instance_pool_id' is in update_mask. |
| `apply_policy_default_values` | boolean | No | When true, fixed and default values from the policy will be used for omitted fields. Only applied if 'apply_policy_default_values' is in update_mask. |
| `enable_local_disk_encryption` | boolean | No | Encrypts temporary cluster storage using LUKS. Only applied if 'enable_local_disk_encryption' is in update_mask. |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
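
The field-mask pattern means a field is only applied if its name also appears in `update_mask`. Deriving the mask from the fields being sent keeps the two in sync automatically. A minimal sketch (the helper name and cluster ID are hypothetical):

```python
def build_cluster_update(cluster_id: str, fields: dict) -> dict:
    """Build the tool input; update_mask is derived from the fields provided,
    so no field can be silently ignored for missing a mask entry."""
    return {
        "cluster_id": cluster_id,
        "update_mask": ",".join(fields),
        **fields,
    }

payload = build_cluster_update(
    "0101-120000-abcd1234",  # hypothetical cluster ID
    {"num_workers": 4, "autotermination_minutes": 30},
)
```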

### Create Global Init Script

**Slug:** `DATABRICKS_COMPUTE_GLOBAL_INIT_SCRIPTS_CREATE`

Tool to create a new global initialization script in Databricks workspace. Use when you need to run scripts on every node in every cluster. Global init scripts run on all cluster nodes and only workspace admins can create them. Scripts execute in position order and clusters must restart to apply changes. The script cannot exceed 64KB when decoded.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `name` | string | Yes | The name of the script. Must be provided and should be descriptive of the script's purpose |
| `script` | string | Yes | The Base64-encoded content of the script. The decoded script content must not exceed 64KB in size |
| `enabled` | boolean | No | Specifies whether the script is enabled. The script runs only if enabled. If not specified, the default behavior applies |
| `position` | integer | No | The position of the global init script, where 0 is the first script to run and 1 the second, in ascending order. If omitted, the script is placed last (after all current scripts); any value greater than the last script's position is treated as the last position. If an explicit position conflicts with an existing script's position, the request still succeeds, but the original script at that position and all later scripts have their positions incremented by 1 |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
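
Since `script` must be Base64-encoded and its decoded form must not exceed 64KB, both steps are easy to get wrong by hand. The sketch below encodes and size-checks a script before building the payload; the script name and contents are illustrative:

```python
import base64

MAX_DECODED_BYTES = 64 * 1024  # decoded script content must not exceed 64KB

def encode_init_script(script_text: str) -> str:
    """UTF-8 encode, enforce the 64KB decoded-size limit, then Base64 encode."""
    raw = script_text.encode("utf-8")
    if len(raw) > MAX_DECODED_BYTES:
        raise ValueError(f"script is {len(raw)} bytes; limit is {MAX_DECODED_BYTES}")
    return base64.b64encode(raw).decode("ascii")

payload = {
    "name": "install-monitoring-agent",  # hypothetical script name
    "script": encode_init_script("#!/bin/bash\necho 'node ready'\n"),
    "enabled": True,
    "position": 0,  # run before all other global init scripts
}
```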

### Delete Global Init Script

**Slug:** `DATABRICKS_COMPUTE_GLOBAL_INIT_SCRIPTS_DELETE`

Tool to delete a global initialization script from Databricks workspace. Use when you need to remove a script that runs on every cluster node. Requires workspace administrator privileges. Clusters must restart to reflect the removal of the script.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `script_id` | string | Yes | The unique identifier of the global init script to delete. This is the ID returned when the script was created |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get Global Init Script

**Slug:** `DATABRICKS_COMPUTE_GLOBAL_INIT_SCRIPTS_GET`

Tool to retrieve complete details of a global initialization script in Databricks workspace. Use when you need to view script configuration, Base64-encoded content, or metadata. Returns all script details including creation/update timestamps and whether the script is enabled.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `script_id` | string | Yes | The unique identifier of the global init script to retrieve. This is the ID returned when the script was created |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Update Global Init Script

**Slug:** `DATABRICKS_COMPUTE_GLOBAL_INIT_SCRIPTS_UPDATE`

Tool to update a global initialization script in Databricks workspace. Use when you need to modify script content, name, enabled status, or execution order. All fields are optional; unspecified fields retain their current value. Existing clusters must be restarted to pick up changes.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `name` | string | No | The name of the script. It should be unique |
| `script` | string | No | The Base64-encoded content of the script |
| `enabled` | boolean | No | Specifies whether the script is enabled. The script runs only if enabled |
| `position` | integer | No | The position of a script, where 0 represents the first script to run, 1 is the second script to run, in ascending order. If position is >= the position of the last script, the script will be placed at the end |
| `script_id` | string | Yes | The unique identifier of the global init script to update. This is the ID returned when the script was created |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether or not the action execution was successful or not |
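
Since unspecified fields are left untouched and the `script` field must be Base64-encoded, a client typically assembles the request body conditionally. The sketch below illustrates this under stated assumptions: `build_update_payload` and the example IDs are illustrative helpers, not part of the tool itself; only the field names come from the parameter table above.

```python
import base64
import json

def build_update_payload(script_id, name=None, script_text=None, enabled=None, position=None):
    """Build a request body for the global init script update call.

    Only explicitly provided fields are included, so omitted fields
    keep their current value on the server.
    """
    payload = {"script_id": script_id}
    if name is not None:
        payload["name"] = name
    if script_text is not None:
        # The `script` field expects Base64-encoded content.
        payload["script"] = base64.b64encode(script_text.encode("utf-8")).decode("ascii")
    if enabled is not None:
        payload["enabled"] = enabled
    if position is not None:
        payload["position"] = position
    return payload

body = build_update_payload("0123456789ABCDEF", script_text="#!/bin/bash\necho init", enabled=True)
print(json.dumps(body))
```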

### Create Compute Instance Pool

**Slug:** `DATABRICKS_COMPUTE_INSTANCE_POOLS_CREATE`

Tool to create a new Databricks instance pool with specified configuration. Use when you need to set up a pool that reduces cluster start and auto-scaling times by maintaining idle, ready-to-use cloud instances. When attached to a pool, a cluster allocates driver and worker nodes from the pool.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `disk_spec` | object | No | Specification of disks attached to all spark containers. |
| `custom_tags` | object | No | Additional tags for pool resources. Databricks will tag all pool resources with these tags in addition to default_tags. Currently allows at most 43 to 45 custom tags, depending on the cloud provider |
| `max_capacity` | integer | No | Maximum number of outstanding instances to keep in the pool, including both instances used by clusters and idle instances. Clusters that require further instance provisioning will fail during upsize requests |
| `node_type_id` | string | Yes | Encodes the resources available to each Spark node (e.g., i3.xlarge, m5.large). A list of available node types can be retrieved using the List Node Types API |
| `aws_attributes` | object | No | AWS-specific attributes for the instance pool. |
| `gcp_attributes` | object | No | GCP-specific attributes for the instance pool. |
| `azure_attributes` | object | No | Azure-specific attributes for the instance pool. |
| `instance_pool_name` | string | Yes | Pool name requested by the user. Pool name must be unique. Length must be between 1 and 100 characters |
| `min_idle_instances` | integer | No | Minimum number of idle instances to keep in the instance pool. This is in addition to any instances in use by active clusters |
| `enable_elastic_disk` | boolean | No | Autoscaling Local Storage: when enabled, instances in this pool will dynamically acquire additional disk space when Spark workers are running low on disk space. In AWS, requires specific AWS permissions |
| `remote_disk_throughput` | integer | No | GCP HYPERDISK throughput in Mb/s |
| `preloaded_docker_images` | array | No | Custom Docker Image BYOC. Pool-backed clusters can use one of the preloaded images if it matches the docker_image specified in the create-cluster request |
| `preloaded_spark_versions` | array | No | A list of preloaded Spark image versions for the pool. Pool-backed clusters started with the preloaded Spark version will start faster. Up to one runtime version can be pre-installed on instances |
| `instance_pool_fleet_attributes` | object | No | Fleet-related settings to power the instance pool. |
| `total_initial_remote_disk_size` | integer | No | Initial remote disk volume size in GB |
| `idle_instance_autotermination_minutes` | integer | Yes | Automatically terminates extra instances in the pool after they are inactive for this time in minutes if min_idle_instances requirement is met. Threshold must be between 0 and 10000 minutes. If 0, excess idle instances are removed as soon as possible |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether or not the action execution was successful or not |
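
The parameter table above carries several numeric constraints (name length 1-100, autotermination 0-10000 minutes) that are worth checking client-side before sending the request. A minimal sketch, assuming only the documented constraints; `validate_pool_config` and the example values (`etl-pool`, `i3.xlarge`) are hypothetical:

```python
def validate_pool_config(cfg):
    """Check the documented create-pool constraints before sending a request."""
    name = cfg["instance_pool_name"]
    assert 1 <= len(name) <= 100, "pool name must be 1-100 characters"
    assert "node_type_id" in cfg, "node_type_id is required"
    minutes = cfg["idle_instance_autotermination_minutes"]
    assert 0 <= minutes <= 10000, "autotermination must be 0-10000 minutes"
    if "min_idle_instances" in cfg and "max_capacity" in cfg:
        assert cfg["min_idle_instances"] <= cfg["max_capacity"], "min idle exceeds max capacity"
    return cfg

pool = validate_pool_config({
    "instance_pool_name": "etl-pool",              # must be unique in the workspace
    "node_type_id": "i3.xlarge",                   # example AWS node type
    "idle_instance_autotermination_minutes": 30,   # 0 removes excess idle instances ASAP
    "min_idle_instances": 1,
    "max_capacity": 20,
    "custom_tags": {"team": "data-eng"},
})
```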

### Delete Compute Instance Pool

**Slug:** `DATABRICKS_COMPUTE_INSTANCE_POOLS_DELETE`

Tool to delete a Databricks compute instance pool. Use when you need to permanently remove an instance pool. The idle instances in the pool are terminated asynchronously after deletion.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `instance_pool_id` | string | Yes | The unique identifier of the instance pool to delete. The idle instances in the pool are terminated asynchronously |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether or not the action execution was successful or not |

### Edit Compute Instance Pool

**Slug:** `DATABRICKS_COMPUTE_INSTANCE_POOLS_EDIT`

Tool to modify the configuration of an existing Databricks instance pool. Use when you need to update pool settings like capacity, termination minutes, or preloaded images. Note that the pool's node type cannot be changed after creation, though it must still be provided with the same value.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `disk_spec` | object | No | Specification of disks attached to all spark containers. |
| `custom_tags` | object | No | Additional tags for pool resources. Maximum 45 custom tags allowed |
| `max_capacity` | integer | No | Maximum number of outstanding instances (active + idle) to keep in the pool |
| `node_type_id` | string | Yes | Encodes resources available to each Spark node (e.g., i3.xlarge). Must match existing node type as it cannot be changed after pool creation |
| `aws_attributes` | object | No | AWS-specific attributes for the instance pool. |
| `gcp_attributes` | object | No | GCP-specific attributes for the instance pool. |
| `azure_attributes` | object | No | Azure-specific attributes for the instance pool. |
| `instance_pool_id` | string | Yes | Instance pool ID that identifies the pool to be modified |
| `instance_pool_name` | string | Yes | Pool name requested by the user. Must be unique, non-empty, length between 1-100 characters |
| `min_idle_instances` | integer | No | Minimum number of idle instances to keep in the pool. Default: 0 |
| `enable_elastic_disk` | boolean | No | When enabled, instances dynamically acquire additional disk space when running low |
| `preloaded_docker_images` | array | No | Custom Docker images to preload on pool instances |
| `preloaded_spark_versions` | array | No | List of Spark versions to preload for faster cluster starts |
| `instance_pool_fleet_attributes` | object | No | Fleet-related settings to power the instance pool. |
| `idle_instance_autotermination_minutes` | integer | No | Automatically terminates extra idle instances after inactive for this time in minutes. Range: 0-10000. Value of 0 instantly removes idle instances |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether or not the action execution was successful or not |

### Get Instance Pool Details

**Slug:** `DATABRICKS_COMPUTE_INSTANCE_POOLS_GET`

Tool to retrieve detailed information about a Databricks instance pool by its ID. Use when you need to get instance pool configuration, capacity settings, preloaded images, and usage statistics. Instance pools reduce cluster start and auto-scaling times by maintaining idle, ready-to-use cloud instances.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `instance_pool_id` | string | Yes | The canonical unique identifier for the instance pool |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether or not the action execution was successful or not |

### Get Instance Pool Permissions

**Slug:** `DATABRICKS_COMPUTE_INSTANCE_POOLS_GET_PERMISSIONS`

Tool to retrieve permissions for a Databricks instance pool. Use when you need to check who has access to a specific instance pool and their permission levels. Returns the access control list with user, group, and service principal permissions, including inherited permissions from parent objects.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `instance_pool_id` | string | Yes | The unique identifier of the instance pool for which to retrieve permissions |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether or not the action execution was successful or not |

### Get Instance Pool Permission Levels

**Slug:** `DATABRICKS_COMPUTE_INSTANCE_POOLS_GET_PERM_LEVELS`

Tool to retrieve available permission levels for a Databricks instance pool. Use when you need to understand what permission levels can be assigned to users, groups, or service principals for a specific instance pool. Returns permission levels like CAN_ATTACH_TO and CAN_MANAGE with their descriptions.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `instance_pool_id` | string | Yes | The canonical unique identifier for the instance pool for which to retrieve available permission levels |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether or not the action execution was successful or not |

### Set Compute Instance Pool Permissions

**Slug:** `DATABRICKS_COMPUTE_INSTANCE_POOLS_SET_PERMISSIONS`

Tool to set permissions for a Databricks instance pool, replacing all existing permissions. Use when you need to configure access control for an instance pool. This operation replaces ALL existing permissions. You must have CAN_MANAGE permission on a pool to configure its permissions.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `instance_pool_id` | string | Yes | The unique identifier of the instance pool for which to set permissions |
| `access_control_list` | array | Yes | Array of access control entries. This replaces ALL existing permissions. For each entry, only one of user_name, group_name, or service_principal_name should be specified |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether or not the action execution was successful or not |

### Update Instance Pool Permissions

**Slug:** `DATABRICKS_COMPUTE_INSTANCE_POOLS_UPDATE_PERMS`

Tool to incrementally update permissions on a Databricks instance pool. Use when you need to modify permissions without replacing the entire permission set. This PATCH operation updates only the specified permissions, preserving existing permissions not included in the request.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `instance_pool_id` | string | Yes | The instance pool resource identifier for which permissions are being modified |
| `access_control_list` | array | Yes | Collection of access control entries defining permissions to assign. This operation updates permissions incrementally without replacing existing permissions. For each entry, exactly one of user_name, group_name, or service_principal_name must be specified |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether or not the action execution was successful or not |
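
The two permission tools differ in semantics: Set replaces the entire ACL, while Update merges entries into the existing set. Both require that each entry name exactly one principal. A sketch of building entries with that check enforced; `acl_entry` and the example principals are illustrative, and the entry shape (principal field plus `permission_level`) is an assumption based on the descriptions above:

```python
def acl_entry(permission_level, user_name=None, group_name=None, service_principal_name=None):
    """Build one access control entry for the set/update permission tools.

    Exactly one principal field may be set per entry.
    """
    principals = {
        "user_name": user_name,
        "group_name": group_name,
        "service_principal_name": service_principal_name,
    }
    provided = {k: v for k, v in principals.items() if v is not None}
    if len(provided) != 1:
        raise ValueError("specify exactly one of user_name, group_name, service_principal_name")
    entry = dict(provided)
    entry["permission_level"] = permission_level
    return entry

# Set (replace-all) would send exactly this list; Update (incremental)
# would merge these entries into the pool's existing permissions.
acl = [
    acl_entry("CAN_MANAGE", group_name="platform-admins"),
    acl_entry("CAN_ATTACH_TO", user_name="analyst@example.com"),
]
```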

### Add Compute Instance Profile

**Slug:** `DATABRICKS_COMPUTE_INSTANCE_PROFILES_ADD`

Tool to register an instance profile in Databricks for cluster launches. Use when administrators need to grant users permission to launch clusters using that profile. Requires admin access. Successfully registered profiles enable clusters to use the associated IAM role.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `iam_role_arn` | string | No | The AWS IAM role ARN of the role associated with the instance profile, formatted as 'arn:aws:iam::<account-id>:role/<name>'. Required if your role name and instance profile name do not match AND you want to use the instance profile with Databricks SQL Serverless |
| `skip_validation` | boolean | No | By default, Databricks validates that it has sufficient permissions to launch instances with the instance profile using AWS dry-run mode for the RunInstances API. If validation fails with an error message that does not indicate an IAM related permission issue, you can pass this flag to skip the validation and forcibly add the instance profile |
| `instance_profile_arn` | string | Yes | The AWS ARN of the instance profile to register with Databricks. This is the ARN attribute of the EC2 instance profile association to AWS IAM role |
| `is_meta_instance_profile` | boolean | No | Flag indicating whether the instance profile should only be used in credential passthrough scenarios. When true, this means the instance profile contains a meta IAM role which could assume a wide range of roles, and therefore should always be used with authorization. Defaults to false |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether or not the action execution was successful or not |
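
The tool takes two distinct ARNs: the instance profile ARN (required) and the IAM role ARN (conditionally required), which are easy to confuse. A sketch of shape-checking both before calling the tool, assuming the standard `arn:aws:iam::<account-id>:...` format from the description (partitioned ARNs such as GovCloud use a different prefix); `check_profile_request` and the example account ID are hypothetical:

```python
import re

# Patterns for the documented ARN formats (12-digit AWS account id).
INSTANCE_PROFILE_ARN = re.compile(r"^arn:aws:iam::\d{12}:instance-profile/.+$")
IAM_ROLE_ARN = re.compile(r"^arn:aws:iam::\d{12}:role/.+$")

def check_profile_request(instance_profile_arn, iam_role_arn=None):
    """Validate ARN shapes before calling the add/edit instance profile tools."""
    if not INSTANCE_PROFILE_ARN.match(instance_profile_arn):
        raise ValueError(f"not an instance profile ARN: {instance_profile_arn}")
    if iam_role_arn is not None and not IAM_ROLE_ARN.match(iam_role_arn):
        raise ValueError(f"not an IAM role ARN: {iam_role_arn}")
    return True

check_profile_request(
    "arn:aws:iam::123456789012:instance-profile/databricks-ec2",
    "arn:aws:iam::123456789012:role/databricks-ec2",
)
```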

### Edit Compute Instance Profile

**Slug:** `DATABRICKS_COMPUTE_INSTANCE_PROFILES_EDIT`

Tool to modify an existing AWS EC2 instance profile registered with Databricks. Use when you need to update the IAM role ARN associated with an instance profile. This operation is only available to admin users. The IAM role ARN is required if both of the following are true: your role name and instance profile name do not match, and you want to use the instance profile with Databricks SQL Serverless.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `iam_role_arn` | string | No | The AWS IAM role ARN of the role associated with the instance profile. This field is required if your role name and instance profile name do not match and you want to use the instance profile with Databricks SQL Serverless. Otherwise, this field is optional |
| `instance_profile_arn` | string | Yes | The AWS ARN of the instance profile to edit. This identifies which instance profile to modify |
| `is_meta_instance_profile` | boolean | No | Boolean flag indicating whether the instance profile should only be used in credential passthrough scenarios. If true, it means the instance profile contains a meta IAM role which could assume a wide range of roles. Therefore it should always be used with authorization. This field is optional, the default value is false |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether or not the action execution was successful or not |

### Remove Compute Instance Profile

**Slug:** `DATABRICKS_COMPUTE_INSTANCE_PROFILES_REMOVE`

Tool to remove an instance profile from Databricks. Use when you need to unregister an AWS instance profile ARN from Databricks. This operation is only accessible to admin users. Existing clusters with this instance profile will continue to function normally after removal.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `instance_profile_arn` | string | Yes | The AWS ARN of the instance profile to remove from Databricks. This field is required. Must be a valid AWS IAM instance profile ARN. Existing clusters with this instance profile will continue to function normally after removal |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether or not the action execution was successful or not |

### Enforce Cluster Policy Compliance

**Slug:** `DATABRICKS_COMPUTE_POLICY_COMPL_FOR_CLUSTERS_ENFORCE_COMPL`

Tool to update a cluster to be compliant with the current version of its policy. Use when you need to enforce policy compliance on a cluster. The cluster can be updated if it is in a RUNNING or TERMINATED state. Note: Clusters created by Databricks Jobs, DLT, or Models cannot be enforced by this API.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `cluster_id` | string | Yes | The ID of the cluster you want to enforce policy compliance on. The cluster must be in a RUNNING or TERMINATED state. If RUNNING, the cluster will be restarted to apply changes. If TERMINATED, changes will take effect on next startup. |
| `validate_only` | boolean | No | If set to true, previews the changes that would be made to enforce compliance without actually updating the cluster. This allows you to see what changes would occur without applying them. Defaults to false. |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether or not the action execution was successful or not |

### Get Cluster Policy Compliance

**Slug:** `DATABRICKS_COMPUTE_POLICY_COMPL_FOR_CLUSTERS_GET_COMPL`

Tool to retrieve policy compliance status for a specific cluster. Use when you need to check whether a cluster meets the requirements of its assigned policy and identify any policy violations. Clusters could be out of compliance if their policy was updated after the cluster was last edited.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `cluster_id` | string | Yes | The unique identifier of the cluster for which to retrieve compliance status |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether or not the action execution was successful or not |

### Get Compute Policy Families

**Slug:** `DATABRICKS_COMPUTE_POLICY_FAMILIES_GET`

Tool to retrieve information for a policy family by identifier and optional version. Use when you need to view Databricks-provided templates for configuring clusters for a particular use case. Policy families cannot be created, edited, or deleted by users.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `version` | integer | No | The version number for the family to fetch. Defaults to the latest version if not specified |
| `policy_family_id` | string | Yes | The unique identifier for the policy family to retrieve |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether or not the action execution was successful or not |

### Create Databricks Cluster

**Slug:** `DATABRICKS_CREATE_CLUSTER`

Tool to create a new Databricks Spark cluster with specified configuration. Use when you need to provision compute resources for data processing. This is an asynchronous operation that returns a cluster_id immediately with the cluster in PENDING state. The cluster transitions through states until reaching RUNNING.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `autoscale` | object | No | Autoscaling configuration for cluster workers. |
| `policy_id` | string | No | Cluster policy ID for validation and defaults |
| `clone_from` | string | No | Source cluster ID to clone configuration and libraries from |
| `spark_conf` | object | No | Spark configuration key-value pairs |
| `custom_tags` | object | No | Additional tags for cluster resources. Maximum 45 tags |
| `num_workers` | integer | No | Number of worker nodes. Mutually exclusive with autoscale. Use 0 for single-node clusters |
| `cluster_name` | string | No | Cluster name requested by the user. Does not need to be unique |
| `docker_image` | object | No | Docker container configuration. |
| `init_scripts` | array | No | Initialization scripts executed sequentially during cluster startup |
| `node_type_id` | string | No | Instance type for worker nodes. Required unless using instance_pool_id |
| `spark_version` | string | Yes | The Spark version of the cluster. Retrieve available versions via clusters/spark-versions API |
| `workload_type` | object | No | Workload type restrictions for cluster usage. |
| `aws_attributes` | object | No | AWS-specific cluster configuration. |
| `cluster_source` | string | No | Cluster creation source (UI, JOB, API) |
| `gcp_attributes` | object | No | GCP-specific cluster configuration. |
| `runtime_engine` | string ("STANDARD" \| "PHOTON") | No | Runtime engine type |
| `spark_env_vars` | object | No | Environment variables for Spark driver and worker processes |
| `ssh_public_keys` | array | No | SSH public keys for ubuntu user access on port 2200. Maximum 10 keys |
| `azure_attributes` | object | No | Azure-specific cluster configuration. |
| `cluster_log_conf` | object | No | Cluster log delivery configuration. |
| `instance_pool_id` | string | No | Instance pool identifier for worker nodes. Alternative to node_type_id |
| `single_user_name` | string | No | Username when using SINGLE_USER security mode |
| `idempotency_token` | string | No | Optional token for creation idempotency. Maximum 64 characters |
| `data_security_mode` | string ("SINGLE_USER" \| "USER_ISOLATION" \| "LEGACY_TABLE_ACL" \| "LEGACY_PASSTHROUGH" \| "LEGACY_SINGLE_USER" \| "NONE") | No | Data governance model for the cluster |
| `driver_node_type_id` | string | No | The node type of the Spark driver. Defaults to node_type_id if unspecified |
| `enable_elastic_disk` | boolean | No | Autoscale EBS/disk volumes when low on space |
| `autotermination_minutes` | integer | No | Automatically terminate the cluster after it is inactive for this time in minutes. Must be between 10-10000 minutes, or 0 to disable |
| `driver_instance_pool_id` | string | No | Instance pool identifier for driver node |
| `apply_policy_default_values` | boolean | No | When true, fixed and default values from the policy will be used for omitted fields |
| `enable_local_disk_encryption` | boolean | No | Enable LUKS encryption on local disks for temporary cluster storage |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether or not the action execution was successful or not |
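
Note that `num_workers` and `autoscale` are mutually exclusive sizing options, which is a common source of request errors. A sketch of assembling a request body with that rule enforced; `build_cluster_spec` is an illustrative helper and the runtime string and node type are example values, not defaults:

```python
def build_cluster_spec(spark_version, *, num_workers=None, autoscale=None, **extra):
    """Assemble a create-cluster request body.

    num_workers and autoscale are mutually exclusive; exactly one sizing
    option must be given (num_workers=0 creates a single-node cluster).
    """
    if (num_workers is None) == (autoscale is None):
        raise ValueError("specify exactly one of num_workers or autoscale")
    spec = {"spark_version": spark_version, **extra}
    if num_workers is not None:
        spec["num_workers"] = num_workers
    else:
        spec["autoscale"] = autoscale
    return spec

spec = build_cluster_spec(
    "15.4.x-scala2.12",                       # example runtime; list via the spark-versions API
    autoscale={"min_workers": 2, "max_workers": 8},
    node_type_id="i3.xlarge",
    autotermination_minutes=60,               # 10-10000 minutes, or 0 to disable
)
```

The returned `cluster_id` arrives immediately with the cluster in PENDING state, so callers typically poll cluster state separately until it reaches RUNNING.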

### Create Genie Message

**Slug:** `DATABRICKS_DASHBOARDS_GENIE_CREATE_MESSAGE`

Tool to create a message in a Genie conversation and get AI-generated responses. Use when you need to ask questions or send messages to Genie for data analysis. The response initially has status 'IN_PROGRESS' and should be polled every 1-5 seconds until reaching COMPLETED, FAILED, or CANCELLED status. Subject to 5 queries-per-minute rate limit during Public Preview.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `content` | string | Yes | User message content - the question or message to send to Genie |
| `space_id` | string | Yes | The ID associated with the Genie space where the conversation exists |
| `conversation_id` | string | Yes | The ID associated with the conversation |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether or not the action execution was successful or not |
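
Because the response starts in `IN_PROGRESS` and must be polled every 1-5 seconds, a client usually wraps message creation in a polling loop. A minimal sketch, assuming only the statuses named above; `wait_for_genie_message` is a hypothetical helper, and `get_status` stands in for whatever call fetches the current status (e.g. the Get Genie Message tool):

```python
import time

TERMINAL_STATUSES = {"COMPLETED", "FAILED", "CANCELLED"}

def wait_for_genie_message(get_status, poll_seconds=2.0, timeout_seconds=120.0):
    """Poll a message's status until Genie finishes processing it.

    get_status is any callable returning the current status string.
    """
    deadline = time.monotonic() + timeout_seconds
    while time.monotonic() < deadline:
        status = get_status()
        if status in TERMINAL_STATUSES:
            return status
        time.sleep(poll_seconds)  # docs suggest polling every 1-5 seconds
    raise TimeoutError("Genie message did not reach a terminal status in time")

# Stubbed example: pretend the first two polls are still in progress.
statuses = iter(["IN_PROGRESS", "IN_PROGRESS", "COMPLETED"])
result = wait_for_genie_message(lambda: next(statuses), poll_seconds=0, timeout_seconds=5)
```

Keep the 5 queries-per-minute rate limit in mind when choosing the poll interval and when issuing new messages in a loop.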

### Create Genie Space

**Slug:** `DATABRICKS_DASHBOARDS_GENIE_CREATE_SPACE`

Tool to create a new Genie space from a serialized payload for programmatic space management. Use when you need to create a Genie workspace for AI-powered data analysis. The space requires a SQL warehouse ID and a serialized configuration with at minimum a version field (1 or 2). Optionally specify data source tables to enable querying specific datasets.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `description` | string | No | Optional description of the Genie space to provide context about its purpose and contents. |
| `parent_path` | string | No | Parent folder path where the space will be registered in the workspace. If not specified, defaults to the user's home directory. |
| `display_name` | string | No | Display name for the Genie space. If not provided, the API will generate a default name like 'New Space'. This name is shown in the UI. |
| `warehouse_id` | string | Yes | The ID of the SQL warehouse to associate with the new Genie space. This warehouse will be used to execute queries in the space. You must have at least CAN USE permission on the warehouse. Note: The warehouse_id is stored but not validated during space creation. |
| `serialized_space` | string | Yes | The contents of the Genie Space in serialized JSON string form. Must include a 'version' field (1 or 2). Optionally include 'data_sources' with a 'tables' array containing table identifiers in catalog.schema.table format. The API will normalize version 1 to version 2. Minimal valid example: '{"version":1}'. For production use, include data_sources to specify which tables the Genie space can query. |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether or not the action execution was successful or not |
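
The trickiest input here is `serialized_space`, which is a JSON string rather than a JSON object. A sketch of constructing it, assuming the minimal shape described above (a `version` field, optionally `data_sources.tables` with `catalog.schema.table` identifiers); the exact per-table entry shape and the helper name `make_serialized_space` are assumptions:

```python
import json

def make_serialized_space(tables=None, version=1):
    """Build the serialized_space JSON string for space creation.

    The only required field is version (1 or 2); the API normalizes
    version 1 to version 2.
    """
    space = {"version": version}
    if tables:
        space["data_sources"] = {"tables": list(tables)}
    return json.dumps(space)

minimal = make_serialized_space()                        # '{"version": 1}'
with_tables = make_serialized_space(["main.sales.orders"])
```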

### Delete Genie Conversation

**Slug:** `DATABRICKS_DASHBOARDS_GENIE_DELETE_CONVERSATION`

Tool to delete a conversation from a Genie space programmatically. Use when you need to remove conversations to manage the Genie space limits (10,000 conversations per space). Useful for deleting older or test conversations that are no longer needed.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `space_id` | string | Yes | The unique identifier of the Genie space containing the conversation to delete |
| `conversation_id` | string | Yes | The unique identifier of the conversation to delete |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether or not the action execution was successful or not |

### Delete Genie Conversation Message

**Slug:** `DATABRICKS_DASHBOARDS_GENIE_DELETE_CONV_MESSAGE`

Tool to delete a specific message from a Genie conversation. Use when you need to remove individual messages from conversations. This operation permanently deletes the message and cannot be undone.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `space_id` | string | Yes | The ID associated with the Genie space where the message is located |
| `message_id` | string | Yes | The ID associated with the message to delete |
| `conversation_id` | string | Yes | The ID associated with the conversation |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether or not the action execution was successful or not |

### Execute Message Attachment Query

**Slug:** `DATABRICKS_DASHBOARDS_GENIE_EXECUTE_MESSAGE_ATTACH_QUERY`

Tool to execute SQL query for an expired message attachment in a Genie space. Use when a query attachment has expired and needs to be re-executed to retrieve fresh results. Returns SQL statement execution results with schema, metadata, and data.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `space_id` | string | Yes | Genie space ID - The unique identifier for the Genie space |
| `message_id` | string | Yes | Message ID - The unique identifier for the message within the conversation |
| `attachment_id` | string | Yes | Attachment ID - The unique identifier for the query attachment associated with the message |
| `conversation_id` | string | Yes | Conversation ID - The unique identifier for the conversation within the space |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether or not the action execution was successful or not |

### Execute Genie Message Query

**Slug:** `DATABRICKS_DASHBOARDS_GENIE_EXECUTE_MESSAGE_QUERY`

Execute the SQL query associated with a Genie message and retrieve result data. DEPRECATED: use Execute Message Attachment Query (`execute_message_attachment_query`) instead, which requires an `attachment_id` in addition to `space_id`, `conversation_id`, and `message_id`. Use this action to run the query generated by Genie AI for a specific message. Returns query execution results including data rows, execution status, and schema information, which is useful for re-executing queries when needed.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `space_id` | string | Yes | The ID associated with the Genie space where the conversation is located |
| `message_id` | string | Yes | The ID associated with the message containing the query to execute |
| `conversation_id` | string | Yes | The ID associated with the target conversation |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get Genie Message

**Slug:** `DATABRICKS_DASHBOARDS_GENIE_GET_MESSAGE`

Tool to retrieve details of a specific message from a Genie conversation. Use when you need to get message content, status, attachments, or check processing status of a previously created message.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `space_id` | string | Yes | The ID associated with the Genie space where the target conversation is located |
| `message_id` | string | Yes | The ID associated with the target message from the identified conversation |
| `conversation_id` | string | Yes | The ID associated with the target conversation |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get Genie Message Attachment Query Result

**Slug:** `DATABRICKS_DASHBOARDS_GENIE_GET_MESSAGE_ATTACH_QUERY_RESULT`

Tool to retrieve SQL query results from a Genie message attachment. Use when the message status is EXECUTING_QUERY or COMPLETED and you need to fetch the actual query execution results. Returns statement execution details including query data, schema, and metadata with a maximum of 5000 rows.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `space_id` | string | Yes | The Genie space ID |
| `message_id` | string | Yes | The message ID |
| `attachment_id` | string | Yes | The query attachment ID from the message response |
| `conversation_id` | string | Yes | The conversation ID |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get Genie Message Query Result

**Slug:** `DATABRICKS_DASHBOARDS_GENIE_GET_MESSAGE_QUERY_RESULT`

Tool to retrieve SQL query execution results for a Genie message (up to 5000 rows). Use when message status is EXECUTING_QUERY or COMPLETED and the message has a query attachment. Returns query results with schema, metadata, and data in inline or external link format.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `space_id` | string | Yes | The ID associated with the Genie space where the conversation is located |
| `message_id` | string | Yes | The ID associated with the message that has a query attachment |
| `conversation_id` | string | Yes | The ID associated with the conversation |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get Genie Message Query Result by Attachment (Deprecated)

**Slug:** `DATABRICKS_DASHBOARDS_GENIE_GET_MESSAGE_QUERY_RESULT_ATTACH`

Tool to retrieve SQL query execution results for a message attachment in a Genie space conversation. Use when you need to fetch query results from a Genie conversation message. Note: this endpoint is deprecated; consider using Get Genie Message Attachment Query Result instead. Returns results only when the message status is EXECUTING_QUERY or COMPLETED. Maximum 5,000 rows per result.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `space_id` | string | Yes | The Genie space ID |
| `message_id` | string | Yes | The message ID containing the query attachment |
| `attachment_id` | string | Yes | The attachment ID referencing the query results |
| `conversation_id` | string | Yes | The conversation ID |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get Genie Space Details

**Slug:** `DATABRICKS_DASHBOARDS_GENIE_GET_SPACE`

Tool to retrieve detailed information about a specific Databricks Genie space by ID. Use when you need to get configuration details, metadata, and optionally the serialized space content for backup or promotion across workspaces. Requires at least CAN EDIT permission to retrieve the serialized space content.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `space_id` | string | Yes | The unique identifier for the Genie space to retrieve |
| `include_serialized_space` | boolean | No | When set to true, includes the serialized string representation of the Genie space in the response, including instructions, benchmarks, joins, and other configuration details. Requires at least CAN EDIT permission on the space. Default: false |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### List Genie Conversations

**Slug:** `DATABRICKS_DASHBOARDS_GENIE_LIST_CONVERSATIONS`

Tool to retrieve all existing conversation threads within a Genie space. Use when you need to view conversations in a Genie space, either for the current user or all users if you have CAN MANAGE permission. Supports pagination for spaces with many conversations.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `space_id` | string | Yes | The unique identifier of the Genie space containing the conversations to retrieve |
| `page_size` | integer | No | Maximum number of conversations to return per page. Use for pagination to limit the number of results returned. |
| `page_token` | string | No | Token to get the next page of results. Use the next_page_token from a previous response to retrieve the next page. |
| `include_all` | boolean | No | Include all conversations in the space across all users. Requires at least CAN MANAGE permission on the space. If not set, only returns conversations created by the requesting user. |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
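
The pagination pattern above (pass `next_page_token` from one response as `page_token` in the next request) can be sketched as follows. `fetch_page` is a hypothetical callable standing in for invoking the `DATABRICKS_DASHBOARDS_GENIE_LIST_CONVERSATIONS` tool; the response field names are assumptions based on the parameter descriptions above.

```python
def list_all_conversations(fetch_page, space_id, page_size=50):
    """Collect every conversation by following next_page_token.

    fetch_page is a hypothetical stand-in for the
    DATABRICKS_DASHBOARDS_GENIE_LIST_CONVERSATIONS tool; it is assumed
    to return a dict with 'conversations' and an optional
    'next_page_token'.
    """
    conversations, token = [], None
    while True:
        page = fetch_page(space_id=space_id, page_size=page_size,
                          page_token=token)
        conversations.extend(page.get("conversations", []))
        token = page.get("next_page_token")
        if not token:  # no token means this was the last page
            return conversations


# Simulated two-page response, for illustration only.
_pages = {
    None: {"conversations": [{"id": "c1"}, {"id": "c2"}],
           "next_page_token": "t2"},
    "t2": {"conversations": [{"id": "c3"}]},
}

def fake_fetch(space_id, page_size, page_token):
    return _pages[page_token]

all_convs = list_all_conversations(fake_fetch, "space-123")
```

The loop terminates when a page omits `next_page_token`, which is the usual convention for token-based pagination.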

### List Genie Conversation Messages

**Slug:** `DATABRICKS_DASHBOARDS_GENIE_LIST_CONV_MESSAGES`

Tool to retrieve all messages from a specific conversation thread in a Genie space. Use when you need to view the complete message history of a conversation including user queries and AI responses. Supports pagination for conversations with many messages.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `space_id` | string | Yes | The unique identifier of the Genie space where the conversation is located |
| `page_size` | integer | No | Maximum number of messages to return per page. Use for pagination to limit the number of results returned. |
| `page_token` | string | No | Token to get the next page of results. Use the next_page_token from a previous response to retrieve subsequent pages. |
| `conversation_id` | string | Yes | The unique identifier of the conversation thread to list messages from |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### List Genie Spaces

**Slug:** `DATABRICKS_DASHBOARDS_GENIE_LIST_SPACES`

Tool to retrieve all Genie spaces in the workspace that the authenticated user has access to. Use when you need to list available Genie spaces, their metadata, and warehouse associations. Supports pagination for workspaces with many spaces.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `page_size` | integer | No | Maximum number of spaces to return per page. Used for pagination to control the number of results returned in a single request. |
| `page_token` | string | No | Pagination token for getting the next page of results. Use the next_page_token value from the previous response to retrieve the next page. |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Send Genie Message Feedback

**Slug:** `DATABRICKS_DASHBOARDS_GENIE_SEND_MESSAGE_FEEDBACK`

Tool to send feedback for a Genie message. Use when you need to provide positive, negative, or no feedback rating for AI-generated messages in Genie conversations. Positive feedback on responses that join tables or use SQL expressions can prompt Genie to suggest new SQL snippets to space managers for review and approval.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `rating` | string ("POSITIVE" \| "NEGATIVE" \| "NONE") | Yes | The feedback rating. POSITIVE indicates a positive feedback rating, NEGATIVE indicates a negative feedback rating, NONE indicates no feedback rating provided |
| `space_id` | string | Yes | The ID associated with the Genie space where the message is located |
| `message_id` | string | Yes | The ID associated with the message to provide feedback for |
| `conversation_id` | string | Yes | The ID associated with the conversation |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
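
A minimal sketch of validating the `rating` enum before calling the tool; `build_feedback_payload` is a hypothetical helper, not part of the tool itself.

```python
VALID_RATINGS = {"POSITIVE", "NEGATIVE", "NONE"}

def build_feedback_payload(space_id, conversation_id, message_id, rating):
    # Reject values outside the enum documented for the `rating` parameter.
    if rating not in VALID_RATINGS:
        raise ValueError(
            f"rating must be one of {sorted(VALID_RATINGS)}, got {rating!r}")
    return {
        "space_id": space_id,
        "conversation_id": conversation_id,
        "message_id": message_id,
        "rating": rating,
    }

payload = build_feedback_payload("s1", "c1", "m1", "POSITIVE")
```

Validating client-side turns a malformed request into an immediate, descriptive error instead of an API round trip.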

### Start Genie Conversation

**Slug:** `DATABRICKS_DASHBOARDS_GENIE_START_CONVERSATION`

Tool to start a new Genie conversation in a Databricks space for natural language data queries. Use when you need to ask questions about data using natural language. The message is processed asynchronously, so the initial status will be IN_PROGRESS. Poll the message status to get the completed response with query results.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `content` | string | Yes | The text of the message that starts the conversation. This is the user's initial question or prompt |
| `space_id` | string | Yes | The ID of the Genie space where you want to start a conversation |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
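
The poll-until-complete flow described above can be sketched like this. `get_message` is a hypothetical callable standing in for the `DATABRICKS_DASHBOARDS_GENIE_GET_MESSAGE` tool, and the `status` field shape is an assumption based on the IN_PROGRESS/COMPLETED states mentioned in these descriptions.

```python
import time

def wait_for_message(get_message, space_id, conversation_id, message_id,
                     poll_interval=0.0, max_polls=30):
    """Poll until the message leaves IN_PROGRESS or the budget is spent.

    get_message is a hypothetical stand-in for the
    DATABRICKS_DASHBOARDS_GENIE_GET_MESSAGE tool.
    """
    for _ in range(max_polls):
        msg = get_message(space_id=space_id,
                          conversation_id=conversation_id,
                          message_id=message_id)
        if msg["status"] != "IN_PROGRESS":
            return msg
        time.sleep(poll_interval)
    raise TimeoutError("message did not complete within the polling budget")


# Stub that completes on the third poll, for illustration only.
_statuses = iter(["IN_PROGRESS", "IN_PROGRESS", "COMPLETED"])

def fake_get_message(space_id, conversation_id, message_id):
    return {"status": next(_statuses)}

result = wait_for_message(fake_get_message, "s1", "c1", "m1")
```

In a real integration you would use a non-zero `poll_interval` and treat any terminal status other than COMPLETED as a failure to inspect.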

### Trash Genie Space

**Slug:** `DATABRICKS_DASHBOARDS_GENIE_TRASH_SPACE`

Tool to move a Genie space to trash instead of permanently deleting it. Use when you need to remove a Genie space while retaining recovery options. Trashed spaces follow standard Databricks trash behavior with 30-day retention before permanent deletion. Requires CAN MANAGE permission on the space.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `space_id` | string | Yes | The unique identifier of the Genie space to move to trash. This ID uniquely identifies the Genie space within the workspace |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Update Genie Space

**Slug:** `DATABRICKS_DASHBOARDS_GENIE_UPDATE_SPACE`

Tool to update an existing Genie space configuration. Use when you need to modify a Genie space's title, description, warehouse assignment, or complete serialized configuration. Supports partial updates (only provide fields you want to change) or full replacement via serialized_space. Useful for CI/CD pipelines, version control, and automated space management.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `id` | string | Yes | The unique identifier for the Genie space to update. This is the space_id from the list spaces API response. |
| `title` | string | No | Updated title/display name for the Genie Space. If provided, this will override the existing title. |
| `description` | string | No | Updated description of the Genie Space. Describes the space's purpose, data sources, and intended use cases. |
| `warehouse_id` | string | No | Updated SQL warehouse ID to associate with the Genie Space. If provided, this warehouse will be used for executing queries in this space. |
| `serialized_space` | string | No | The complete serialized configuration of the Genie Space in JSON string format. This allows for full replacement of the space configuration including sample queries, instructions, and data sources. Retrieved from Get Genie Space API with include_serialized_space=true parameter. |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
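
The partial-update semantics (only provided fields change) can be sketched by building a payload that drops omitted fields entirely; `build_space_update` is a hypothetical helper for illustration.

```python
def build_space_update(id, title=None, description=None,
                       warehouse_id=None, serialized_space=None):
    # Only explicitly provided fields are included, matching the tool's
    # partial-update semantics; omitted fields are left unchanged
    # server-side. (`id` mirrors the tool's parameter name.)
    payload = {"id": id}
    optional = {
        "title": title,
        "description": description,
        "warehouse_id": warehouse_id,
        "serialized_space": serialized_space,
    }
    payload.update({k: v for k, v in optional.items() if v is not None})
    return payload

update = build_space_update("space-123", title="Sales Genie")
```

Sending only changed fields avoids accidentally clearing a description or warehouse assignment you did not intend to touch.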

### Create Lakeview Dashboard

**Slug:** `DATABRICKS_DASHBOARDS_LAKEVIEW_CREATE`

Tool to create a new Lakeview dashboard in Databricks. Use when you need to create AI/BI dashboards for data visualization and analytics. Both display_name and serialized_dashboard are required. To create a blank dashboard, provide a minimal serialized_dashboard with an empty pages array like '{"pages":[{"name":"page_001","displayName":"Page 1"}]}'.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `parent_path` | string | No | The workspace path of the folder where the dashboard should be stored |
| `display_name` | string | Yes | The name to display for the dashboard |
| `warehouse_id` | string | No | The UUID of the warehouse to use as the default warehouse for the dashboard. Can be omitted to use workspace defaults |
| `serialized_dashboard` | string | Yes | JSON string containing the dashboard configuration and content. Must include at minimum a 'pages' array with at least one page object containing 'name' and 'displayName' fields. Use a minimal structure like '{"pages":[{"name":"page_001","displayName":"Page 1"}]}' to create a blank dashboard |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
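
A minimal sketch of constructing the two required inputs for a blank dashboard; `minimal_dashboard_inputs` is a hypothetical helper, and the `pages` structure mirrors the example given in the `serialized_dashboard` description.

```python
import json

def minimal_dashboard_inputs(display_name, page_name="page_001",
                             page_display_name="Page 1"):
    # serialized_dashboard must be a JSON *string* containing at least
    # one page with 'name' and 'displayName', per the parameter docs.
    serialized = json.dumps(
        {"pages": [{"name": page_name, "displayName": page_display_name}]}
    )
    return {"display_name": display_name, "serialized_dashboard": serialized}

inputs = minimal_dashboard_inputs("Revenue Overview")
```

Building the string with `json.dumps` rather than hand-writing it avoids quoting mistakes in the embedded JSON.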

### Delete Lakeview Dashboard Schedule

**Slug:** `DATABRICKS_DASHBOARDS_LAKEVIEW_DELETE_SCHEDULE`

Tool to delete a dashboard schedule from a Lakeview dashboard. Use when you need to remove scheduled refreshes or updates for a dashboard. Provide the etag parameter to ensure the schedule hasn't been modified since last retrieval (optimistic concurrency control).

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `etag` | string | No | The etag for the schedule. Can be provided to verify that the schedule has not been modified from its last retrieval. Used for optimistic concurrency control |
| `schedule_id` | string | Yes | UUID identifying the schedule to delete. This is a 32-digit alphanumeric value |
| `dashboard_id` | string | Yes | UUID identifying the dashboard to which the schedule belongs. This is a 32-digit alphanumeric value |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get Published Dashboard Token Info

**Slug:** `DATABRICKS_DASHBOARDS_LAKEVIEW_EMBEDDED_GET_PUBLISHED_DASH`

Retrieves the authorization information needed to generate a downscoped OAuth token for embedding a published Lakeview dashboard for external viewers. The endpoint returns the custom claims, scopes, and authorization details required to mint a secure, limited-access token that lets external users view a specific published dashboard. The returned information ensures the token carries only the minimal necessary permissions and supports row-level security through custom claims. Use this when implementing embedded dashboard experiences for external users who should not have direct access to your Databricks workspace. The dashboard must first be published with embed_credentials enabled.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `dashboard_id` | string | Yes | The UUID identifier of the published Lakeview dashboard. The dashboard must be published with embed_credentials enabled to use this endpoint |
| `external_value` | string | Yes | A value that can be used for row-level data filtering in dashboard queries. This value will be embedded in the custom claim and can contain PII. Commonly used for email addresses or department identifiers |
| `external_viewer_id` | string | Yes | A unique, non-PII identifier for the external viewer. This is used to track the viewer and generate the custom claim. Should not contain personally identifiable information like email addresses or real names |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
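
Since `external_viewer_id` must not contain PII while `external_value` may, a request builder can guard against the most common mistake of passing an email address as the viewer ID. `build_token_info_request` is a hypothetical helper sketching this check.

```python
def build_token_info_request(dashboard_id, external_viewer_id, external_value):
    # Guard against the most common PII mistake: an email address in
    # external_viewer_id (the docs require a non-PII identifier there;
    # external_value is the field allowed to carry PII).
    if "@" in external_viewer_id:
        raise ValueError("external_viewer_id must not be an email address")
    return {
        "dashboard_id": dashboard_id,
        "external_viewer_id": external_viewer_id,
        "external_value": external_value,
    }

req = build_token_info_request("abc123", "viewer-42", "alice@example.com")
```

Here the email lands in `external_value` (used for row-level filtering), while `viewer-42` serves as the opaque, trackable viewer identity.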

### Get Lakeview Dashboard Details

**Slug:** `DATABRICKS_DASHBOARDS_LAKEVIEW_GET`

Tool to retrieve details about a draft AI/BI Lakeview dashboard from the workspace. Use when you need to get comprehensive information about a dashboard including metadata, configuration, state, and serialized dashboard content.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `dashboard_id` | string | Yes | The UUID identifier for the dashboard. This is a 32-digit alphanumeric value |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get Published Lakeview Dashboard

**Slug:** `DATABRICKS_DASHBOARDS_LAKEVIEW_GET_PUBLISHED`

Tool to retrieve the current published version of a Lakeview dashboard. Use when you need to get information about the published dashboard including its display name, embedded credentials status, warehouse configuration, and last revision timestamp.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `dashboard_id` | string | Yes | The UUID identifier for the published dashboard. This is a 32-digit alphanumeric value that uniquely identifies the dashboard |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get Lakeview Dashboard Schedule

**Slug:** `DATABRICKS_DASHBOARDS_LAKEVIEW_GET_SCHEDULE`

Tool to retrieve a specific schedule for a Databricks AI/BI Lakeview dashboard. Use when you need to get schedule details including cron expressions, pause status, warehouse configuration, and subscription information. Each dashboard can have up to 10 schedules, with each schedule supporting up to 100 subscriptions.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `schedule_id` | string | Yes | UUID identifying the schedule to retrieve. This is a 32-digit alphanumeric value |
| `dashboard_id` | string | Yes | UUID identifying the dashboard to which the schedule belongs. This is a 32-digit alphanumeric value |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Publish Lakeview Dashboard

**Slug:** `DATABRICKS_DASHBOARDS_LAKEVIEW_PUBLISH`

Tool to publish an AI/BI Lakeview dashboard, making it accessible via a public link. Use when you need to publish a draft dashboard with embedded credentials and assign a warehouse for query execution. After successful publication, the dashboard becomes accessible at `https://<deployment-url>/dashboardsv3/<resource_id>/published`.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `dashboard_id` | string | Yes | The UUID identifier of the dashboard to publish. This is a 32-digit alphanumeric value |
| `warehouse_id` | string | No | The UUID of the warehouse to be used for running the published dashboard. This overrides any warehouse specified in the draft dashboard. If omitted, the published dashboard uses the same warehouse assigned to the draft dashboard |
| `embed_credentials` | boolean | No | Flag to indicate if the publisher's credentials should be embedded in the published dashboard. When true, the publisher's credentials are embedded and will be used to execute the published dashboard's queries. The publisher cannot embed different user's credentials. Defaults to true |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Trash Lakeview Dashboard

**Slug:** `DATABRICKS_DASHBOARDS_LAKEVIEW_TRASH`

Tool to move a Lakeview dashboard to trash instead of permanently deleting it. Use when you need to remove a dashboard while retaining recovery options. Trashed dashboards can be recovered within 30 days before permanent deletion.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `dashboard_id` | string | Yes | The UUID identifier for the dashboard to be trashed. This is a 32-digit alphanumeric value |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Unpublish Lakeview Dashboard

**Slug:** `DATABRICKS_DASHBOARDS_LAKEVIEW_UNPUBLISH`

Tool to unpublish an AI/BI Lakeview dashboard while preserving its draft version. Use when you need to remove the published version of a dashboard. The draft version remains available and can be republished later if needed.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `dashboard_id` | string | Yes | The UUID identifier of the dashboard to unpublish. This is a 32-digit alphanumeric value |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Update Lakeview Dashboard

**Slug:** `DATABRICKS_DASHBOARDS_LAKEVIEW_UPDATE`

Tool to update a draft Lakeview dashboard configuration and metadata. Use when you need to modify dashboard properties such as display name, warehouse, location, or content. This is a partial update operation - only provided fields will be updated. The etag field can be used for optimistic concurrency control to prevent conflicts from concurrent modifications.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `etag` | string | No | The etag for the dashboard. Used for optimistic concurrency control. If the version specified in the etag does not match the current version, the update is rejected |
| `parent_path` | string | No | The workspace path of the folder containing the dashboard (with leading slash). When provided, moves the dashboard to a new location |
| `dashboard_id` | string | Yes | UUID identifying the dashboard to update. This is required to identify which dashboard to modify |
| `display_name` | string | No | The display name of the dashboard. When provided, updates the dashboard name |
| `warehouse_id` | string | No | The warehouse ID used to run the dashboard. When provided, updates the default warehouse |
| `dataset_schema` | string | No | Sets the default schema for all datasets in this dashboard. When provided, updates the default schema |
| `dataset_catalog` | string | No | Sets the default catalog for all datasets in this dashboard. When provided, updates the default catalog |
| `serialized_dashboard` | string | No | The contents of the dashboard in serialized string form. This field provides the structure of the JSON string that represents the dashboard's layout and components |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Register Lakebase Database as Catalog

**Slug:** `DATABRICKS_DATABASE_DATABASE_CREATE_DATABASE_CATALOG`

Tool to register a Lakebase Postgres database as a Unity Catalog catalog in Databricks. Use when you need to register a Lakebase database instance as a catalog in Unity Catalog for data governance and analytics. Requires CREATE CATALOG privilege on the metastore and access to the database instance.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `name` | string | Yes | The name of the Unity Catalog catalog to create. Must be unique within the metastore |
| `comment` | string | No | Optional description for the catalog |
| `database_name` | string | Yes | The name of the Postgres database within the instance to register (default is 'databricks_postgres') |
| `database_instance_name` | string | Yes | The name of the Lakebase database instance to register |
| `create_database_if_not_exists` | boolean | No | Whether to create the Postgres database if it doesn't exist in the instance |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Create Database Instance

**Slug:** `DATABRICKS_DATABASE_DATABASE_CREATE_DATABASE_INSTANCE`

Creates a new Lakebase provisioned database instance in Databricks. Lakebase is Databricks' serverless PostgreSQL-compatible database service for OLTP workloads. Use this action to provision a new database instance with the specified compute capacity and configuration. The creator automatically receives database owner privileges with the databricks_superuser role, granting full administrative capabilities on the instance, including schema management, user provisioning, and data operations. The instance starts in the STARTING state during provisioning and transitions to AVAILABLE when ready for connections; use the returned read_write_dns endpoint to connect PostgreSQL clients and applications. Note: requires available credits/billing setup in your Databricks workspace, and instance creation may take several minutes to complete.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `name` | string | Yes | Unique database instance identifier. Must be 1-63 characters long, containing only lowercase letters (a-z) and hyphens (-). Cannot start or end with a hyphen |
| `capacity` | string ("CU_1" \| "CU_2" \| "CU_4" \| "CU_8") | No | Compute capacity units (CUs) allocated to the instance. Each CU provides approximately 16GB RAM with associated CPU and local SSD resources. CU_1 for minimal/testing workloads, CU_2 for light production workloads, CU_4 for moderate workloads, CU_8 for intensive workloads |
| `custom_tags` | array | No | Optional list of custom key-value tag pairs to organize and categorize the instance for billing, access control, or organizational purposes |
| `usage_policy_id` | string | No | Optional usage policy identifier to govern and restrict instance usage patterns and access controls |
| `enable_pg_native_login` | boolean | No | Whether to enable native PostgreSQL password-based authentication. When enabled, allows direct PostgreSQL client connections using username/password credentials. Defaults to false (OAuth-based authentication only) |
| `retention_window_in_days` | integer | No | Point-in-time recovery (PITR) window in days. Determines how far back in time you can restore the database. Valid range: 2-35 days. If not specified, defaults to 7 days |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
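The `name` constraints above (1-63 characters, lowercase letters and hyphens only, no leading or trailing hyphen) and the 2-35 day retention range can be validated client-side before invoking the tool. A minimal sketch in Python; the helper and its defaults are illustrative, not part of the API:

```python
import re

# Constraints from the `name` parameter: 1-63 chars, lowercase letters (a-z)
# and hyphens (-) only, no leading or trailing hyphen.
NAME_RE = re.compile(r"^[a-z](?:[a-z-]{0,61}[a-z])?$")

def create_instance_payload(name: str, capacity: str = "CU_1",
                            retention_days: int = 7) -> dict:
    """Build a request body for DATABRICKS_DATABASE_DATABASE_CREATE_DATABASE_INSTANCE."""
    if not NAME_RE.fullmatch(name):
        raise ValueError(f"invalid instance name: {name!r}")
    if not 2 <= retention_days <= 35:
        raise ValueError("retention_window_in_days must be in 2-35")
    return {
        "name": name,
        "capacity": capacity,
        "retention_window_in_days": retention_days,
    }
```

Validating locally avoids a round trip that would fail server-side with an invalid-parameter error.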

### Delete Database Instance

**Slug:** `DATABRICKS_DATABASE_DATABASE_DELETE_DATABASE_INSTANCE`

Delete a Databricks Lakebase Postgres database instance. Permanently removes the database instance and all associated data. Prerequisites: The instance should be stopped before deletion, and you must have CAN_MANAGE permissions on the instance. Warning: This operation cannot be undone. All data will be permanently deleted. Consider deleting associated Unity Catalog catalogs and synced tables first to avoid orphaned references.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `id` | string | Yes | The name of the database instance to delete. Must be 1-63 characters, containing only lowercase letters, numbers, and hyphens |
| `force` | boolean | No | When set to true, deletes child instances along with the parent instance. Use this to cascade deletion to descendant instances created via PITR (Point-in-Time Recovery). Default is false |
| `purge` | boolean | Yes | Safety confirmation parameter that must be set to true to delete the database instance. This confirms intentional deletion and is required by the API |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Delete Synced Database Table

**Slug:** `DATABRICKS_DATABASE_DATABASE_DELETE_SYNCED_DATABASE_TABLE`

Tool to delete a synced table from Unity Catalog and stop data refreshes. Use when you need to deregister a synced table connection between Unity Catalog and a database instance. Note: The underlying Postgres table remains and must be manually dropped to free space.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `synced_table_name` | string | Yes | Full three-part name (catalog.schema.table) of the synced table to delete from Unity Catalog. Deleting stops data refreshes and deregisters the table, but the underlying Postgres table must be manually dropped to free space |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Find Database Instance By UID

**Slug:** `DATABRICKS_DATABASE_DATABASE_FIND_DATABASE_INSTANCE_BY_UID`

Tool to find a database instance by its unique identifier (UID). Use when you need to retrieve instance details using the immutable UUID instead of the instance name.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `uid` | string | Yes | UID of the database instance to retrieve. An immutable UUID identifier for the instance |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Generate Database Credential

**Slug:** `DATABRICKS_DATABASE_DATABASE_GENERATE_DATABASE_CRED`

Tool to generate an OAuth token for database instance authentication. Use when you need to authenticate to Databricks database instances. The generated token is workspace-scoped and expires after one hour, though open connections remain active past expiration.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `claims` | string | No | Additional claims for the credential request. Added in SDK v0.58.0 for extending token capabilities |
| `request_id` | string | Yes | A unique request identifier, typically generated as a UUID. This identifier helps track the credential generation request |
| `instance_names` | array | Yes | List of database instance names for which credentials should be generated. These are the names of database instances that will be accessible with the generated token |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
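Since `request_id` is a client-generated UUID, building the request body can be sketched as below; the helper name is illustrative, not part of the tool:

```python
import uuid

def credential_request(instance_names: list[str]) -> dict:
    """Build a request body for DATABRICKS_DATABASE_DATABASE_GENERATE_DATABASE_CRED.

    request_id is a client-generated UUID used to trace the credential
    request; the returned token is workspace-scoped and expires after
    one hour.
    """
    if not instance_names:
        raise ValueError("at least one instance name is required")
    return {
        "request_id": str(uuid.uuid4()),
        "instance_names": list(instance_names),
    }
```

Generating a fresh UUID per call keeps each credential request independently traceable.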

### Get Database Instance

**Slug:** `DATABRICKS_DATABASE_DATABASE_GET_DATABASE_INSTANCE`

Retrieves detailed information about a Databricks Lakebase (managed PostgreSQL) database instance by name. Returns comprehensive instance details, including:

- Current state (AVAILABLE, STARTING, STOPPED, UPDATING, DELETING, FAILING_OVER)
- Capacity/SKU configuration (CU_1, CU_2, CU_4, CU_8)
- Connection endpoints (read-write DNS, read-only DNS)
- PostgreSQL version and node configuration
- Retention window and backup settings
- High availability settings (readable secondaries)
- Custom tags and usage policies
- Parent/child instance relationships

Use this when you need to check instance status, retrieve connection information, or verify configuration settings.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `name` | string | Yes | The name of the database instance to retrieve. This is the unique identifier for the instance (1-63 characters, letters and hyphens only) |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Create Data Quality Monitor

**Slug:** `DATABRICKS_DATAQUALITY_DATA_QUALITY_CREATE_MONITOR`

Tool to create a data quality monitor for a Unity Catalog Delta table. Use when you need to set up monitoring for table quality, track data drift, or monitor ML model inference logs. Supports snapshot, time series, and inference log monitoring types. Only one monitor can be created per table.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `schedule` | object | No | Schedule configuration for auto-refresh. |
| `snapshot` | object | No | Snapshot table monitoring configuration. Use empty object for snapshot monitoring. |
| `assets_dir` | string | Yes | Absolute path to directory for storing monitoring assets (e.g., /Workspace/Users/{user}/databricks_lakehouse_monitoring/{catalog}.{schema}.{table_name}) |
| `table_name` | string | Yes | Fully qualified Unity Catalog table name in format {catalog}.{schema}.{table_name}. Must be a managed or external Delta table. |
| `time_series` | object | No | Time series monitoring configuration. |
| `warehouse_id` | string | No | Warehouse ID for dashboard creation |
| `inference_log` | object | No | Inference log monitoring configuration for ML models. |
| `notifications` | object | No | Notification settings for quality monitor. |
| `slicing_exprs` | array | No | Column expressions for data slicing |
| `custom_metrics` | array | No | Custom metric definitions |
| `output_schema_name` | string | Yes | Schema for output metric tables in format {catalog}.{schema} |
| `baseline_table_name` | string | No | Baseline table for drift metrics. Must match monitored table schema. |
| `skip_builtin_dashboard` | boolean | No | Skip default dashboard creation |
| `data_classification_config` | object | No | Data classification configuration. |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
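For the common snapshot case, a minimal request body combines the required `table_name`, `output_schema_name`, and `assets_dir` with an empty `snapshot` object; presumably exactly one of `snapshot` / `time_series` / `inference_log` should be set. A sketch, with the helper name and assets-dir layout taken from the parameter descriptions above:

```python
def snapshot_monitor_config(table_name: str, output_schema_name: str,
                            user: str) -> dict:
    """Build a minimal snapshot-monitor request body.

    An empty `snapshot` object selects snapshot monitoring (per the
    `snapshot` parameter description above).
    """
    if table_name.count(".") != 2:
        raise ValueError("table_name must be {catalog}.{schema}.{table_name}")
    return {
        "table_name": table_name,
        "output_schema_name": output_schema_name,
        "assets_dir": (f"/Workspace/Users/{user}/"
                       f"databricks_lakehouse_monitoring/{table_name}"),
        "snapshot": {},
    }
```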

### List DBFS Directory Contents

**Slug:** `DATABRICKS_DBFS_LIST`

Tool to list the contents of a directory or get details of a file in DBFS. Use when you need to browse DBFS directories or check file details. Note: recommended for directories with fewer than 10,000 files due to the ~60 second timeout limitation. Throws a RESOURCE_DOES_NOT_EXIST error if the path doesn't exist.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `path` | string | Yes | The absolute DBFS path of the file or directory to list. Must be an absolute path starting with '/' |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Delete Databricks Cluster

**Slug:** `DATABRICKS_DELETE_CLUSTER`

Tool to terminate a Databricks Spark cluster. Use when you need to stop and remove a cluster. Termination is asynchronous, and the cluster is removed once it completes. Cluster configuration is retained for 30 days after termination.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `cluster_id` | string | Yes | The unique identifier of the cluster to terminate. This can be found in the Databricks workspace URL when viewing a cluster configuration |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Edit Databricks Cluster

**Slug:** `DATABRICKS_EDIT_CLUSTER`

Tool to edit an existing Databricks cluster configuration. Use when you need to modify cluster settings such as size, Spark version, node types, or cloud-specific attributes. The cluster must be in RUNNING or TERMINATED state. If updated while RUNNING, it will restart to apply changes.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `autoscale` | object | No | Autoscaling configuration for cluster workers. |
| `policy_id` | string | No | Cluster policy ID to validate configuration and preset defaults |
| `cluster_id` | string | Yes | Unique identifier of the cluster to edit. The cluster must be in RUNNING or TERMINATED state |
| `spark_conf` | object | No | Custom Spark configuration key-value pairs |
| `custom_tags` | object | No | User-defined resource tags (max 45 tags). Serverless clusters should have ResourceClass=Serverless tag |
| `num_workers` | integer | No | Number of worker nodes. Total Spark nodes = num_workers + 1 (driver). Use 0 for single-node clusters. Required unless using autoscale |
| `cluster_name` | string | No | User-defined name for the cluster; does not need to be unique |
| `docker_image` | object | No | Docker container configuration. |
| `init_scripts` | array | No | Initialization scripts executed sequentially at cluster startup |
| `node_type_id` | string | No | Node type for worker nodes (e.g., 'i3.xlarge' for AWS, 'Standard_DS3_v2' for Azure, 'n2-highmem-4' for GCP). Not needed if instance_pool_id is specified |
| `spark_version` | string | Yes | Runtime version of the cluster (e.g., '11.3.x-scala2.12', '12.2.x-scala2.12', '13.x-scala2.12') |
| `workload_type` | object | No | Workload type restrictions for cluster usage. |
| `aws_attributes` | object | No | AWS-specific cluster configuration. |
| `gcp_attributes` | object | No | GCP-specific cluster configuration. |
| `is_single_node` | boolean | No | When true, automatically configures single-node settings (sets num_workers to 0) |
| `runtime_engine` | string ("STANDARD" \| "PHOTON") | No | Runtime engine type. Defaults to STANDARD unless spark_version contains '-photon-' |
| `spark_env_vars` | object | No | Environment variables for Spark clusters |
| `ssh_public_keys` | array | No | SSH public keys to add to each Spark node (max 10 keys). Login with username 'ubuntu' on port 2200 |
| `azure_attributes` | object | No | Azure-specific cluster configuration. |
| `cluster_log_conf` | object | No | Cluster log delivery configuration. |
| `instance_pool_id` | string | No | Instance pool identifier for worker nodes (alternative to node_type_id) |
| `single_user_name` | string | No | Username to assign to an interactive cluster (required for SINGLE_USER mode) |
| `idempotency_token` | string | No | Token to guarantee idempotency (max 64 characters) |
| `data_security_mode` | string ("NONE" \| "SINGLE_USER" \| "USER_ISOLATION" \| "LEGACY_TABLE_ACL" \| "LEGACY_PASSTHROUGH" \| "LEGACY_SINGLE_USER") | No | Security features of the cluster. Unity Catalog requires SINGLE_USER or USER_ISOLATION mode |
| `cluster_mount_infos` | array | No | Remote storage mount configurations |
| `driver_node_type_id` | string | No | Node type for the Spark driver. Defaults to node_type_id if not specified |
| `enable_elastic_disk` | boolean | No | Enables autoscaling local storage instead of fixed EBS volumes |
| `autotermination_minutes` | integer | No | Automatically terminate after inactivity. Valid range: 10-10000 minutes |
| `driver_instance_pool_id` | string | No | Instance pool identifier for driver node |
| `apply_policy_default_values` | boolean | No | Whether to apply policy default values |
| `enable_local_disk_encryption` | boolean | No | Enables encryption for locally attached disk data |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
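Because an edit replaces the cluster configuration rather than merging fields, a safe pattern is to start from the current config (e.g. from DATABRICKS_GET_CLUSTER) and change only what you need. A sketch for switching a fixed-size cluster to autoscaling, assuming the helper name and the read-modify-write pattern (the mutual exclusion of `num_workers` and `autoscale` follows from the `num_workers` description above):

```python
def autoscale_edit(current: dict, min_workers: int, max_workers: int) -> dict:
    """Build an edit-cluster body that switches a cluster to autoscaling.

    Start from the current cluster configuration so required fields such as
    cluster_id and spark_version are carried over unchanged.
    """
    if min_workers > max_workers:
        raise ValueError("min_workers must not exceed max_workers")
    body = dict(current)
    body.pop("num_workers", None)  # num_workers is only used without autoscale
    body["autoscale"] = {"min_workers": min_workers, "max_workers": max_workers}
    return body
```

Remember that a RUNNING cluster restarts to apply the new configuration.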

### Add Block to DBFS Stream

**Slug:** `DATABRICKS_FILES_DBFS_ADD_BLOCK`

Tool to append a block of data to an open DBFS stream. Use when uploading large files in chunks as part of the DBFS streaming upload workflow: 1) create a stream handle, 2) add blocks, 3) close the stream. Each block is limited to 1 MB of base64-encoded data.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | The base64-encoded data to append to the stream. This has a limit of 1 MB. If the block of data exceeds 1 MB, throws MAX_BLOCK_SIZE_EXCEEDED exception |
| `handle` | integer | Yes | The handle on an open stream. Must reference an existing open stream obtained from the create endpoint. If the handle does not exist, throws RESOURCE_DOES_NOT_EXIST exception |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Create DBFS File Stream

**Slug:** `DATABRICKS_FILES_DBFS_CREATE`

Tool to open a write stream to a DBFS file and return a handle. Use when uploading files to DBFS using the streaming workflow: 1) create a stream handle, 2) add blocks of data, 3) close the stream. The returned handle has a 10-minute idle timeout and must be used within that period.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `path` | string | Yes | The path of the new file. The path should be the absolute DBFS path (e.g., /path/to/file.txt). If a file or directory already exists on the given path and overwrite is set to false, throws RESOURCE_ALREADY_EXISTS exception |
| `overwrite` | boolean | No | Flag that specifies whether to overwrite an existing file. If a file already exists and overwrite is false, throws RESOURCE_ALREADY_EXISTS exception. Defaults to false |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
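The three-step streaming workflow (create a handle, add base64 blocks of at most 1 MB, close) can be sketched as below. The `post` argument is a hypothetical HTTP helper that POSTs to the workspace's DBFS REST endpoints with authentication attached; raw chunks are kept at 768 KB so each base64-encoded block (which grows by 4/3) lands at exactly the 1 MB limit:

```python
import base64

RAW_CHUNK = 768 * 1024  # base64 expands 4/3, so 768 KB raw encodes to exactly 1 MB

def encode_blocks(data: bytes, chunk_size: int = RAW_CHUNK):
    """Yield base64-encoded blocks; each is independently decoded server-side."""
    for i in range(0, len(data), chunk_size):
        yield base64.b64encode(data[i:i + chunk_size]).decode("ascii")

def upload(post, path: str, data: bytes, overwrite: bool = False) -> None:
    """Create-stream / add-block / close workflow against the DBFS API.

    `post(endpoint, json_body) -> dict` is any HTTP helper for the workspace
    REST API (hypothetical, not part of the tool itself).
    """
    handle = post("/api/2.0/dbfs/create",
                  {"path": path, "overwrite": overwrite})["handle"]
    try:
        for block in encode_blocks(data):
            post("/api/2.0/dbfs/add-block", {"handle": handle, "data": block})
    finally:
        # Close even on failure so the handle doesn't linger until its
        # 10-minute idle timeout.
        post("/api/2.0/dbfs/close", {"handle": handle})
```

Closing in a `finally` block keeps a failed upload from leaving the stream open.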

### Delete DBFS File or Directory

**Slug:** `DATABRICKS_FILES_DBFS_DELETE`

Tool to delete a file or directory from DBFS. Use when you need to remove files or directories from the Databricks File System. For large deletions (>10K files), use dbutils.fs in a cluster context instead of the REST API. Operation may return 503 PARTIAL_DELETE for large deletions and should be re-invoked until completion.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `path` | string | Yes | The absolute DBFS path of the file or directory to delete |
| `recursive` | boolean | No | Whether to recursively delete the directory's contents. Deleting empty directories can be done without this flag. Defaults to false. Throws IO_ERROR if attempting to delete a non-empty directory without this flag enabled |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
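The re-invoke-until-completion behavior for large deletions can be sketched as a simple retry loop. The `post` helper and error-payload shape are assumptions for illustration; the 503/PARTIAL_DELETE condition comes from the description above:

```python
def delete_until_complete(post, path: str, recursive: bool = False,
                          max_attempts: int = 20) -> None:
    """Re-invoke DBFS delete until the partial-delete condition clears.

    `post(endpoint, body) -> (status, payload)` is a hypothetical HTTP
    helper returning the status code and decoded JSON body.
    """
    for _ in range(max_attempts):
        status, payload = post("/api/2.0/dbfs/delete",
                               {"path": path, "recursive": recursive})
        if status == 503 and payload.get("error_code") == "PARTIAL_DELETE":
            continue  # some files were deleted; call again for the rest
        if status != 200:
            raise RuntimeError(f"delete failed: {status} {payload}")
        return
    raise RuntimeError(f"still partially deleted after {max_attempts} attempts")
```

For very large trees (>10K files), prefer `dbutils.fs.rm` in a cluster context as the description recommends.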

### Get DBFS File Status

**Slug:** `DATABRICKS_FILES_DBFS_GET_STATUS`

Tool to get the information of a file or directory in DBFS. Use when you need to check if a file or directory exists, retrieve its size, type, or last modification time. Throws RESOURCE_DOES_NOT_EXIST exception if the file or directory does not exist.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `path` | string | Yes | The absolute DBFS path of the file or directory. The path should be the absolute DBFS path (e.g., '/FileStore/wheels/test_packages') |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Move DBFS File or Directory

**Slug:** `DATABRICKS_FILES_DBFS_MOVE`

Tool to move a file or directory from one location to another within DBFS. Use when you need to relocate files or directories in Databricks File System. Recursively moves all files if source is a directory. Not recommended for large-scale operations (>10k files) as it may timeout after ~60 seconds.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `source_path` | string | Yes | The source path of the file or directory to move. Must be an absolute DBFS path (e.g., /path/to/source/file). Throws RESOURCE_DOES_NOT_EXIST if the source path does not exist |
| `destination_path` | string | Yes | The destination path where the file or directory should be moved. Must be an absolute DBFS path (e.g., /path/to/destination/file). Throws RESOURCE_ALREADY_EXISTS if a file or directory already exists at the destination path |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Read DBFS File Contents

**Slug:** `DATABRICKS_FILES_DBFS_READ`

Tool to read the contents of a file from DBFS. Returns base64-encoded file data with maximum read size of 1 MB per request. Use when you need to retrieve file contents from Databricks File System. Throws RESOURCE_DOES_NOT_EXIST if file does not exist, INVALID_PARAMETER_VALUE if path is a directory, MAX_READ_SIZE_EXCEEDED if read length exceeds 1 MB.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `path` | string | Yes | The absolute DBFS path of the file to read. Must be a valid file path (not a directory). Throws RESOURCE_DOES_NOT_EXIST if file does not exist, INVALID_PARAMETER_VALUE if path is a directory |
| `length` | integer | No | The number of bytes to read starting from the offset. Maximum limit: 1 MB (1,048,576 bytes). Default: 0.5 MB (524,288 bytes). Must be non-negative. Throws MAX_READ_SIZE_EXCEEDED if read length exceeds 1 MB |
| `offset` | integer | No | The offset to read from in bytes. Default is 0. Must be non-negative |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
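Because each read returns at most 1 MB of base64-encoded data, whole files are read by advancing `offset` until the API reports zero bytes read. A sketch; the `call` helper and its `{"bytes_read", "data"}` response shape are assumptions based on the parameter descriptions above:

```python
import base64

MAX_READ = 1024 * 1024  # hard cap per read request

def read_dbfs_file(call, path: str, chunk: int = MAX_READ) -> bytes:
    """Read a whole DBFS file in chunks of at most 1 MB.

    `call(body) -> dict` is a hypothetical helper for the read endpoint,
    returning {"bytes_read": <int>, "data": <base64 string>}.
    """
    out, offset = bytearray(), 0
    while True:
        resp = call({"path": path, "offset": offset, "length": chunk})
        if resp["bytes_read"] == 0:
            return bytes(out)  # reached end of file
        out += base64.b64decode(resp["data"])
        offset += resp["bytes_read"]
```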

### Get All Library Statuses

**Slug:** `DATABRICKS_GET_ALL_LIBRARY_STATUSES`

Tool to retrieve status of all libraries across all Databricks clusters. Use when you need to check library installation status on all clusters, including libraries set to be installed on all clusters via the API or libraries UI. Returns detailed status information for each library on each cluster.

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get Cluster Information

**Slug:** `DATABRICKS_GET_CLUSTER`

Tool to retrieve comprehensive metadata and configuration details for a Databricks cluster by its unique identifier. Use when you need to check cluster state, configuration, resources, or operational details. Returns cluster information including state, compute configuration, cloud-specific settings, and resource allocations.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `cluster_id` | string | Yes | The unique identifier of the cluster to retrieve information about |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get User by ID

**Slug:** `DATABRICKS_GET_USER_BY_ID`

Tool to retrieve information for a specific user in Databricks workspace by their ID. Use when you need to get complete user details including identity, contact information, group memberships, roles, and entitlements. Implements the SCIM 2.0 protocol standard for retrieving User resources.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `user_id` | string | Yes | The unique Databricks user identifier. This is the SCIM user ID that can be obtained from user listing or creation operations. |
| `attributes` | string | No | Comma-separated list of attribute names to return, overriding the default set of attributes returned. |
| `excludedAttributes` | string | No | Comma-separated list of attribute names to exclude from the response. |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Update IAM Account Access Control Rule Set

**Slug:** `DATABRICKS_IAM_ACCOUNT_ACCESS_CONTROL_UPDATE_RULE_SET`

Tool to update account-level access control rule set for service principals, groups, or budget policies. Use when you need to replace the entire set of access control rules for a resource. This is a PUT operation that replaces all existing roles - to preserve existing roles, they must be included in the grant_rules array.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `name` | string | Yes | Unique identifier of the rule set in format 'accounts/{account_id}/servicePrincipals/{application_id}/ruleSets/default' or 'accounts/{account_id}/groups/{group_id}/ruleSets/default' or 'accounts/{account_id}/budgetPolicies/{budget_policy_id}/ruleSets/default' |
| `rule_set` | object | Yes | Container object for the access control rules. This is a PUT method that replaces the entire rule set - all existing roles are overwritten. To preserve existing roles, they must be included in the grant_rules array |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
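Since the PUT replaces the entire rule set, the safe pattern is read-modify-write: fetch the current rule set, merge the new grant into `grant_rules`, and send the whole result back. A sketch of the merge step; the helper name and the principal/role string formats used in the test are illustrative:

```python
def with_role(rule_set: dict, principal: str, role: str) -> dict:
    """Return a copy of rule_set with `principal` granted `role`.

    Preserves every existing grant rule, because any rule omitted from the
    PUT body would be revoked by the full replacement.
    """
    rules = [{**r, "principals": list(r.get("principals", []))}
             for r in rule_set.get("grant_rules", [])]
    for rule in rules:
        if rule.get("role") == role:
            if principal not in rule["principals"]:
                rule["principals"].append(principal)
            break
    else:
        rules.append({"role": role, "principals": [principal]})
    return {**rule_set, "grant_rules": rules}
```

The returned object keeps all other rule-set fields (such as an etag, if present) so it can be sent back as the PUT body.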

### Get IAM Account Group V2

**Slug:** `DATABRICKS_IAM_ACCOUNT_GROUPS_V2_GET`

Tool to retrieve a specific group resource by its unique identifier from a Databricks account using SCIM v2 protocol. Use when you need to get complete group details including members, roles, and entitlements.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `id` | string | Yes | The unique identifier for the account-level group resource |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get Current User Information

**Slug:** `DATABRICKS_IAM_CURRENT_USER_ME`

Tool to retrieve details about the currently authenticated user or service principal making the API request. Use when you need to get information about the current user's identity, groups, roles, and entitlements within the Databricks workspace.

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Create IAM Group V2

**Slug:** `DATABRICKS_IAM_GROUPS_V2_CREATE`

Tool to create a new group in Databricks workspace using SCIM v2 protocol. Use when you need to create a new security group with a unique display name, optionally with initial members, entitlements, and roles.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `roles` | array | No | Roles assigned to the group. Corresponds to AWS instance profile/ARN role. |
| `groups` | array | No | Parent groups this group belongs to. |
| `members` | array | No | Array of member objects to add to the group. Each object contains a 'value' field with the user or group ID. |
| `externalId` | string | No | External identifier from an identity provider for SCIM synchronization. |
| `displayName` | string | Yes | Human-readable group name. Must be unique within the workspace. |
| `entitlements` | array | No | Permissions/entitlements assigned to the group (e.g., workspace-access, allow-cluster-create, databricks-sql-access). |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Delete IAM Group V2

**Slug:** `DATABRICKS_IAM_GROUPS_V2_DELETE`

Tool to delete a group from Databricks workspace using SCIM v2 protocol. Use when you need to permanently remove a security group. Requires appropriate permissions to delete the group.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `id` | string | Yes | Unique identifier of the group to delete. This is the Databricks-generated group ID |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get Workspace IAM Group V2

**Slug:** `DATABRICKS_IAM_GROUPS_V2_GET`

Tool to retrieve details of a specific group by ID from Databricks workspace using SCIM v2 protocol. Use when you need to get complete group information including members, roles, entitlements, and metadata.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `id` | string | Yes | Unique identifier for the group in the Databricks workspace |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Patch IAM Group V2

**Slug:** `DATABRICKS_IAM_GROUPS_V2_PATCH`

Tool to partially update a Databricks workspace group using SCIM 2.0 PATCH operations. Use when you need to modify group attributes like displayName, add/remove members, or update entitlements/roles. All operations in a single request are atomic.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `id` | string | Yes | Unique ID of the group in the Databricks workspace to update |
| `Operations` | array | Yes | Array of one or more PATCH operation objects. All operations in a single PATCH request are atomic - if any operation fails, the entire request is rolled back. |
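
The `Operations` array follows standard SCIM 2.0 PATCH semantics (RFC 7644). As a minimal sketch — the helper function, group name, and member ID below are illustrative placeholders, not part of this tool's schema:

```python
# Hypothetical builder for a SCIM 2.0 PATCH `Operations` array that
# renames a group and adds one member in a single atomic request.
def build_group_patch(new_display_name, member_id_to_add):
    return [
        # Replace the group's display name.
        {"op": "replace", "path": "displayName", "value": new_display_name},
        # Add one member by Databricks ID; `members` takes a list of
        # {"value": <id>} objects.
        {"op": "add", "path": "members", "value": [{"value": member_id_to_add}]},
    ]

operations = build_group_patch("data-engineers", "123456")
```

Because all operations in one request are atomic, a failure in either operation above would roll back both.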

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Update IAM Group V2

**Slug:** `DATABRICKS_IAM_GROUPS_V2_UPDATE`

Tool to update an existing group in Databricks workspace using SCIM v2 protocol. This performs a complete replacement of the group resource. Use when you need to update group properties, members, entitlements, or roles. For partial updates, consider using PATCH instead.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `id` | string | Yes | Unique identifier of the group to update. This is the Databricks-generated group ID |
| `roles` | array | No | Roles assigned to the group. Corresponds to AWS instance profile/ARN role. This replaces the entire role list. |
| `groups` | array | No | Parent groups this group belongs to. This replaces the entire parent group list. |
| `members` | array | No | Array of member objects. This replaces the entire member list. Each object contains a 'value' field with the user or group ID. |
| `externalId` | string | No | External identifier from an identity provider for SCIM synchronization. |
| `displayName` | string | Yes | Human-readable group name for the group. Must be unique within the workspace. |
| `entitlements` | array | No | Permissions/entitlements assigned to the group. This replaces the entire entitlement list. |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get IAM Permissions

**Slug:** `DATABRICKS_IAM_PERMISSIONS_GET`

Tool to retrieve IAM permissions for a Databricks workspace object. Use when you need to check who has access to a specific resource and their permission levels. Returns the access control list (ACL) including user, group, and service principal permissions with inheritance information.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `object_id` | string | Yes | The unique identifier for the specific resource instance. This is typically a 32-character string or a resource_id depending on the object type |
| `object_type` | string | Yes | The category of workspace object. Acceptable values include: alerts, alertsv2, authorization, clusters, cluster-policies, dashboards, dbsql-dashboards, directories, experiments, files, genie, instance-pools, jobs, notebooks, pipelines, queries, registered-models, repos, serving-endpoints, warehouses |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get IAM Permission Levels

**Slug:** `DATABRICKS_IAM_PERMISSIONS_GET_PERMISSION_LEVELS`

Tool to retrieve available permission levels for a Databricks workspace object. Use when you need to understand what permission levels can be assigned to users or groups for a specific object type. Returns permission levels like CAN_READ, CAN_RUN, CAN_EDIT, CAN_MANAGE with their descriptions. Available levels vary by object type.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `object_id` | string | Yes | The unique identifier of the workspace object. Format varies by object type (e.g., numeric for notebooks, UUID-like for warehouses and dashboards). |
| `object_type` | string | Yes | The type of the workspace object for which permission levels are being retrieved. Acceptable values: alerts, authorization, clusters, cluster-policies, dashboards, dbsql-dashboards, directories, experiments, files, genie, instance-pools, jobs, notebooks, pipelines, queries, registered-models, repos, serving-endpoints, warehouses. |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Set IAM Permissions

**Slug:** `DATABRICKS_IAM_PERMISSIONS_SET`

Tool to set IAM permissions for a Databricks workspace object, replacing all existing permissions. Use when you need to configure complete access control for a resource. This operation replaces the entire access control list - existing permissions are overwritten. Admin permissions on the admins group cannot be removed.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `object_id` | string | Yes | The unique identifier for the specific resource instance |
| `object_type` | string | Yes | The type of workspace object. Valid values: alerts, authorization, clusters, cluster-policies, dashboards, dbsql-dashboards, directories, experiments, files, instance-pools, jobs, notebooks, pipelines, queries, registered-models, repos, serving-endpoints, warehouses |
| `access_control_list` | array | Yes | Array of access control entries. This replaces ALL existing permissions on the object. Each entry must specify exactly one of user_name, group_name, or service_principal_name. If empty, deletes all direct permissions on the object |
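
A sketch of the `access_control_list` shape and the exactly-one-principal rule described above. The principal names and permission levels are placeholders; the `validate_entry` helper is illustrative, not part of the tool:

```python
# Illustrative ACL: each entry names exactly one principal
# (user_name, group_name, or service_principal_name) plus a permission_level.
access_control_list = [
    {"user_name": "eng-lead@example.com", "permission_level": "CAN_MANAGE"},
    {"group_name": "data-engineers", "permission_level": "CAN_MANAGE_RUN"},
    {"service_principal_name": "00000000-0000-0000-0000-000000000000",
     "permission_level": "CAN_VIEW"},
]

def validate_entry(entry):
    """Check the exactly-one-principal rule before sending the request."""
    principals = [k for k in ("user_name", "group_name", "service_principal_name")
                  if k in entry]
    return len(principals) == 1 and "permission_level" in entry

assert all(validate_entry(e) for e in access_control_list)
```

Remember that this list replaces all existing permissions on the object, so any principal omitted from it loses direct access.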

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Update IAM Permissions

**Slug:** `DATABRICKS_IAM_PERMISSIONS_UPDATE`

Tool to incrementally update permissions on Databricks workspace objects including dashboards, jobs, clusters, warehouses, notebooks, and more. Use when you need to modify specific permissions without replacing the entire permission set. This PATCH operation updates only the specified permissions, preserving existing permissions not included in the request.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `object_id` | string | Yes | Unique identifier of the workspace object. For dashboards, this is a 32-character resource identifier. For other objects like warehouses, jobs, or clusters, use their respective unique identifiers |
| `object_type` | string | Yes | Type of the workspace object to update permissions for. Examples: 'dashboards', 'jobs', 'clusters', 'notebooks', 'warehouses', 'pipelines', 'serving-endpoints', 'cluster-policies', 'instance-pools', 'registered-models', 'repos', 'experiments', 'sql/alerts', 'sql/dashboards', 'sql/queries', 'sql/warehouses' |
| `access_control_list` | array | Yes | List of access control entries to update. This operation updates permissions incrementally, preserving existing permissions not included in the request. For each entry, exactly one of user_name, group_name, or service_principal_name must be specified |
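
In contrast to the Set operation, an incremental update only needs to list the entries being changed. A minimal illustrative body (group name and level are placeholders):

```python
# Grants one group CAN_RUN on the target object; all other existing
# permission entries on the object are left untouched by this PATCH.
update_body = {
    "access_control_list": [
        {"group_name": "interactive-users", "permission_level": "CAN_RUN"},
    ]
}
```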

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Migrate Permissions

**Slug:** `DATABRICKS_IAM_PERM_MIGRATION_MIGRATE_PERMS`

Tool to migrate ACL permissions from workspace groups to account groups. Use when adopting Unity Catalog and migrating permissions from workspace-level groups to account-level groups. Primarily used by the Unity Catalog Migration (UCX) tool. Supports batch processing with configurable size limits.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `size` | integer | No | Maximum number of permissions to migrate. Allows for batch processing by limiting the number of permissions migrated in a single request. |
| `workspace_id` | integer | Yes | Workspace ID where the permission migration occurs. The identifier of the Databricks workspace containing the permissions to migrate. |
| `to_account_group_name` | string | Yes | Name of the account group permissions migrate to. The target account-level group that will receive the migrated permissions. |
| `from_workspace_group_name` | string | Yes | Name of the workspace group permissions migrate from. The source workspace-level group whose permissions will be migrated. |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Create IAM Service Principal V2

**Slug:** `DATABRICKS_IAM_SERVICE_PRINCIPALS_V2_CREATE`

Tool to create a new service principal in a Databricks workspace using the SCIM v2 protocol. Use when you need to add to a workspace a service principal that already exists at the account level. Required for identity-federated workspaces, where you must specify a valid UUID applicationId.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `roles` | array | No | Roles assigned to the service principal for role-based access control. |
| `active` | boolean | No | Activation status of the service principal. Defaults to true if not specified. |
| `groups` | array | No | Group memberships for the service principal. Used for assigning the service principal to specific groups. |
| `externalId` | string | No | An identifier from an external system for correlation with identity providers. |
| `displayName` | string | No | User-friendly display name for the service principal. May be auto-generated if not provided in some configurations. |
| `entitlements` | array | No | Workspace entitlements and permissions assigned to the service principal (e.g., allow-cluster-create, can-manage, workspace-access, databricks-sql-access). |
| `applicationId` | string | Yes | The unique UUID identifier for the service principal (also known as Client ID). The service principal must already exist in the Databricks account before it can be created in a workspace. Required for identity-federated workspaces. |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Delete IAM Service Principal V2

**Slug:** `DATABRICKS_IAM_SERVICE_PRINCIPALS_V2_DELETE`

Tool to delete a service principal from Databricks workspace using SCIM v2 protocol. Use when you need to permanently remove a service principal and revoke its access to the workspace. The operation is idempotent - subsequent DELETE requests to the same ID will return 404 Not Found.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `id` | string | Yes | Unique identifier for the service principal in the Databricks workspace. This is the service principal's application ID (also referred to as the SCIM ID or canonical unique identifier) |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get IAM Service Principal V2

**Slug:** `DATABRICKS_IAM_SERVICE_PRINCIPALS_V2_GET`

Tool to retrieve details of a specific service principal by ID from Databricks workspace using SCIM v2 protocol. Use when you need to get complete service principal information including groups, roles, entitlements, and metadata.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `id` | string | Yes | Unique identifier for the service principal in the Databricks workspace |
| `attributes` | string | No | Comma-separated list of attribute names to return in the response |
| `excludedAttributes` | string | No | Comma-separated list of attribute names to exclude from the response |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Patch IAM Service Principal V2

**Slug:** `DATABRICKS_IAM_SERVICE_PRINCIPALS_V2_PATCH`

Tool to partially update a service principal using SCIM 2.0 PATCH operations. Use when you need to modify service principal attributes like active status, displayName, groups, entitlements, or roles without replacing the entire resource. All operations in a single request are atomic.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `id` | string | Yes | Unique identifier for the service principal in the Databricks workspace to update |
| `Operations` | array | Yes | Array of one or more PATCH operation objects. All operations in a single PATCH request are atomic - if any operation fails, the entire request is rolled back. Common operations include activating/deactivating service principals, updating display names, managing group memberships, adding/removing entitlements, and managing roles. |
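
For instance, deactivating a service principal without deleting it is a single `replace` on `active`. A sketch of the full SCIM 2.0 PATCH body — the `schemas` URN is standard SCIM; the rest is illustrative:

```python
# SCIM 2.0 PATCH body that flips a service principal to inactive.
patch_body = {
    "schemas": ["urn:ietf:params:scim:api:messages:2.0:PatchOp"],
    "Operations": [
        {"op": "replace", "path": "active", "value": False},
    ],
}
```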

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Update IAM Service Principal V2

**Slug:** `DATABRICKS_IAM_SERVICE_PRINCIPALS_V2_UPDATE`

Tool to update an existing service principal in Databricks workspace using SCIM v2 protocol. This performs a complete replacement of the service principal resource (PUT operation). Use when you need to update service principal properties, group memberships, entitlements, or roles. Note: applicationId and id are immutable fields.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `id` | string | Yes | The unique identifier of the service principal to update. This is the Databricks workspace-level ID, not the applicationId. |
| `roles` | array | No | Roles assigned to the service principal for role-based access control. This replaces the entire role list. |
| `active` | boolean | No | Activation status of the service principal. Set to false to deactivate without deleting. |
| `groups` | array | No | Group memberships for the service principal. This replaces the entire group membership list. |
| `externalId` | string | No | External identifier from an identity provider for SCIM synchronization. |
| `displayName` | string | No | User-friendly display name for the service principal. IMPORTANT: In identity-federated workspaces (Azure AD, AWS IAM, etc.), this field is read-only and managed by the identity provider. Changes to this field will be ignored by the API. For non-federated workspaces, you can update this field. |
| `entitlements` | array | No | Workspace entitlements and permissions assigned to the service principal. This replaces the entire entitlement list. |
| `applicationId` | string | Yes | The unique UUID identifier for the service principal (also known as Client ID). This field is immutable and cannot be changed after creation. |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Create IAM User V2

**Slug:** `DATABRICKS_IAM_USERS_V2_CREATE`

Tool to create a new user in Databricks workspace using SCIM v2 protocol. Use when you need to provision a new user account with a unique userName (email), optionally with display name, activation status, group memberships, entitlements, and roles.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `name` | object | No | SCIM name object for user names. |
| `roles` | array | No | Array of roles to assign to the user |
| `active` | boolean | No | Account activation status (default: true) |
| `emails` | array | No | Array of email objects for the user |
| `groups` | array | No | Array of group IDs to assign the user to |
| `userName` | string | Yes | Unique identifier for the user account, typically the user's email address |
| `externalId` | string | No | External system reference identifier (reserved for future use) |
| `displayName` | string | No | String representing concatenation of given and family names |
| `entitlements` | array | No | Array of entitlements to assign to the user |
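
Pulling the parameters above together, a sketch of a SCIM user-creation payload — the email, names, and entitlement value are placeholders:

```python
# Illustrative SCIM v2 user payload. `userName` is the unique account
# identifier and is typically the user's email address.
new_user = {
    "userName": "jane.doe@example.com",
    "displayName": "Jane Doe",
    "name": {"givenName": "Jane", "familyName": "Doe"},
    "emails": [{"value": "jane.doe@example.com", "primary": True}],
    "active": True,  # defaults to true if omitted
    "entitlements": [{"value": "workspace-access"}],
}
```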

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Delete IAM User V2

**Slug:** `DATABRICKS_IAM_USERS_V2_DELETE`

Tool to delete a user from a Databricks workspace using the SCIM v2 protocol. Use when you need to deactivate a user and revoke their access to the workspace. Users are automatically purged 30 days after deletion if they do not own or belong to any workspace. Applications or scripts using tokens generated by the deleted user can no longer access Databricks APIs.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `id` | string | Yes | Unique identifier for the user in the Databricks workspace. This is the user's Databricks ID |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Patch IAM User V2

**Slug:** `DATABRICKS_IAM_USERS_V2_PATCH`

Tool to partially update a user using SCIM 2.0 PATCH operations. Use when you need to modify user attributes like active status, displayName, userName, name fields, emails, groups, entitlements, or roles without replacing the entire resource. All operations in a single request are atomic.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `id` | string | Yes | Unique identifier for the user in the Databricks workspace to update |
| `Operations` | array | Yes | Array of one or more PATCH operation objects. All operations in a single PATCH request are atomic - if any operation fails, the entire request is rolled back. Common operations include activating/deactivating users, updating display names or userNames, managing group memberships, adding/removing entitlements, and managing roles. |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Update IAM User V2

**Slug:** `DATABRICKS_IAM_USERS_V2_UPDATE`

Tool to update a user in a Databricks workspace using the SCIM v2 protocol. This performs a complete replacement (PUT) of the user resource. Use when you need to update user properties including userName, displayName, active status, emails, entitlements, or roles.

Important limitations:

- Groups cannot be updated via the workspace-level API; groups for account-level users are managed at the account level only.
- For partial updates (updating specific fields without replacing the entire resource), use the PATCH operation instead.
- The `groups` parameter is included for response compatibility but is ignored in requests to avoid API errors.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `id` | string | Yes | Databricks user ID to update |
| `name` | object | No | SCIM name object for user names. |
| `roles` | array | No | Array of roles to assign to the user. This replaces the entire role list. |
| `active` | boolean | No | Account activation status (true = active, false = inactive) |
| `emails` | array | No | Array of email objects for the user |
| `groups` | array | No | DEPRECATED: Groups cannot be updated via workspace SCIM API. This parameter is kept for schema compatibility but will be ignored. Group memberships for account users must be managed at the account level. |
| `userName` | string | Yes | Unique identifier for the user account, typically the user's email address. This field is required for complete replacement updates. |
| `externalId` | string | No | External system reference identifier (reserved for future use) |
| `displayName` | string | No | String representing concatenation of given and family names |
| `entitlements` | array | No | Array of entitlements to assign to the user. This replaces the entire entitlement list. |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get Workspace Access Detail

**Slug:** `DATABRICKS_IAMV2_WORKSPACE_IAM_V2_GET_WORKSPACE_ACCESS`

Retrieves workspace access details for a specific principal (user, service principal, or group) in Databricks. Returns information about the principal's workspace access including their principal ID, workspace ID, account ID, principal type (USER/SERVICE_PRINCIPAL/GROUP), access type (DIRECT/INHERITED), and access status (ACTIVE/INACTIVE). Use this to verify workspace access assignments and understand how identities are granted access to the workspace.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `id` | string | Yes | The principal ID (user ID, service principal ID, or group ID) to retrieve workspace access details for. This is the unique numeric identifier of the identity whose workspace access you want to check. |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### List Job Compliance for Policy

**Slug:** `DATABRICKS_JOB_COMPLIANCE_LISTCOMPLIANCE`

Tool to retrieve policy compliance status of all jobs using a given cluster policy. Use when you need to identify jobs that are out of compliance because the policy was updated after the job was last edited. Jobs are non-compliant when their job clusters no longer meet the requirements of the updated policy.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `page_size` | integer | No | Number of results to return per page. If not specified, the default page size is used. |
| `policy_id` | string | Yes | The cluster policy identifier to check compliance against. This field is required. |
| `page_token` | string | No | Token for pagination to retrieve next page of results. Use the next_page_token from the previous response. |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get Job Run By ID

**Slug:** `DATABRICKS_JOB_RUN_BY_ID`

Tool to retrieve metadata of a single Databricks job run by ID. Use when you need to get detailed information about a specific job run including state, timing, and cluster configuration. Runs are automatically removed after 60 days.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `run_id` | integer | Yes | The canonical identifier of the run for which to retrieve the metadata. This field is required. |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Cancel All Databricks Job Runs

**Slug:** `DATABRICKS_JOBS_JOBS_CANCEL_ALL_RUNS`

Cancel all active runs of a Databricks job asynchronously. Requires either job_id or all_queued_runs=true. With job_id: cancels all active runs of the specified job. With all_queued_runs=true (no job_id): cancels all queued runs in the workspace. Cancellation is asynchronous and does not prevent new runs from starting. Use with caution when cancelling workspace-wide runs.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `job_id` | integer | No | The canonical identifier of the job to cancel all runs of. If provided, cancels all active runs of this specific job. Either this field or all_queued_runs must be provided. |
| `all_queued_runs` | boolean | No | When true, cancels all queued runs in the entire workspace (if no job_id provided) or all queued runs of the specified job (if job_id provided). Either this field (set to true) or job_id must be provided. Setting to false has no effect. |
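
The either/or rule above can be enforced client-side before sending the request. A sketch — the helper function is a hypothetical convenience, not part of this tool:

```python
# Hypothetical request-body builder for cancel-all-runs: require
# job_id, or all_queued_runs=True, or both, but never neither.
def cancel_all_runs_body(job_id=None, all_queued_runs=False):
    if job_id is None and not all_queued_runs:
        raise ValueError("either job_id or all_queued_runs=True is required")
    body = {}
    if job_id is not None:
        body["job_id"] = job_id
    if all_queued_runs:
        body["all_queued_runs"] = True
    return body
```

With only `all_queued_runs=True`, the cancellation applies workspace-wide, so that path deserves extra caution.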

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Cancel Databricks Job Run

**Slug:** `DATABRICKS_JOBS_JOBS_CANCEL_RUN`

Tool to cancel a Databricks job run asynchronously. Use when you need to terminate a running job. The run will be terminated shortly after the request completes. If the run is already in a terminal state, this is a no-op.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `run_id` | integer | Yes | The canonical identifier of the run to cancel. This field is required to specify which job run should be cancelled. The cancellation happens asynchronously. |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Delete Databricks Job Run

**Slug:** `DATABRICKS_JOBS_JOBS_DELETE_RUN`

Tool to delete a non-active Databricks job run. Use when you need to remove a job run from the workspace. The run must be in a non-active state; attempting to delete an active run will return an error. Runs are automatically removed after 60 days.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `run_id` | integer | Yes | The canonical identifier of the run to delete. This is the ID of the non-active job run that should be deleted. Note that runs are automatically removed after 60 days. |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get Databricks Job Details

**Slug:** `DATABRICKS_JOBS_JOBS_GET`

Tool to retrieve detailed information about a single Databricks job. Use when you need to get comprehensive job configuration including tasks, schedules, notifications, and cluster settings. For jobs with more than 100 tasks or job clusters, use the page_token parameter to paginate through results.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `job_id` | integer | Yes | The canonical identifier of the job to retrieve information about. This field is required. |
| `page_token` | string | No | Token for pagination. Use the next_page_token value returned from a previous request to retrieve the next page of results. Required when the tasks or job_clusters arrays exceed 100 elements. |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get Job Permission Levels

**Slug:** `DATABRICKS_JOBS_JOBS_GET_PERMISSION_LEVELS`

Tool to retrieve available permission levels for a Databricks job. Use when you need to understand what permission levels can be assigned to users, groups, or service principals for a specific job. Returns permission levels like CAN_VIEW, CAN_MANAGE_RUN, CAN_MANAGE, and IS_OWNER with their descriptions.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `job_id` | string | Yes | The unique identifier of the job for which to retrieve available permission levels |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Set Databricks Job Permissions

**Slug:** `DATABRICKS_JOBS_JOBS_SET_PERMISSIONS`

Tool to set permissions for a Databricks job, completely replacing all existing permissions. Use when you need to configure access control for a job. This operation replaces ALL existing permissions with the provided list. To remove all permissions except the owner, provide an empty array. The job must have exactly one owner (cannot be a group).

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `job_id` | string | Yes | The unique identifier of the job for which to set permissions |
| `access_control_list` | array | Yes | Array of access control entries. This replaces ALL existing permissions. To remove all permissions except the owner, provide an empty array. For each entry, only one of user_name, group_name, or service_principal_name should be specified. The job must have exactly one owner |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
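
A minimal sketch of assembling an `access_control_list` that satisfies the constraints above (exactly one owner, one principal field per entry). The helper name and the viewer/manager split are illustrative; the entry shape follows the parameter description.

```python
def job_acl(owner_user, viewer_groups=(), manager_users=()):
    """Build an access_control_list with exactly one non-group owner.
    Remember: this payload replaces ALL existing permissions."""
    acl = [{"user_name": owner_user, "permission_level": "IS_OWNER"}]
    acl += [{"group_name": g, "permission_level": "CAN_VIEW"}
            for g in viewer_groups]
    acl += [{"user_name": u, "permission_level": "CAN_MANAGE"}
            for u in manager_users]
    return acl

# Hypothetical principals for illustration only.
acl = job_acl("alice@example.com",
              viewer_groups=("data-readers",),
              manager_users=("bob@example.com",))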

### Get Job Policy Compliance

**Slug:** `DATABRICKS_JOBS_POLICY_COMPL_FOR_JOBS_GET_COMPL`

Tool to retrieve policy compliance status for a specific job. Use when you need to check whether a job meets the requirements of its assigned policies and identify any policy violations. A job can fall out of compliance if a policy it uses was updated after the job was last edited, leaving some of its job clusters no longer compliant with the updated policy.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `job_id` | integer | Yes | The ID of the job whose compliance status you are requesting |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### List Unity Catalogs

**Slug:** `DATABRICKS_LIST_CATALOGS`

Tool to retrieve a list of all catalogs in the Unity Catalog metastore. Use when you need to discover available catalogs based on user permissions. If the caller is the metastore admin, all catalogs will be retrieved. Otherwise, only catalogs owned by the caller or for which the caller has the USE_CATALOG privilege will be retrieved.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `page_token` | string | No | Opaque pagination token to retrieve the next page based on previous query. Obtained from the next_page_token field in the previous response. |
| `max_results` | integer | No | Maximum number of catalogs to return. If not specified or negative, all catalogs are returned in a single unpaginated response (this behavior is being deprecated). Use 0 for the server default page size, or a positive number for a specific page size. |
| `include_browse` | boolean | No | Whether to include catalogs in the response for which the principal can only access selective metadata through the BROWSE privilege. Default is false. |
| `include_unbound` | boolean | No | Whether to include catalogs that are not bound to the current workspace. Only effective if the user has permission to update the catalog-workspace binding. Default is false. |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
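
The parameters above map onto a query string for the underlying Unity Catalog call; a minimal sketch (the function name is illustrative, and `max_results=0` requests the server default page size as described):

```python
from urllib.parse import urlencode

def list_catalogs_query(max_results=0, include_browse=False, page_token=None):
    """Build the query string for a list-catalogs request.
    max_results=0 asks for the server default page size."""
    params = {"max_results": max_results}
    if include_browse:
        params["include_browse"] = "true"
    if page_token:
        params["page_token"] = page_token
    return urlencode(params)

first_page = list_catalogs_query()
next_page = list_catalogs_query(include_browse=True, page_token="abc123")
```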

### List Clusters

**Slug:** `DATABRICKS_LIST_CLUSTERS`

Tool to list all pinned, active, and recently terminated Databricks clusters. Use when you need to retrieve cluster information, monitor cluster status, or get an overview of available compute resources. Returns clusters terminated within the last 30 days along with currently active clusters. Supports filtering by state, source, and policy, with pagination for large result sets.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `sort_by` | object | No | Sorting criteria for cluster listing. |
| `filter_by` | object | No | Filter criteria for cluster listing. |
| `page_size` | integer | No | Maximum number of results per page for pagination |
| `page_token` | string | No | Token for navigating to next or previous pages in paginated results |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### List Workspace Groups

**Slug:** `DATABRICKS_LIST_GROUPS`

Tool to list all groups in the Databricks workspace using SCIM v2 protocol. Use when you need to retrieve all groups or search for specific groups using filters and pagination.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `count` | integer | No | Maximum number of results to return per page. Negative values are treated as zero |
| `filter` | string | No | Query parameter to filter results. Supports operators: equals (eq), contains (co), starts with (sw), not equals (ne). Logical operators 'and' and 'or' can form simple expressions. Example: 'displayName sw "eng"' filters groups whose display name starts with 'eng' |
| `sortBy` | string | No | Attribute name to sort results by. For multi-valued attributes, uses primary or first value. Complex attributes require sub-attribute notation |
| `sortOrder` | string ("ascending" or "descending") | No | Sort direction, either 'ascending' or 'descending'. Defaults to ascending |
| `attributes` | string | No | Comma-separated list of attribute names to include in the response. Overrides default attribute set |
| `startIndex` | integer | No | The 1-based index position of the first result to return. First item is number 1. Used for pagination |
| `excludedAttributes` | string | No | Comma-separated list of attribute names to exclude from the response |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
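
The SCIM filter grammar described above (`eq`, `co`, `sw`, `ne`) can be combined with `count` and `startIndex` into query parameters. A sketch, with a hypothetical helper name and the `displayName sw` example from the table:

```python
from urllib.parse import urlencode

def scim_group_query(starts_with=None, count=100, start_index=1):
    """Build SCIM v2 query parameters for listing groups; the filter
    expression uses the 'sw' (starts with) operator."""
    params = {"count": count, "startIndex": start_index}
    if starts_with:
        params["filter"] = f'displayName sw "{starts_with}"'
    return urlencode(params)

query = scim_group_query("eng")
```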

### List Instance Pools

**Slug:** `DATABRICKS_LIST_INSTANCE_POOLS`

Tool to retrieve a list of all active instance pools in the Databricks workspace with their statistics and configuration. Use when you need to get an overview of all available instance pools.

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### List All Databricks Jobs (API 2.0)

**Slug:** `DATABRICKS_LIST_JOBS`

Tool to list all jobs in the Databricks workspace using API 2.0. Use when you need to retrieve all jobs without pagination. Note: API 2.0 does not support pagination or filtering. For pagination support, use the API 2.2 endpoint instead.

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### List Members of a Security Group

**Slug:** `DATABRICKS_LIST_MEMBERS_OF_A_SECURITY_GROUP`

Tool to retrieve all members (users and nested groups) of a Databricks security group. Use when you need to see who belongs to a specific group for access control auditing or management. This method is non-recursive and does not expand nested group memberships.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `group_name` | string | Yes | The name of the group whose members you want to list. Administrator privileges are required to invoke this operation. |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### List Model Serving Endpoints

**Slug:** `DATABRICKS_LIST_MODEL_SERVING_ENDPOINTS`

Tool to list all model serving endpoints in the Databricks workspace, including both foundation model endpoints (e.g., GPT, Claude, Llama) and custom model endpoints. Returns comprehensive information for each endpoint:

- Endpoint name, ID, type, and task (chat, completions, embeddings)
- Current state and readiness status
- Served entities/models configuration with pricing details
- Endpoint capabilities (function calling, image input, long context)
- Traffic routing configuration
- Permission level and metadata

Use this to discover available AI models, check endpoint status, or gather configuration details for making API calls.

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### List Delta Live Tables Pipelines

**Slug:** `DATABRICKS_LIST_PIPELINES`

Tool to list Delta Live Tables pipelines in the workspace. Use when you need to retrieve a paginated list of pipelines with summary information. The pipeline specification field is not returned by this endpoint - only summary information is provided. For complete pipeline details, use the get pipeline endpoint.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `filter` | string | No | Narrows results based on criteria using SQL-like syntax. Supports filtering by notebook path or name patterns with wildcards. Composite filters are unsupported. |
| `order_by` | array | No | Ordering specification for results. Supported fields are 'id' and 'name'. Default ordering is 'id asc'. |
| `page_token` | string | No | Opaque pagination token from a previous call to fetch the next page of results. Cannot be used together with filter parameter. |
| `max_results` | integer | No | Maximum number of pipelines to return per page. Default is 25; maximum allowed value is 100. An error occurs if exceeded. |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### List Repos

**Slug:** `DATABRICKS_LIST_REPOS`

Tool to list Git repos that the calling user has Manage permissions on. Use when you need to retrieve all available repos in the workspace. Supports pagination and filtering by path prefix.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `path_prefix` | string | No | Filters repos whose paths start with the given prefix. If omitted, or if the prefix is effectively empty (/ or /Workspace), Git folders (repos) under /Workspace/Repos are returned. |
| `next_page_token` | string | No | Token used to get the next page of results. If not specified, returns the first page of results as well as a next page token if there are more results. |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### List Databricks Job Runs

**Slug:** `DATABRICKS_LIST_RUNS`

Tool to list Databricks job runs in descending order by start time. Use when you need to retrieve job runs with optional filtering by job ID, run status, and type. Supports pagination via offset and limit parameters. Runs are automatically removed after 60 days.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `limit` | integer | No | The number of runs to return. This value should be greater than 0 and less than 1000. The default value is 20. If 0 is specified, the maximum limit is used. |
| `job_id` | integer | No | The job for which to list runs. If omitted, the Jobs service will list runs from all jobs. |
| `offset` | integer | No | The offset of the first run to return, relative to the most recent run. Used for pagination. |
| `run_type` | string | No | The type of runs to return. Supported values: JOB_RUN, WORKFLOW_RUN, SUBMIT_RUN. |
| `active_only` | boolean | No | If true, only active runs are included in the results (PENDING, RUNNING, TERMINATING states); otherwise, lists both active and completed runs. Cannot be true when completed_only is true. |
| `expand_tasks` | boolean | No | Whether to include task and cluster details in the response. When true, provides comprehensive information about each run's tasks and associated clusters. |
| `start_time_to` | integer | No | Show runs that started at or before this value (epoch milliseconds). Use with start_time_from to filter runs by time range. |
| `completed_only` | boolean | No | If true, only completed runs are included in the results. Cannot be true when active_only is true. |
| `start_time_from` | integer | No | Show runs that started at or after this value (epoch milliseconds). Use with start_time_to to filter runs by time range. |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
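
The filter constraints above, in particular that `active_only` and `completed_only` cannot both be true, can be enforced when assembling the request parameters. A sketch with an illustrative helper name; times are epoch milliseconds as described:

```python
def list_runs_params(job_id=None, active_only=False, completed_only=False,
                     start_time_from=None, start_time_to=None, limit=20):
    """Assemble list-runs query parameters, rejecting the invalid
    combination of active_only and completed_only both set."""
    if active_only and completed_only:
        raise ValueError("active_only and completed_only are mutually exclusive")
    params = {"limit": limit}
    if job_id is not None:
        params["job_id"] = job_id
    if active_only:
        params["active_only"] = "true"
    if completed_only:
        params["completed_only"] = "true"
    if start_time_from is not None:
        params["start_time_from"] = start_time_from  # epoch milliseconds
    if start_time_to is not None:
        params["start_time_to"] = start_time_to      # epoch milliseconds
    return params

params = list_runs_params(job_id=7, completed_only=True,
                          start_time_from=1700000000000)
```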

### List Catalog Schemas

**Slug:** `DATABRICKS_LIST_SCHEMAS`

Tool to retrieve all schemas in a specified catalog from Unity Catalog. Use when you need to discover available schemas within a catalog based on user permissions. If the caller is the metastore admin or owner of the parent catalog, all schemas will be retrieved. Otherwise, only schemas owned by the caller or for which the caller has the USE_SCHEMA privilege will be retrieved.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `page_token` | string | No | Opaque pagination token to retrieve the next page based on previous query. Obtained from the next_page_token field in the previous response. |
| `max_results` | integer | No | Maximum number of schemas to return. If not specified, all schemas are returned. Use 0 for server default page size (recommended for pagination), positive number for specific page size. Negative values return an error. |
| `catalog_name` | string | Yes | Parent catalog for schemas of interest. The name of the catalog to list schemas from. |
| `include_browse` | boolean | No | Whether to include schemas in the response for which the principal can only access selective metadata through the BROWSE privilege. When enabled, schemas where the user has limited metadata access will be included in results. |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### List Secrets

**Slug:** `DATABRICKS_LIST_SECRETS`

Tool to list all secret keys stored in a Databricks secret scope. Use when you need to retrieve metadata about secrets in a scope (does not return secret values). Requires READ permission on the scope.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `scope` | string | Yes | The name of the scope to list secrets within |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### List Secret Scopes

**Slug:** `DATABRICKS_LIST_SECRET_SCOPES`

Tool to list all secret scopes available in the Databricks workspace. Use when you need to retrieve all secret scopes including their names, backend types (DATABRICKS or AZURE_KEYVAULT), and Key Vault metadata for Azure-backed scopes.

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### List SQL Warehouses

**Slug:** `DATABRICKS_LIST_SQL_WAREHOUSES`

Tool to list all SQL warehouses in the Databricks workspace. Use when you need to retrieve information about available SQL compute resources for running SQL commands. Returns the full list of SQL warehouses the user has access to, including their configuration, state, and connection details.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `page_size` | integer | No | The maximum number of warehouses to return per page |
| `page_token` | string | No | A page token received from a previous ListWarehouses call. Provide this to retrieve the subsequent page; otherwise the first page will be retrieved. When paginating, all other parameters provided to ListWarehouses must match the call that provided the page token |
| `run_as_user_id` | integer | No | Service Principal which will be used to fetch the list of endpoints. If not specified, SQL Gateway will use the user from the session header |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### List Catalog Tables

**Slug:** `DATABRICKS_LIST_TABLES`

Tool to list all tables in a Unity Catalog schema with pagination support. Use when you need to retrieve tables from a specific catalog and schema combination. The API is paginated by default - continue reading pages using next_page_token until it's absent to ensure all results are retrieved.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `page_token` | string | No | Opaque pagination token to go to the next page based on previous query. Continue reading pages until next_page_token is absent in the response |
| `max_results` | integer | No | Maximum number of tables to return. If not set, all tables are returned (not recommended). When set to a value greater than 0, the page length is the minimum of this value and a server configured value; when set to 0, the page length is set to a server configured value (recommended) |
| `schema_name` | string | Yes | Parent schema of tables to list |
| `catalog_name` | string | Yes | Name of parent catalog for tables of interest |
| `omit_columns` | boolean | No | Whether to omit the columns of the table from the response |
| `omit_username` | boolean | No | Whether to omit the username of the table (e.g. owner, updated_by, created_by) |
| `include_browse` | boolean | No | Whether to include tables in the response for which the principal can only access selective metadata through the BROWSE privilege |
| `omit_properties` | boolean | No | Whether to omit the properties of the table from the response |
| `include_manifest_capabilities` | boolean | No | Whether to include a manifest containing table capabilities in the response. Use this to retrieve tables that support credential vending (marked HAS_DIRECT_EXTERNAL_ENGINE_READ_SUPPORT or HAS_DIRECT_EXTERNAL_ENGINE_WRITE_SUPPORT) |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
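
The "continue reading pages until next_page_token is absent" pattern described above can be sketched as a generator that yields fully qualified table names. `fetch_page` stands in for the underlying API call; the stub pages and the `main.sales` names are illustrative.

```python
def iter_table_names(fetch_page, catalog_name, schema_name):
    """Yield catalog.schema.table names, reading pages until
    next_page_token is absent in the response."""
    token = None
    while True:
        page = fetch_page(catalog_name, schema_name, token)
        for t in page.get("tables", []):
            yield f"{catalog_name}.{schema_name}.{t['name']}"
        token = page.get("next_page_token")
        if not token:
            break

# Stub responses simulating two pages of tables.
pages = {
    None: {"tables": [{"name": "orders"}], "next_page_token": "p2"},
    "p2": {"tables": [{"name": "customers"}]},
}
names = list(iter_table_names(lambda c, s, tok: pages[tok], "main", "sales"))
```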

### List Tokens

**Slug:** `DATABRICKS_LIST_TOKENS`

Tool to list all valid personal access tokens (PATs) for a user-workspace pair. Use when you need to retrieve all tokens associated with the authenticated user in the current workspace. Note that each PAT is valid for only one workspace, and Databricks automatically revokes PATs that haven't been used for 90 days.

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### List Users

**Slug:** `DATABRICKS_LIST_USERS`

Tool to list all users in a Databricks workspace using SCIM 2.0 protocol. Use when you need to retrieve user identities and their attributes. Supports filtering, pagination, and sorting.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `count` | integer | No | Maximum number of user records to return per request. Databricks returns at most 100 users at a time. |
| `filter` | string | No | SCIM-compliant filter expression to search users by specific criteria. Supports operators: eq (equals), co (contains), sw (starts with), ne (not equals). Can use logical operators 'and' and 'or'. Example: 'userName eq "user@example.com"' |
| `sortBy` | string | No | Attribute name to sort results by. Supports multi-part paths such as userName, name.givenName, and emails. |
| `sortOrder` | string ("ascending" or "descending") | No | Sort direction. Valid values: 'ascending' or 'descending'. |
| `attributes` | string | No | Comma-separated list of specific attributes to include in response. Allows selective field retrieval. |
| `startIndex` | integer | No | Starting position for paginated results using 1-based indexing. First item is number 1. Used with count parameter for pagination. |
| `excludedAttributes` | string | No | Comma-separated list of attributes to exclude from response. Inverse of attributes parameter. |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
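
Unlike token-based endpoints, SCIM pages with the 1-based `startIndex` and `count` parameters described above. A sketch of that loop, stopping when a short page comes back; `fetch_page` and the stub directory stand in for the real call:

```python
def iter_users(fetch_page, count=100):
    """Page through SCIM users with 1-based startIndex, stopping when a
    page returns fewer than `count` resources."""
    start = 1
    while True:
        resources = fetch_page(start, count)
        yield from resources
        if len(resources) < count:
            break
        start += count

# Stub directory of 250 users; fetch slices it like a SCIM page.
directory = [f"user{i}@example.com" for i in range(250)]
fetch = lambda start, count: directory[start - 1:start - 1 + count]
users = list(iter_users(fetch, count=100))
```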

### List Vector Search Endpoints

**Slug:** `DATABRICKS_LIST_VECTOR_SEARCH_ENDPOINTS`

Tool to list all vector search endpoints in the Databricks workspace. Use when you need to retrieve information about vector search endpoints which represent compute resources hosting vector search indexes. Supports pagination for handling large result sets.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `page_token` | string | No | Token used for pagination to retrieve subsequent pages of results. When present in the response as next_page_token, pass it in the next request to get the next page. |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get Marketplace Consumer Listing

**Slug:** `DATABRICKS_MARKETPLACE_CONSUMER_LISTINGS_GET`

Tool to retrieve a published listing from Databricks Marketplace that the consumer has access to. Use when you need to get detailed information about a specific marketplace listing by its ID. Requires Unity Catalog permissions to access marketplace assets.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `id` | string | Yes | Unique identifier of the marketplace listing to retrieve. This is a UUID-formatted string that identifies a specific published listing |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get Marketplace Consumer Provider

**Slug:** `DATABRICKS_MARKETPLACE_CONSUMER_PROVIDERS_GET`

Tool to retrieve information about a specific provider in the Databricks Marketplace with visible listings. Use when you need to get provider details including contact information, description, and metadata.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `id` | string | Yes | The provider identifier to retrieve. This is a unique identifier for the marketplace provider |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Create Marketplace Provider Listing

**Slug:** `DATABRICKS_MARKETPLACE_PROVIDER_LISTINGS_CREATE`

Tool to create a new listing in Databricks Marketplace for data providers. Use when you need to publish data products, datasets, models, or notebooks to the marketplace. Requires a listing object with summary information (name and listing_type). For free and instantly available data products, a share must be included during creation.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `listing` | object | Yes | Complete listing configuration including summary, optional share, and detailed metadata |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get Marketplace Provider Listing

**Slug:** `DATABRICKS_MARKETPLACE_PROVIDER_LISTINGS_GET`

Tool to retrieve a specific marketplace provider listing by its identifier. Use when you need to get detailed information about a published or draft listing including metadata, configuration, and assets.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `id` | string | Yes | The unique identifier of the listing to retrieve |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Create Marketplace Consumer Installation

**Slug:** `DATABRICKS_MKTPLACE_CONSUMER_INSTALLATIONS_CREATE`

Tool to create a marketplace consumer installation for Databricks Marketplace listings. Use when you need to install data products, datasets, notebooks, models, or other marketplace offerings into a workspace. Requires acceptance of consumer terms and the listing ID to proceed with installation.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `id` | string | Yes | The listing ID to install from |
| `share_name` | string | No | The name of the share |
| `repo_detail` | object | No | Configuration for git repository installations. |
| `catalog_name` | string | No | The name of the catalog for installation |
| `recipient_type` | string ("DELTA_SHARING_RECIPIENT_TYPE_DATABRICKS" or "DELTA_SHARING_RECIPIENT_TYPE_OPEN") | No | Type of Delta Sharing recipient |
| `accepted_consumer_terms` | object | No | Consumer terms acceptance details. |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Delete Marketplace Consumer Installation

**Slug:** `DATABRICKS_MKTPLACE_CONSUMER_INSTALLATIONS_DELETE`

Tool to uninstall a Databricks Marketplace installation. Use when you need to remove an installed data product from your workspace. When an installation is deleted, the shared catalog is removed from the workspace. Requires CREATE CATALOG and USE PROVIDER permissions on the Unity Catalog metastore, or metastore admin role.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `listing_id` | string | Yes | The marketplace listing identifier to uninstall from |
| `installation_id` | string | Yes | The installation identifier to delete |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Update Marketplace Consumer Installation

**Slug:** `DATABRICKS_MKTPLACE_CONSUMER_INSTALLATIONS_UPDATE`

Tool to update marketplace consumer installation fields and rotate tokens for marketplace listings. Use when you need to modify installation attributes or refresh access credentials. The token will be rotated if the rotate_token flag is true.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `id` | string | No | Unique installation identifier |
| `status` | string ("INSTALLED" \| "FAILED") | No | Installation state |
| `tokens` | array | No | Collection of token records |
| `repo_name` | string | No | Git repository name for repo-based installs |
| `repo_path` | string | No | Repository file path reference |
| `listing_id` | string | Yes | The listing ID associated with the installation |
| `share_name` | string | No | Delta Share name for data assets |
| `catalog_name` | string | No | Target catalog for installation |
| `installed_on` | integer | No | Timestamp of installation in epoch milliseconds |
| `listing_name` | string | No | Human-readable listing name |
| `rotate_token` | boolean | No | Flag to rotate the token during the update operation. The token will be forcibly rotated if this flag is true and the tokenInfo field is empty. |
| `token_detail` | object | No | Details about access tokens. |
| `error_message` | string | No | Error details if installation failed |
| `recipient_type` | string ("DELTA_SHARING_RECIPIENT_TYPE_DATABRICKS" \| "DELTA_SHARING_RECIPIENT_TYPE_OPEN") | No | Recipient classification (Databricks or Open) |
| `installation_id` | string | Yes | The installation ID to update |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
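
As a sketch, a token-rotation update needs only the two required identifiers plus the flag; the helper below is hypothetical, but the field names match the parameter table:

```python
# Illustrative helper: build arguments for
# DATABRICKS_MKTPLACE_CONSUMER_INSTALLATIONS_UPDATE.
def build_update_args(listing_id, installation_id, rotate_token=False, **optional):
    if not listing_id or not installation_id:
        raise ValueError("listing_id and installation_id are both required")
    args = {"listing_id": listing_id, "installation_id": installation_id}
    if rotate_token:
        # Per the description above, the token is forcibly rotated when
        # this flag is true and the tokenInfo field is empty.
        args["rotate_token"] = True
    args.update(optional)  # e.g. catalog_name, status
    return args

args = build_update_args("lst-123", "inst-456", rotate_token=True)
```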

### Batch Get Marketplace Consumer Listings

**Slug:** `DATABRICKS_MKTPLACE_CONSUMER_LISTINGS_BATCH_GET`

Retrieve multiple published marketplace listings by their IDs in a single API call. Use this action when you need to fetch details for multiple listings efficiently instead of making individual GET requests. The action accepts up to 50 listing IDs per request and returns complete listing information including summaries, detailed descriptions, asset types, pricing, and metadata. Only listings that exist and are accessible to the caller are returned; invalid or inaccessible IDs are silently filtered out. Returns an empty array if no listings match or if the IDs parameter is omitted. Ideal for bulk operations like displaying multiple listings, syncing catalog data, or checking listing availability.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `ids` | array | No | List of UUID-formatted listing identifiers to retrieve. Maximum of 50 IDs per request. Returns only listings the caller has permission to access. Omit this parameter or pass an empty list to return no results |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
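
Because the endpoint accepts at most 50 IDs per request, larger ID sets have to be split client-side. A minimal chunking sketch (the helper is illustrative, not part of the tool):

```python
# Split a large list of listing IDs into batches of at most 50, one batch
# per DATABRICKS_MKTPLACE_CONSUMER_LISTINGS_BATCH_GET call.
def chunk_ids(ids, max_per_request=50):
    """Yield successive batches of at most `max_per_request` IDs."""
    for start in range(0, len(ids), max_per_request):
        yield ids[start:start + max_per_request]

# 120 placeholder IDs -> three requests (50 + 50 + 20).
batches = list(chunk_ids([f"id-{n}" for n in range(120)]))
```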

### Get Marketplace Consumer Personalization Requests

**Slug:** `DATABRICKS_MKTPLACE_CONSUMER_PERSONALIZATION_REQUESTS_GET`

Tool to retrieve personalization requests for a specific marketplace listing. Use when you need to check the status of customization or commercial transaction requests for a listing. Each consumer can make at most one personalization request per listing.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `listing_id` | string | Yes | The unique identifier of the marketplace listing for which to retrieve personalization requests |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Batch Get Marketplace Consumer Providers

**Slug:** `DATABRICKS_MKTPLACE_CONSUMER_PROVIDERS_BATCH_GET`

Retrieve multiple marketplace provider details in a single batch API call. Returns information about Databricks Marketplace providers that have at least one publicly visible listing. Use this tool when you need to get details for multiple providers efficiently (up to 50 providers per request). Provider information includes contact details, descriptions, branding assets, and policy links.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `ids` | array | No | A list of provider IDs (UUIDs) to retrieve. Maximum 50 IDs can be specified per request. If not provided, returns an empty list |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Create Provider Analytics Dashboard

**Slug:** `DATABRICKS_MKTPLACE_PROVIDER_ANALYTICS_DASHBOARDS_CREATE`

Tool to create a provider analytics dashboard for monitoring Databricks Marketplace listing metrics. Use when you need to establish analytics tracking for listing views, requests, installs, conversion rates, and consumer information. Requires Marketplace admin role and system tables to be enabled in the metastore.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `version` | integer | No | The version of the dashboard template to use. Should equal the latest version if updating an existing dashboard |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get Provider Analytics Dashboard

**Slug:** `DATABRICKS_MKTPLACE_PROVIDER_ANALYTICS_DASHBOARDS_GET`

Tool to retrieve provider analytics dashboard information for monitoring consumer usage metrics. Use when you need to access the dashboard ID to view marketplace listing performance including views, requests, installs, and conversion rates.

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get Latest Provider Analytics Dashboard Version

**Slug:** `DATABRICKS_MKTPLACE_PROVIDER_ANALYTICS_DASH_GET_LATEST`

Tool to retrieve the latest logical version of the provider analytics dashboard template. Use when you need to get the current dashboard template version for monitoring consumer usage metrics including listing views, requests, and installs.

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Delete Listing From Exchange

**Slug:** `DATABRICKS_MKTPLACE_PROVIDER_EXCHANGES_DELETE_LISTING`

Tool to remove the association between a marketplace exchange and a listing. Use when you need to disassociate an exchange from a provider listing. This removes the listing from the private exchange, and it will no longer be shared with the curated set of customers in that exchange.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `id` | string | Yes | The identifier for the exchange-listing association to remove. This ID represents the specific association between an exchange and a listing that should be disassociated |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Create ML Experiment

**Slug:** `DATABRICKS_ML_EXPERIMENTS_CREATE_EXPERIMENT`

Tool to create a new MLflow experiment for tracking machine learning runs and models. Use when you need to organize and track ML experiments within Databricks. Returns RESOURCE_ALREADY_EXISTS error if an experiment with the same name already exists.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `name` | string | Yes | Experiment name. This field is required and must be unique within the workspace |
| `tags` | array | No | A collection of tags to set on the experiment. Maximum tag size: keys up to 250 bytes, values up to 5000 bytes. Limited to 20 tags per request |
| `artifact_location` | string | No | Location where all artifacts for the experiment are stored. If not provided, the remote server will select an appropriate default |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
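
The documented tag limits (keys up to 250 bytes, values up to 5000 bytes, at most 20 tags per request) can be checked client-side before calling the tool. A sketch, with a hypothetical validator name:

```python
# Illustrative pre-flight check for DATABRICKS_ML_EXPERIMENTS_CREATE_EXPERIMENT
# tags; the limits mirror the parameter description above.
def validate_experiment_tags(tags):
    if len(tags) > 20:
        raise ValueError("at most 20 tags per request")
    for tag in tags:
        if len(tag["key"].encode("utf-8")) > 250:
            raise ValueError(f"tag key too large: {tag['key']!r}")
        if len(tag["value"].encode("utf-8")) > 5000:
            raise ValueError("tag value exceeds 5000 bytes")
    return tags

tags = validate_experiment_tags([{"key": "team", "value": "ml-platform"}])
```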

### Create Logged Model

**Slug:** `DATABRICKS_ML_EXPERIMENTS_CREATE_LOGGED_MODEL`

Tool to create a new logged model in MLflow that ties together model metadata, parameters, metrics, and artifacts. Use when you need to create a LoggedModel object as part of the unified 'log + register' workflow introduced in MLflow 3. LoggedModel objects persist throughout a model's lifecycle and provide a centralized way to track model information.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `name` | string | No | Name of the model. If not specified, the backend will generate one automatically |
| `tags` | array | No | LoggedModel tags as key-value pairs. Each tag consists of a key and a value |
| `params` | array | No | LoggedModel parameters as key-value pairs. Each parameter consists of a key and a value |
| `model_type` | string | No | The type of model, such as 'Agent', 'Classifier', 'LLM', or 'sklearn'. Used to search and compare related models |
| `experiment_id` | string | Yes | ID of the associated experiment. This field is required |
| `source_run_id` | string | No | Run ID of the run that created this model |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Create MLflow Experiment Run

**Slug:** `DATABRICKS_ML_EXPERIMENTS_CREATE_RUN`

Tool to create a new MLflow run within an experiment for tracking machine learning execution. Use when starting a new ML training run, experiment execution, or data pipeline that needs parameter and metric tracking. Returns the created run with a unique run_id for subsequent metric and parameter logging.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `tags` | array | No | Additional metadata for the run as key-value pairs. Tags can be used to categorize, filter, and organize runs. |
| `user_id` | string | No | ID of the user executing the run. This field is deprecated as of MLflow 1.0 and should not be used in new implementations. |
| `run_name` | string | No | Name assigned to the run. This is a human-readable identifier for the run. |
| `start_time` | integer | No | Unix timestamp in milliseconds of when the run started. If not provided, the current time will be used. |
| `experiment_id` | string | No | The associated experiment identifier. This is the ID of the experiment under which the run will be created. |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
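
A sketch of assembling the run arguments, including the epoch-millisecond `start_time` default described above (the helper itself is hypothetical):

```python
import time

# Illustrative helper: build DATABRICKS_ML_EXPERIMENTS_CREATE_RUN arguments.
# start_time falls back to "now" in milliseconds when the caller omits it.
def build_create_run_args(experiment_id, run_name=None, start_time=None):
    args = {"experiment_id": experiment_id}
    if run_name:
        args["run_name"] = run_name
    args["start_time"] = (start_time if start_time is not None
                          else int(time.time() * 1000))
    return args

run_args = build_create_run_args("12345", run_name="baseline-xgb")
```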

### Delete ML Experiment

**Slug:** `DATABRICKS_ML_EXPERIMENTS_DELETE_EXPERIMENT`

Tool to delete an MLflow experiment and associated metadata, runs, metrics, params, and tags. Use when you need to remove an experiment from Databricks. If the experiment uses FileStore, artifacts associated with the experiment are also deleted.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `experiment_id` | string | Yes | ID of the associated experiment. This field is required |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Delete Logged Model

**Slug:** `DATABRICKS_ML_EXPERIMENTS_DELETE_LOGGED_MODEL`

Tool to delete a logged model from MLflow tracking. Use when you need to permanently remove a LoggedModel from the tracking server. The deletion is permanent and cannot be undone. LoggedModels track a model's lifecycle across different training and evaluation runs.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `model_id` | string | Yes | The unique identifier of the logged model to delete. This is the model_id that is assigned when a model is logged using mlflow.<model-flavor>.log_model() |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Delete Logged Model Tag

**Slug:** `DATABRICKS_ML_EXPERIMENTS_DELETE_LOGGED_MODEL_TAG`

Tool to delete a tag from a logged model in MLflow. Use when you need to remove metadata from a LoggedModel object. This operation is irreversible and permanently removes the tag from the logged model. Part of MLflow 3's logged model management capabilities.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `key` | string | Yes | The tag key to delete. Maximum size is 255 bytes. This operation is irreversible. |
| `model_id` | string | Yes | The unique identifier of the logged model from which to remove the tag. This is the model_id that is assigned when a model is logged using mlflow.<model-flavor>.log_model() |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Delete ML Experiment Run

**Slug:** `DATABRICKS_ML_EXPERIMENTS_DELETE_RUN`

Tool to mark an MLflow run for deletion in ML experiments. Use when you need to remove a specific run from Databricks. This is a soft delete: the run is marked for deletion rather than immediately removed, and it can be restored until it is permanently deleted.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `run_id` | string | Yes | ID of the run to delete. This field is required |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Delete ML Experiment Runs

**Slug:** `DATABRICKS_ML_EXPERIMENTS_DELETE_RUNS`

Tool to bulk delete runs in an ML experiment created before a specified timestamp. Use when you need to clean up old experiment runs. Only runs created prior to or at the specified timestamp are deleted. The maximum number of runs that can be deleted in one operation is 10000.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `max_runs` | integer | No | A positive integer that indicates the maximum number of runs to delete. The maximum allowed value for max_runs is 10000. If not specified, max_runs defaults to 10000 |
| `experiment_id` | string | Yes | The ID of the experiment containing the runs to delete. This field is required |
| `max_timestamp_millis` | integer | Yes | The maximum creation timestamp in milliseconds since the UNIX epoch for deleting runs. Only runs created prior to or at this timestamp are deleted. This field is required |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
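
For a typical cleanup such as "delete runs older than 30 days", `max_timestamp_millis` is just a cutoff in epoch milliseconds. A small arithmetic sketch:

```python
import time

# Compute the max_timestamp_millis cutoff for
# DATABRICKS_ML_EXPERIMENTS_DELETE_RUNS: runs created at or before this
# timestamp are eligible for deletion.
DAY_MS = 24 * 60 * 60 * 1000  # milliseconds per day

def cutoff_millis(days_ago, now_ms=None):
    now_ms = now_ms if now_ms is not None else int(time.time() * 1000)
    return now_ms - days_ago * DAY_MS

# With a fixed "now" for reproducibility: 30 days before the given epoch.
cutoff = cutoff_millis(30, now_ms=1_700_000_000_000)
```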

### Delete ML Experiment Run Tag

**Slug:** `DATABRICKS_ML_EXPERIMENTS_DELETE_TAG`

Tool to delete a tag from an MLflow experiment run. Use when you need to remove run metadata. This operation is irreversible and permanently removes the tag from the run.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `key` | string | Yes | Name of the tag to delete. Maximum size is 255 bytes. This operation is irreversible. |
| `run_id` | string | Yes | ID of the run that the tag was logged under. This is the unique identifier for the MLflow experiment run. |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Finalize Logged Model

**Slug:** `DATABRICKS_ML_EXPERIMENTS_FINALIZE_LOGGED_MODEL`

Tool to finalize a logged model in MLflow by updating its status to READY or FAILED. Use when custom model preparation logic is complete and you need to mark the model as ready for use or indicate that the upload failed. This is part of the logged models feature introduced in MLflow 3.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `status` | string ("LOGGED_MODEL_READY" \| "LOGGED_MODEL_UPLOAD_FAILED") | Yes | Final status to set on the model. Use LOGGED_MODEL_READY when model files are completely uploaded and ready for use, or LOGGED_MODEL_UPLOAD_FAILED if an error occurred during model upload |
| `model_id` | string | Yes | The unique identifier of the logged model to finalize. This ID is assigned when the model is initially created |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
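
Since `status` is a two-value enum, a small client-side guard avoids a round trip on typos. Sketch (helper name is illustrative):

```python
# Guard the status enum for DATABRICKS_ML_EXPERIMENTS_FINALIZE_LOGGED_MODEL.
VALID_STATUSES = {"LOGGED_MODEL_READY", "LOGGED_MODEL_UPLOAD_FAILED"}

def finalize_args(model_id, status):
    if status not in VALID_STATUSES:
        raise ValueError(f"status must be one of {sorted(VALID_STATUSES)}")
    return {"model_id": model_id, "status": status}

args = finalize_args("m-123", "LOGGED_MODEL_READY")
```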

### Get ML Experiment By Name

**Slug:** `DATABRICKS_ML_EXPERIMENTS_GET_BY_NAME`

Tool to retrieve MLflow experiment metadata by name. Use when you need to get experiment details using the experiment name. Returns deleted experiments but prefers active ones if both exist with the same name. Throws RESOURCE_DOES_NOT_EXIST if no matching experiment exists.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `experiment_name` | string | Yes | Name of the associated experiment. This field is required |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get ML Experiment

**Slug:** `DATABRICKS_ML_EXPERIMENTS_GET_EXPERIMENT`

Tool to retrieve metadata for an MLflow experiment by ID. Use when you need to get experiment details including name, artifact location, lifecycle stage, and tags. Works on both active and deleted experiments.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `experiment_id` | string | Yes | ID of the associated experiment. This field is required |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get Logged Model

**Slug:** `DATABRICKS_ML_EXPERIMENTS_GET_LOGGED_MODEL`

Tool to fetch logged model metadata by unique ID. Use when you need to retrieve a LoggedModel object representing a model logged to an MLflow Experiment. Returns comprehensive model information including metrics, parameters, tags, and artifact details.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `id` | string | Yes | The unique identifier (model_id) of the logged model to retrieve. Must be provided. |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get ML Experiment Permission Levels

**Slug:** `DATABRICKS_ML_EXPERIMENTS_GET_PERMISSION_LEVELS`

Tool to retrieve available permission levels for a Databricks ML experiment. Use when you need to understand what permission levels can be assigned to users, groups, or service principals for a specific experiment. Returns permission levels (CAN_READ, CAN_EDIT, CAN_MANAGE) with their descriptions.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `experiment_id` | string | Yes | The numeric string identifier of the MLflow experiment |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get ML Experiment Permissions

**Slug:** `DATABRICKS_ML_EXPERIMENTS_GET_PERMISSIONS`

Tool to retrieve permissions for an MLflow experiment. Use when you need to check who has access to an experiment and their permission levels. Note that notebook experiments inherit permissions from their corresponding notebook, while workspace experiments have independent permissions.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `experiment_id` | string | Yes | The ID of the experiment for which to get permissions |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get MLflow Run

**Slug:** `DATABRICKS_ML_EXPERIMENTS_GET_RUN`

Tool to retrieve complete information about a specific MLflow run including metadata, metrics, parameters, tags, inputs, and outputs. Use when you need to get details of a run by its run_id. Returns the most recent metric values when multiple metrics with the same key exist.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `run_id` | string | Yes | ID of the run to fetch. Must be provided. Unique identifier for the run. |
| `run_uuid` | string | No | Deprecated parameter; use run_id instead. Will be removed in future MLflow versions. |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Log Batch MLflow Data

**Slug:** `DATABRICKS_ML_EXPERIMENTS_LOG_BATCH`

Tool to log a batch of metrics, parameters, and tags for an MLflow run in a single request. Use when you need to efficiently log multiple metrics, params, or tags simultaneously. Items within each type are processed sequentially in the order specified. The combined total of all items across metrics, params, and tags cannot exceed 1000.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `tags` | array | No | Tags to log. Each tag includes key and value. Items are processed sequentially in order. Maximum 100 tags |
| `params` | array | No | Parameters to log. Each param includes key and value. Items are processed sequentially in order. Maximum 100 parameters |
| `run_id` | string | Yes | ID of the run to log under. This identifies the specific MLflow run for which metrics, params, and tags will be logged |
| `metrics` | array | No | Metrics to log. Each metric includes key, value, timestamp, and optional step. Items are processed sequentially in order |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
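
The documented batch limits (at most 100 params, 100 tags, and 1000 items combined) can be enforced before sending the request. Sketch, with a hypothetical validator:

```python
# Pre-flight check for DATABRICKS_ML_EXPERIMENTS_LOG_BATCH limits, as
# described above: <= 100 params, <= 100 tags, <= 1000 items combined.
def validate_batch(metrics=(), params=(), tags=()):
    if len(params) > 100:
        raise ValueError("at most 100 params per batch")
    if len(tags) > 100:
        raise ValueError("at most 100 tags per batch")
    total = len(metrics) + len(params) + len(tags)
    if total > 1000:
        raise ValueError("combined batch size cannot exceed 1000 items")
    return total

total = validate_batch(
    metrics=[{"key": "loss", "value": 0.42, "timestamp": 1_700_000_000_000}],
    params=[{"key": "lr", "value": "0.001"}],
    tags=[{"key": "stage", "value": "train"}],
)
```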

### Log MLflow Dataset Inputs

**Slug:** `DATABRICKS_ML_EXPERIMENTS_LOG_INPUTS`

Tool to log dataset inputs to an MLflow run for tracking data sources used during model development. Use when you need to record metadata about the datasets consumed by an ML experiment run, including the dataset source, schema, and tags, so that data lineage can be tracked throughout the ML lifecycle.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `run_id` | string | Yes | ID of the run to log under. This field is required. The unique identifier for the MLflow run where dataset inputs will be logged |
| `datasets` | array | No | Collection of dataset inputs being logged to the run. Each entry contains dataset metadata and optional tags |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Log Logged Model Parameters

**Slug:** `DATABRICKS_ML_EXPERIMENTS_LOG_LOGGED_MODEL_PARAMS`

Tool to log parameters for a logged model in MLflow. Use when you need to attach hyperparameters or metadata to a LoggedModel object. A param can be logged only once for a logged model, and attempting to overwrite an existing param will result in an error. Available in MLflow 3+.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `id` | string | Yes | The unique identifier (model_id) of the logged model to log params for. Must be provided |
| `params` | array | Yes | Parameters to attach to the logged model. Each parameter is a key-value pair where both key and value are strings |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
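
Since each param is a string key/value pair and params are write-once, a hyperparameter dict is typically stringified before logging. A sketch (the helper is illustrative):

```python
# Convert a hyperparameter dict into the key/value string pairs accepted
# by DATABRICKS_ML_EXPERIMENTS_LOG_LOGGED_MODEL_PARAMS. Params are
# write-once, so each key should be logged exactly once.
def to_param_list(params):
    return [{"key": k, "value": str(v)} for k, v in params.items()]

param_list = to_param_list({"learning_rate": 0.001, "max_depth": 8})
```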

### Log MLflow Metric

**Slug:** `DATABRICKS_ML_EXPERIMENTS_LOG_METRIC`

Tool to log a metric for an MLflow run with timestamp. Use when you need to record ML model performance metrics like accuracy, loss, or custom evaluation metrics. Metrics can be logged multiple times with different timestamps and values are never overwritten - each log appends to the metric history for that key.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `key` | string | Yes | Name of the metric. May only contain alphanumerics, underscores (_), dashes (-), periods (.), spaces ( ), and slashes (/). Maximum length is 250 characters |
| `step` | integer | No | Integer training step (iteration) at which the metric was calculated |
| `value` | number | Yes | Double value of the metric being logged. Note that some special values such as +/- Infinity may be replaced by other values depending on the store |
| `run_id` | string | Yes | ID of the run under which to log the metric. Must be provided |
| `timestamp` | integer | Yes | Unix timestamp in milliseconds at the time the metric was logged. This field is required |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
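
A sketch of assembling the metric payload, including the optional `step` and an epoch-millisecond timestamp (the helper itself is hypothetical; field names follow the parameter table):

```python
import time

# Illustrative helper: build a DATABRICKS_ML_EXPERIMENTS_LOG_METRIC
# payload. `step` is optional; `timestamp` is epoch milliseconds.
def build_metric(run_id, key, value, step=None, timestamp=None):
    payload = {
        "run_id": run_id,
        "key": key,
        "value": float(value),
        "timestamp": (timestamp if timestamp is not None
                      else int(time.time() * 1000)),
    }
    if step is not None:
        payload["step"] = step
    return payload

metric = build_metric("run-abc", "val_accuracy", 0.91, step=10,
                      timestamp=1_700_000_000_000)
```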

### Log MLflow Model

**Slug:** `DATABRICKS_ML_EXPERIMENTS_LOG_MODEL`

Tool to log a model artifact for an MLflow run (Experimental API). Use when you need to record model metadata including artifact paths, flavors, and versioning information for a training run. The model_json parameter should contain a complete MLmodel specification in JSON string format.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `run_id` | string | Yes | ID of the run to log the model under. This field is required |
| `model_json` | string | Yes | MLmodel file in JSON format as a string. Contains model metadata including artifact_path, flavors, mlflow_version, model_uuid, and utc_time_created. This field is required |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
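
A minimal sketch of building the `model_json` string: the field names follow the parameter description above (artifact_path, flavors, mlflow_version, model_uuid, utc_time_created), but the concrete values and flavor contents here are hypothetical.

```python
import json
import uuid
from datetime import datetime, timezone

# Hypothetical minimal MLmodel specification serialized to a JSON string.
mlmodel = {
    "artifact_path": "model",
    "flavors": {"python_function": {"loader_module": "mlflow.sklearn"}},
    "mlflow_version": "2.9.2",
    "model_uuid": uuid.uuid4().hex,
    "utc_time_created": datetime.now(timezone.utc).strftime("%Y-%m-%d %H:%M:%S.%f"),
}

# The tool expects this as a string, not a nested object.
model_json = json.dumps(mlmodel)
```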

### Log MLflow Dataset Outputs

**Slug:** `DATABRICKS_ML_EXPERIMENTS_LOG_OUTPUTS`

Tool to log dataset outputs from an MLflow run for tracking data generated during model development. Use when you need to track metadata about datasets produced by ML experiment runs, including information about predictions, model outputs, or generated data. Enables logging of dataset outputs to a run, allowing you to track generated data throughout the ML lifecycle.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `run_id` | string | Yes | ID of the run under which to log dataset outputs. This field is required |
| `datasets` | array | No | Collection of dataset outputs being logged to the run. Each entry contains dataset metadata and optional tags |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Log MLflow Parameter

**Slug:** `DATABRICKS_ML_EXPERIMENTS_LOG_PARAM`

Tool to log a parameter for an MLflow run as a key-value pair. Use when you need to record hyperparameters or constant values for ML model training or ETL pipelines. Parameters can only be logged once per run and cannot be changed after logging. Logging identical parameters is idempotent.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `key` | string | Yes | Name of the param. Maximum size is 255 bytes. This field is required |
| `value` | string | Yes | String value of the param being logged. Maximum size is 6000 bytes |
| `run_id` | string | Yes | ID of the run under which to log the param. Must be provided |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
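
The documented size limits can be enforced client-side before the call. A sketch, assuming the byte limits from the table above (the helper name is illustrative):

```python
def build_log_param_payload(run_id, key, value):
    """Sketch of the request body with the documented size limits enforced."""
    if len(key.encode("utf-8")) > 255:
        raise ValueError("param key exceeds 255 bytes")
    if len(value.encode("utf-8")) > 6000:
        raise ValueError("param value exceeds 6000 bytes")
    return {"run_id": run_id, "key": key, "value": value}
```

Because parameters are write-once per run, re-logging the same (key, value) pair is a no-op, while re-logging the same key with a different value fails.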

### Restore ML Experiment

**Slug:** `DATABRICKS_ML_EXPERIMENTS_RESTORE_EXPERIMENT`

Tool to restore a deleted MLflow experiment and its associated metadata, runs, metrics, params, and tags. Use when you need to recover a previously deleted experiment from Databricks. If the experiment uses FileStore, underlying artifacts are also restored.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `experiment_id` | string | Yes | ID of the associated experiment. This field is required |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Restore ML Experiment Run

**Slug:** `DATABRICKS_ML_EXPERIMENTS_RESTORE_RUN`

Tool to restore a deleted MLflow run and its associated metadata, metrics, params, and tags. Use when you need to recover a previously deleted run from Databricks ML experiments. The operation cannot restore runs that were permanently deleted.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `run_id` | string | Yes | ID of the run to restore. This field is required |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Restore ML Experiment Runs

**Slug:** `DATABRICKS_ML_EXPERIMENTS_RESTORE_RUNS`

Tool to bulk restore runs in an ML experiment that were deleted at or after a specified timestamp. Use when you need to recover multiple deleted experiment runs. Only runs deleted at or after the specified timestamp are restored. The maximum number of runs that can be restored in one operation is 10000.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `max_runs` | integer | No | An optional positive integer indicating the maximum number of runs to restore. The maximum allowed value for max_runs is 10000. If not specified, defaults to 10000 |
| `experiment_id` | string | Yes | The ID of the experiment containing the runs to restore. This field is required |
| `min_timestamp_millis` | integer | Yes | The minimum deletion timestamp in milliseconds since the UNIX epoch for restoring runs. Only runs deleted at or after this timestamp are restored. This field is required |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
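
Since `min_timestamp_millis` is milliseconds since the UNIX epoch, a calendar cutoff has to be converted before calling the tool. A sketch with a hypothetical experiment ID and cutoff date:

```python
from datetime import datetime, timezone

# Restore every run deleted on or after 2024-01-01 UTC (hypothetical cutoff).
cutoff = datetime(2024, 1, 1, tzinfo=timezone.utc)

restore_request = {
    "experiment_id": "123456",                          # hypothetical experiment
    "min_timestamp_millis": int(cutoff.timestamp() * 1000),
    "max_runs": 500,                                    # optional; API cap is 10000
}
```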

### Search Logged Models

**Slug:** `DATABRICKS_ML_EXPERIMENTS_SEARCH_LOGGED_MODELS`

Tool to search for logged models in MLflow experiments based on various criteria. Use when you need to find models that match specific metrics, parameters, tags, or attributes using SQL-like filter expressions. Supports pagination, ordering results, and filtering by datasets.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `filter` | string | No | A filter expression over logged model info and data. Supports combining binary comparisons with AND. Example: 'params.alpha < 0.3 AND metrics.accuracy > 0.9'. Can filter by metrics (metrics.<name>), params (params.<name>), and model attributes |
| `datasets` | array | No | List of datasets on which to apply the metrics filter clauses. When specified with a metrics filter like 'metrics.accuracy > 0.9', returns logged models meeting the criteria on the specified dataset |
| `order_by` | array | No | The list of columns for ordering the results. Can order by metrics, params, or model attributes |
| `page_token` | string | No | The token indicating the page of logged models to fetch for pagination. Obtained from the next_page_token in a previous response |
| `max_results` | integer | No | The maximum number of logged models to return. The maximum limit is 50 |
| `experiment_ids` | array | No | The IDs of the experiments in which to search for logged models. Note: While optional in the schema, this parameter is effectively required by the Databricks API - requests without it will return a 400 error. Recommended to always provide at least one experiment ID |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
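
A sketch of a search request built from the parameters above. The `filter` string reuses the documented example grammar; the `order_by` entry shape (`field_name`/`ascending`) and the experiment ID are assumptions for illustration.

```python
# Hypothetical search request; experiment_ids is effectively required by the API.
search_request = {
    "experiment_ids": ["123456"],
    "filter": "params.alpha < 0.3 AND metrics.accuracy > 0.9",
    "order_by": [{"field_name": "metrics.accuracy", "ascending": False}],
    "max_results": 25,  # hard limit is 50
}
```

On a subsequent call, pass the `next_page_token` from the previous response as `page_token` to fetch the next page.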

### Set ML Experiment Tag

**Slug:** `DATABRICKS_ML_EXPERIMENTS_SET_EXPERIMENT_TAG`

Tool to set a tag on an MLflow experiment. Use when you need to add or update experiment metadata. Experiment tags are metadata that can be updated at any time.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `key` | string | Yes | Name of the tag. Maximum size depends on storage backend. All storage backends are guaranteed to support key values up to 250 bytes in size. |
| `value` | string | Yes | String value of the tag being logged. Maximum size depends on storage backend. All storage backends are guaranteed to support values up to 5000 bytes in size. |
| `experiment_id` | string | Yes | ID of the experiment under which to log the tag. Must be provided. This is the unique identifier for the MLflow experiment. |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Set Logged Model Tags

**Slug:** `DATABRICKS_ML_EXPERIMENTS_SET_LOGGED_MODEL_TAGS`

Tool to set tags on a logged model in MLflow. Use when you need to add or update metadata tags on a LoggedModel object for organization and tracking. Tags are key-value pairs that can be used to search and filter logged models. Part of MLflow 3's logged model management capabilities.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `tags` | array | Yes | Array of tags to set on the logged model. Each tag consists of a key-value pair used for organizing and tracking models throughout their lifecycle |
| `model_id` | string | Yes | The unique identifier of the logged model to set tags on. This is the model_id that is assigned when a model is logged using mlflow.<model-flavor>.log_model() |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Set ML Experiment Permissions

**Slug:** `DATABRICKS_ML_EXPERIMENTS_SET_PERMISSIONS`

Tool to set permissions for an MLflow experiment, replacing all existing permissions. Use when you need to configure access control for an experiment. This operation replaces ALL existing permissions; for incremental updates, use the update permissions endpoint instead.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `experiment_id` | string | Yes | The ID of the MLflow experiment |
| `access_control_list` | array | Yes | Array of access control entries. This replaces ALL existing permissions. For each entry, exactly one of user_name, group_name, or service_principal_name must be specified |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
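
Each ACL entry must name exactly one principal kind. A sketch with hypothetical principals and permission levels (the level names shown are common Databricks values, but verify them against your workspace):

```python
# Hypothetical ACL: this replaces ALL existing permissions on the experiment.
access_control_list = [
    {"user_name": "alice@example.com", "permission_level": "CAN_MANAGE"},
    {"group_name": "ml-team", "permission_level": "CAN_READ"},
    {"service_principal_name": "sp-prod", "permission_level": "CAN_EDIT"},
]

def valid_entry(entry):
    """Exactly one of user_name, group_name, service_principal_name per entry."""
    principals = [k for k in ("user_name", "group_name", "service_principal_name")
                  if k in entry]
    return len(principals) == 1
```

Remember this tool overwrites the full permission set; any principal omitted from the list loses access.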

### Set MLflow Run Tag

**Slug:** `DATABRICKS_ML_EXPERIMENTS_SET_TAG`

Tool to set a tag on an MLflow run. Use when you need to add custom metadata to runs for filtering, searching, and organizing experiments. Tags with the same key can be overwritten by successive writes. Logging the same tag (key, value) is idempotent.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `key` | string | Yes | Name of the tag. Maximum size is 250 bytes. All storage backends are guaranteed to support key values up to 250 bytes in size. |
| `value` | string | Yes | String value of the tag being logged. Maximum size is 5000 bytes. All storage backends are guaranteed to support values up to 5000 bytes in size. |
| `run_id` | string | Yes | ID of the run under which to log the tag. This is the unique identifier for the MLflow run. |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Update ML Experiment

**Slug:** `DATABRICKS_ML_EXPERIMENTS_UPDATE_EXPERIMENT`

Tool to update MLflow experiment metadata, primarily for renaming experiments. Use when you need to rename an existing experiment. The new experiment name must be unique across all experiments in the workspace.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `new_name` | string | Yes | New name for the experiment. Must be unique across all experiments in the workspace. This field is required by Databricks |
| `experiment_id` | string | Yes | ID of the associated experiment. This field is required |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Update ML Experiment Permissions

**Slug:** `DATABRICKS_ML_EXPERIMENTS_UPDATE_PERMISSIONS`

Tool to incrementally update permissions for an MLflow experiment. Use when you need to modify specific permissions without replacing the entire permission set. This PATCH operation updates only the specified permissions, preserving existing permissions not included in the request.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `experiment_id` | string | Yes | The ID of the MLflow experiment |
| `access_control_list` | array | Yes | Array of access control entries to update. This operation updates only the specified permissions incrementally, preserving existing permissions not included in the request. For each entry, exactly one of user_name, group_name, or service_principal_name must be specified |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Update ML Experiment Run

**Slug:** `DATABRICKS_ML_EXPERIMENTS_UPDATE_RUN`

Tool to update MLflow run metadata including status, end time, and run name. Use when a run's status changes outside normal execution flow or when you need to rename a run. This endpoint allows you to modify a run's metadata after it has been created.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `run_id` | string | Yes | ID of the run to update. Must be provided. This is the unique identifier for the run you want to update. |
| `status` | string ("RUNNING" \| "SCHEDULED" \| "FINISHED" \| "FAILED" \| "KILLED") | No | Updated status of the run. Valid values: RUNNING (1), SCHEDULED (2), FINISHED (3), FAILED (4), KILLED (5). The RunStatus enum represents the execution state of the run. |
| `end_time` | integer | No | Unix timestamp in milliseconds of when the run ended. Represents the end time in milliseconds since the UNIX epoch. |
| `run_name` | string | No | Updated name of the run. Used to rename the run for better identification. |
| `run_uuid` | string | No | Deprecated field; use run_id instead. Maintained for backwards compatibility. |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
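
A typical use is closing out a run that ended outside the normal execution flow. A sketch of building the request body, validating the status against the documented enum (the helper name is illustrative):

```python
import time

VALID_STATUSES = {"RUNNING", "SCHEDULED", "FINISHED", "FAILED", "KILLED"}

def build_update_run_payload(run_id, status=None, end_time=None, run_name=None):
    """Sketch of the request body; field names follow the table above."""
    if status is not None and status not in VALID_STATUSES:
        raise ValueError(f"invalid status: {status}")
    payload = {"run_id": run_id}
    if status is not None:
        payload["status"] = status
    if end_time is not None:
        payload["end_time"] = end_time  # milliseconds since the UNIX epoch
    if run_name is not None:
        payload["run_name"] = run_name
    return payload

p = build_update_run_payload("run-1", status="FINISHED",
                             end_time=int(time.time() * 1000))
```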

### Delete ML Feature Engineering Kafka Config

**Slug:** `DATABRICKS_ML_FEATURE_ENG_DELETE_KAFKA_CONFIG`

Tool to delete a Kafka configuration from ML Feature Engineering. Use when you need to remove Kafka streaming source configurations. The deletion is permanent and cannot be undone. Kafka configurations define how features are streamed from Kafka sources.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `id` | string | Yes | The unique identifier of the Kafka configuration to delete. This is the configuration ID that was assigned when the Kafka config was created |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Create ML Feature Store Online Store

**Slug:** `DATABRICKS_ML_FEATURE_STORE_CREATE_ONLINE_STORE`

Tool to create a Databricks Online Feature Store for real-time feature serving. Use when you need to establish serverless infrastructure for low-latency access to feature data at scale. Requires Databricks Runtime 16.4 LTS ML or above, or serverless compute.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `name` | string | Yes | Name of the online store to create. Must be unique and a valid identifier |
| `capacity` | string ("CU_1" \| "CU_2" \| "CU_4" \| "CU_8") | Yes | Compute capacity units allocated to the online store. Each capacity unit allocates approximately 16GB of RAM with associated CPU and local SSD resources |
| `read_replica_count` | integer | No | Number of read replicas for high availability. Default is 0, maximum is 2 (contact Databricks to increase limit). Scaling to zero is not supported |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
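
Capacity sizing follows from the ~16 GB-per-unit note above. A sketch of a create request with a hypothetical store name (the RAM mapping is an approximation, not an API guarantee):

```python
# Approximate RAM per capacity unit, per the description above (~16 GB per CU).
CAPACITY_RAM_GB = {"CU_1": 16, "CU_2": 32, "CU_4": 64, "CU_8": 128}

create_store_request = {
    "name": "prod_feature_store",   # hypothetical store name
    "capacity": "CU_2",
    "read_replica_count": 1,        # 0-2 without a Databricks limit increase
}
```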

### Delete ML Feature Store Online Store

**Slug:** `DATABRICKS_ML_FEATURE_STORE_DELETE_ONLINE_STORE`

Tool to delete an online store from ML Feature Store. Use when you need to remove online stores that provide low-latency feature serving infrastructure. The deletion is permanent and cannot be undone. Online stores are used for real-time feature retrieval in production ML serving.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `name` | string | Yes | The name of the online store to delete. This is the name that was assigned when the online store was created (e.g., 'my_online_store', 'prod_store_001') |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Create ML Forecasting Experiment

**Slug:** `DATABRICKS_ML_FORECASTING_CREATE_EXPERIMENT`

Tool to create a new AutoML forecasting experiment for time series prediction. Use when you need to automatically train and optimize forecasting models on time series data. The experiment will train multiple models and select the best one based on the primary metric.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `dataset` | string | Yes | The name of the dataset table in the format catalog.schema.table or schema.table for use in forecasting |
| `horizon` | integer | Yes | The number of time steps into the future to forecast. Must be a positive integer |
| `data_dir` | string | No | DBFS path where intermediate data will be stored during the AutoML run |
| `time_col` | string | Yes | The name of the column containing the time series timestamps |
| `frequency` | string | Yes | The frequency of the time series data. Common values: 'D' (daily), 'W' (weekly), 'M' (monthly), 'H' (hourly), 'T' or 'min' (minutely) |
| `target_col` | string | Yes | The name of the column containing the target values to forecast |
| `identity_col` | array | No | List of column names that identify individual time series in the dataset. Use when the dataset contains multiple time series to forecast independently |
| `primary_metric` | string | No | The primary metric to optimize during model training. Common values: 'smape' (Symmetric Mean Absolute Percentage Error), 'mse' (Mean Squared Error), 'mae' (Mean Absolute Error), 'rmse' (Root Mean Squared Error) |
| `experiment_name` | string | No | Name for the MLflow experiment that will be created to track this AutoML run |
| `timeout_minutes` | integer | No | Maximum time in minutes to run the AutoML forecasting experiment before timing out. If not specified, the experiment will run until completion |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
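
Putting the parameters together, a sketch of a daily-sales forecasting request. The table, column names, and tuning values here are hypothetical; the field names follow the table above.

```python
# Hypothetical AutoML forecasting request: forecast revenue per store,
# 30 days ahead, on a daily series.
forecast_config = {
    "dataset": "main.analytics.daily_sales",  # catalog.schema.table (hypothetical)
    "target_col": "revenue",
    "time_col": "date",
    "frequency": "D",              # daily observations
    "horizon": 30,                 # forecast 30 steps (days) into the future
    "identity_col": ["store_id"],  # one independent series per store
    "primary_metric": "smape",
    "timeout_minutes": 120,
}
```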

### Delete ML Feature Tag

**Slug:** `DATABRICKS_ML_MAT_FEATURES_DELETE_FEATURE_TAG`

Delete a metadata tag from a specific feature column in a Databricks ML Feature Store table. This operation removes the tag association from the feature but does not affect the actual feature data. The operation is idempotent: it succeeds even if the tag doesn't exist, making it safe to call multiple times. Use this when you need to remove metadata tags from feature columns, such as removing deprecated labels or cleaning up temporary tags.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `tag_name` | string | Yes | The name of the tag to delete. This operation is idempotent: deleting a non-existent tag will succeed without error |
| `feature_name` | string | Yes | The name of the specific feature column within the feature table that has the tag to be deleted |
| `feature_table_id` | string | Yes | The fully qualified name of the feature table in the format 'catalog.schema.table_name' (e.g., 'samples.nyctaxi.trips' or 'main.feature_store.user_features') |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get ML Feature Tag

**Slug:** `DATABRICKS_ML_MAT_FEATURES_GET_FEATURE_TAG`

Tool to retrieve a specific tag from a feature in a feature table in ML Feature Store. Use when you need to get metadata tag details from specific features. This operation returns the tag name and value associated with the feature.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `tag_name` | string | Yes | The name of the tag to retrieve from the feature |
| `feature_name` | string | Yes | The name of the feature within the feature table from which to retrieve the tag |
| `feature_table_id` | string | Yes | The unique identifier of the feature table. This is typically the full name of the feature table (e.g., 'samples.nyctaxi.trips') |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Set or Update ML Feature Tag

**Slug:** `DATABRICKS_ML_MAT_FEATURES_UPDATE_FEATURE_TAG`

Tool to set or update a tag on a feature in a feature table in ML Feature Store. Use when you need to add or modify metadata tags on specific features. If the tag already exists, it will be updated with the new value. If the tag doesn't exist, it will be created automatically. This operation is idempotent and can be used to ensure a tag has a specific value.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `value` | string | Yes | The new value to set for the tag |
| `tag_name` | string | Yes | The name of the tag to update on the feature |
| `feature_name` | string | Yes | The name of the feature within the feature table whose tag should be updated |
| `feature_table_id` | string | Yes | The unique identifier of the feature table. This is typically the full name of the feature table (e.g., 'samples.nyctaxi.trips') |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get ML Model Registry Permission Levels

**Slug:** `DATABRICKS_ML_MODEL_REGISTRY_GET_PERM_LEVELS`

Retrieves the list of available permission levels that can be assigned to users or groups for a Databricks ML registered model. This endpoint returns metadata about what permission levels are available (e.g., CAN_READ, CAN_EDIT, CAN_MANAGE, CAN_MANAGE_PRODUCTION_VERSIONS, CAN_MANAGE_STAGING_VERSIONS) with descriptions of what each level grants. Use this to understand permission options before setting or updating model permissions. The returned permission levels are standard across all registered models. Note: This returns the available permission level definitions, not the actual permissions currently assigned to the model. To view actual permissions, use the get permissions endpoint.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `registered_model_id` | string | Yes | The three-level (fully qualified) name of the registered model in Unity Catalog, formatted as catalog.schema.model_name. This identifier is used to reference the registered model across the workspace. |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Delete OAuth2 Service Principal Secret

**Slug:** `DATABRICKS_OAUTH2_SERVICE_PRINCIPAL_SECRETS_DELETE`

Tool to delete an OAuth secret from a service principal at the account level. Use when you need to revoke OAuth credentials for service principal authentication. Once deleted, applications or scripts using tokens generated from that secret will no longer be able to access Databricks APIs.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `secret_id` | string | Yes | The unique identifier of the OAuth secret to delete. Once deleted, applications using tokens generated from this secret will no longer be able to access Databricks APIs |
| `service_principal_id` | string | Yes | The unique identifier of the service principal whose secret will be deleted |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Create OAuth Service Principal Secret

**Slug:** `DATABRICKS_OAUTH2_SERVICE_PRINCIPAL_SECRETS_PROXY_CREATE`

Tool to create an OAuth secret for service principal authentication. Use when you need to obtain OAuth access tokens for accessing Databricks Accounts and Workspace APIs. A service principal can have up to five OAuth secrets, each valid for up to two years (730 days). The secret value is only shown once upon creation.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `id` | string | Yes | The service principal ID (SCIM ID, not application ID) |
| `lifetime` | string | No | The lifetime of the secret in seconds, formatted as 'NNNNs' (e.g., '63072000s' for 730 days). If not provided, defaults to 730 days (63072000 seconds). Maximum allowed is 730 days |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
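
The `lifetime` string is a duration in seconds with an `s` suffix, so a calendar lifetime must be converted before the call. A sketch (the helper name is illustrative); 730 days × 24 h × 3600 s = 63,072,000 s, matching the documented maximum:

```python
def lifetime_for_days(days):
    """Format a secret lifetime in days as the 'NNNNs' string the tool expects."""
    if days > 730:
        raise ValueError("maximum secret lifetime is 730 days")
    return f"{days * 24 * 3600}s"
```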

### Delete Databricks Pipeline

**Slug:** `DATABRICKS_PIPELINES_PIPELINES_DELETE`

Tool to delete a Databricks Delta Live Tables pipeline permanently and stop any active updates. Use when you need to remove a pipeline completely. If the pipeline publishes to Unity Catalog, deletion will cascade to all pipeline tables. This action cannot be easily undone without Databricks support assistance.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `pipeline_id` | string | Yes | The unique identifier of the pipeline to delete |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get Pipeline Permissions

**Slug:** `DATABRICKS_PIPELINES_PIPELINES_GET_PERMISSIONS`

Tool to retrieve permissions for a Databricks Delta Live Tables pipeline. Use when you need to check who has access to a pipeline and their permission levels. Returns the complete permissions information including access control lists with user, group, and service principal permissions.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `pipeline_id` | string | Yes | The unique identifier of the pipeline for which to retrieve permissions |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get Pipeline Permission Levels

**Slug:** `DATABRICKS_PIPELINES_PIPELINES_GET_PERM_LEVELS`

Tool to retrieve available permission levels for a Databricks Delta Live Tables pipeline. Use when you need to understand what permission levels can be assigned to users or groups for a specific pipeline. Returns permission levels like CAN_VIEW, CAN_RUN, CAN_MANAGE, and IS_OWNER with their descriptions.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `pipeline_id` | string | Yes | The unique identifier of the pipeline for which to retrieve available permission levels |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### List Pipeline Updates

**Slug:** `DATABRICKS_PIPELINES_PIPELINES_LIST_UPDATES`

Tool to retrieve a paginated list of updates for a Databricks Delta Live Tables pipeline. Use when you need to view the update history for a specific pipeline. Returns information about each update including state, creation time, and configuration details such as full refresh and table selection.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `page_token` | string | No | Opaque pagination token returned by a previous call to fetch the next page of results. |
| `max_results` | integer | No | Maximum number of updates to return in a single page. Use for pagination control. |
| `pipeline_id` | string | Yes | The unique identifier of the pipeline for which to retrieve updates |
| `until_update_id` | string | No | If present, returns updates until and including this update_id. Useful for getting updates up to a specific point. |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Update Pipeline Permissions

**Slug:** `DATABRICKS_PIPELINES_PIPELINES_UPDATE_PERMISSIONS`

Tool to incrementally update permissions on a Databricks pipeline. Use when you need to modify specific permissions without replacing the entire permission set. This PATCH operation updates only the specified permissions, preserving existing permissions not included in the request. Pipelines can inherit permissions from their root object.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `pipeline_id` | string | Yes | Identifier for the pipeline requiring permission modifications |
| `access_control_list` | array | Yes | List of access control entries to update. This operation updates only the specified permissions incrementally, preserving existing permissions not included in the request. For each entry, exactly one of user_name, group_name, or service_principal_name must be specified |
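A sketch of what an `access_control_list` payload might look like. The entry shape is an assumption based on the parameter descriptions above; the permission level names (CAN_VIEW, CAN_RUN, CAN_MANAGE, IS_OWNER) come from the "Get Pipeline Permission Levels" tool, and all IDs and principals are placeholders:

```python
# Hypothetical request body for DATABRICKS_PIPELINES_PIPELINES_UPDATE_PERMISSIONS.
# Each access-control entry names exactly one principal (user_name, group_name,
# or service_principal_name) plus a permission level.

payload = {
    "pipeline_id": "1234-example-pipeline-id",  # placeholder pipeline ID
    "access_control_list": [
        {"user_name": "ada@example.com", "permission_level": "CAN_MANAGE"},
        {"group_name": "data-engineers", "permission_level": "CAN_RUN"},
    ],
}

# Sanity check: each entry sets exactly one of the three principal fields.
principals = ("user_name", "group_name", "service_principal_name")
for entry in payload["access_control_list"]:
    assert sum(k in entry for k in principals) == 1
```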

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Create Quality Monitor V2

**Slug:** `DATABRICKS_QUALITYMONITORV2_QUALITY_MONITOR_V2_CREATE`

Tool to create a quality monitor for a Unity Catalog table. Use when you need to set up monitoring for data quality metrics, track drift over time, or monitor ML inference logs. Monitor creation is asynchronous; the dashboard and metric tables take 8-20 minutes to complete. Exactly one monitor type (snapshot, time_series, or inference_log) must be specified.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `schedule` | object | No | Cron schedule configuration for automatic monitor refresh. |
| `snapshot` | object | No | Snapshot table monitoring configuration. |
| `assets_dir` | string | Yes | Absolute workspace path for storing monitoring assets including dashboard and metric tables (e.g., /Workspace/Users/{email}/monitoring/) |
| `table_name` | string | Yes | Full name of Unity Catalog table to monitor in format catalog.schema.table_name (case insensitive, spaces disallowed) |
| `time_series` | object | No | Time series monitoring configuration. |
| `warehouse_id` | string | No | Warehouse ID for dashboard query execution |
| `inference_log` | object | No | Inference log monitoring configuration for ML models. |
| `notifications` | object | No | Notification settings for quality monitor alerts. |
| `slicing_exprs` | array | No | Column expressions for slicing data to enable targeted analysis on data segments |
| `custom_metrics` | array | No | User-defined custom quality metrics for specialized monitoring needs |
| `output_schema_name` | string | Yes | Schema where output metric tables are created in 2-level format {catalog}.{schema} |
| `baseline_table_name` | string | No | Baseline table name for computing drift metrics in format catalog.schema.table_name |
| `skip_builtin_dashboard` | boolean | No | If true, skips creation of default quality metrics dashboard |
| `data_classification_config` | object | No | Data classification configuration for sensitive data detection. |
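To illustrate the "exactly one monitor type" constraint, here is a minimal snapshot-monitor request sketched under the parameter descriptions above. The table, path, and schema names are placeholders, and the empty `snapshot` object is an assumption (the exact sub-fields are not documented here):

```python
# Minimal sketch of a snapshot-type request for
# DATABRICKS_QUALITYMONITORV2_QUALITY_MONITOR_V2_CREATE.

request = {
    "table_name": "main.sales.orders",                         # catalog.schema.table_name
    "assets_dir": "/Workspace/Users/ada@example.com/monitoring/",  # absolute workspace path
    "output_schema_name": "main.monitoring",                   # {catalog}.{schema}
    "snapshot": {},  # snapshot monitor; sub-fields assumed empty here
}

# Exactly one of the three monitor types may be present.
monitor_types = ("snapshot", "time_series", "inference_log")
assert sum(t in request for t in monitor_types) == 1
```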

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Search MLflow Experiments

**Slug:** `DATABRICKS_SEARCH_MLFLOW_EXPERIMENTS`

Tool to search for MLflow experiments with filtering, ordering, and pagination support. Use when you need to find experiments based on name patterns, tags, or other criteria. Supports SQL-like filtering expressions and ordering by experiment attributes.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `filter` | string | No | SQL-like expression for filtering experiments using operators (=, !=, LIKE, ILIKE) against experiment attributes and tags. Example: "name LIKE 'test-%' AND tags.key = 'value'" |
| `order_by` | array | No | List of columns for ordering search results, which can include experiment name and id with an optional 'DESC' or 'ASC' annotation. Example: ['name DESC', 'experiment_id ASC'] |
| `view_type` | string (`ACTIVE_ONLY` \| `DELETED_ONLY` \| `ALL`) | No | Qualifier for type of experiments to be returned. ACTIVE_ONLY returns only active experiments (default), DELETED_ONLY returns only deleted experiments, ALL returns all experiments regardless of deletion status. |
| `page_token` | string | No | Token indicating the page of experiments to fetch. Use the next_page_token from a previous response to retrieve the next page of results. |
| `max_results` | integer | No | Maximum number of experiments desired. Servers may select a desired default max_results value. If not specified, the server will choose a default limit. |
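Putting the filter and ordering syntax from the table together, a request might look like the sketch below. The tag key and result limit are placeholders:

```python
# Illustrative request body for DATABRICKS_SEARCH_MLFLOW_EXPERIMENTS, using the
# SQL-like filter and order_by syntax described above.

request = {
    "filter": "name LIKE 'test-%' AND tags.team = 'mlops'",  # placeholder tag key
    "order_by": ["name DESC", "experiment_id ASC"],
    "view_type": "ACTIVE_ONLY",
    "max_results": 100,
}
```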

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Search MLflow Runs

**Slug:** `DATABRICKS_SEARCH_MLFLOW_RUNS`

Tool to search for MLflow runs with filtering, ordering, and pagination support. Use when you need to find runs based on metrics, parameters, tags, or other criteria. Supports complex filter expressions with operators like =, !=, >, >=, <, <= for metrics, params, and tags.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `filter` | string | No | Filter expression for searching runs. Supports params, metrics, and tags with operators: =, !=, >, >=, <, <=. Special characters in column names require double quotes or backticks. Example: "metrics.rmse < 1 and params.model_class = 'LogisticRegression'". Multiple expressions can be joined with AND (no OR support). |
| `order_by` | array | No | List of columns to be ordered by, including attributes, params, metrics, and tags with an optional DESC or ASC annotation, where ASC is the default. Example: ["params.input DESC", "metrics.rmse"]. When no ordering is specified, tiebreaks default to start_time DESC followed by run_id. |
| `page_token` | string | No | Pagination token for retrieving subsequent result pages. Use the next_page_token from a previous response to retrieve the next page of results. |
| `max_results` | integer | No | Maximum number of runs desired. If unspecified, defaults to 1000. All servers are guaranteed to support a max_results threshold of at least 50,000. |
| `run_view_type` | string (`ACTIVE_ONLY` \| `DELETED_ONLY` \| `ALL`) | No | Whether to display only active, only deleted, or all runs. Defaults to ACTIVE_ONLY if not specified. ACTIVE_ONLY shows runs that are not marked for deletion, DELETED_ONLY shows runs marked for deletion, ALL shows all runs regardless of deletion status. |
| `experiment_ids` | array | Yes | List of experiment IDs to search over. At least one experiment ID must be provided. |
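A run-search request combining these parameters might look like the sketch below. Note the filter joins clauses with AND only (the table above states OR is unsupported), and `experiment_ids` is required; the experiment ID is a placeholder:

```python
# Illustrative request body for DATABRICKS_SEARCH_MLFLOW_RUNS, using the filter
# example from the parameter table above.

request = {
    "experiment_ids": ["1234567890"],  # placeholder; at least one ID is required
    "filter": "metrics.rmse < 1 and params.model_class = 'LogisticRegression'",
    "order_by": ["metrics.rmse ASC"],
    "run_view_type": "ACTIVE_ONLY",
    "max_results": 1000,  # documented default when unspecified
}
```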

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Create Provisioned Throughput Endpoint

**Slug:** `DATABRICKS_SERVING_SERVING_ENDPOINTS_CREATE_PROV_THPUT`

Tool to create a provisioned throughput serving endpoint for AI models in Databricks. Use when you need to provision model units for production GenAI applications with guaranteed throughput. The endpoint name must be unique across the workspace and can consist of alphanumeric characters, dashes, and underscores. Returns a long-running operation that completes when the endpoint is ready.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `name` | string | Yes | The name of the serving endpoint. Must be unique across the Databricks workspace. Can consist of alphanumeric characters, dashes, and underscores |
| `tags` | array | No | Tags to be attached to the serving endpoint and automatically propagated to billing logs |
| `config` | object | Yes | The core configuration of the PT serving endpoint |
| `ai_gateway` | object | No | The AI Gateway configuration for the serving endpoint. |
| `budget_policy_id` | string | No | The budget policy associated with the endpoint |
| `email_notifications` | object | No | Email notification settings. |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Delete Serving Endpoint

**Slug:** `DATABRICKS_SERVING_SERVING_ENDPOINTS_DELETE`

Tool to delete a model serving endpoint and all associated data. Use when you need to permanently remove an endpoint. Deletion is permanent and cannot be undone. This operation disables usage and deletes all data associated with the endpoint.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `name` | string | Yes | The name of the serving endpoint to delete. This is the user-defined name assigned when the endpoint was created |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get Serving Endpoint Details

**Slug:** `DATABRICKS_SERVING_SERVING_ENDPOINTS_GET`

Retrieves comprehensive details about a specific Databricks serving endpoint by name. Use this action to:

- Get endpoint status and readiness state (READY, NOT_READY, UPDATE_FAILED)
- View served model/entity configurations, including foundation model details
- Check endpoint capabilities (function calling, image input, long context, reasoning)
- Retrieve pricing information (input/output token costs in DBUs)
- Inspect AI Gateway settings (usage tracking, safety/PII guardrails)
- Review traffic routing and workload configurations
- Verify permission levels and access controls

Returns detailed information including endpoint type (FOUNDATION_MODEL_API or CUSTOM_MODEL_API), task type (llm/v1/chat, llm/v1/completions, llm/v1/embeddings), model documentation links, and configuration versions.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `name` | string | Yes | The name of the serving endpoint to retrieve |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get Serving Endpoint OpenAPI Spec

**Slug:** `DATABRICKS_SERVING_SERVING_ENDPOINTS_GET_OPEN_API`

Tool to retrieve the OpenAPI 3.1.0 specification for a serving endpoint. Use when you need to understand the endpoint's schema, generate client code, or visualize the API structure. The endpoint must be in a READY state and the served model must have a model signature logged.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `name` | string | Yes | The name of the serving endpoint to retrieve the OpenAPI specification for |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get Serving Endpoint Permission Levels

**Slug:** `DATABRICKS_SERVING_SERVING_ENDPOINTS_GET_PERM_LEVELS`

Tool to retrieve available permission levels for a Databricks serving endpoint. Use when you need to understand what permission levels can be assigned to users or groups for access control. Returns permission levels like CAN_MANAGE, CAN_QUERY, and CAN_VIEW with their descriptions.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `id` | string | Yes | The unique identifier of the serving endpoint |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Update Serving Endpoint Rate Limits

**Slug:** `DATABRICKS_SERVING_SERVING_ENDPOINTS_PUT`

Tool to update rate limits for a Databricks serving endpoint. Use when you need to control the number of API calls allowed within a time period. Note: This endpoint is deprecated; consider using AI Gateway for rate limit management instead.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `name` | string | Yes | The name of the serving endpoint whose rate limits are being updated |
| `rate_limits` | array | No | List of rate limit configurations for the endpoint. Each configuration specifies calls allowed per renewal period |
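A hedged sketch of a `rate_limits` payload. The per-entry fields (`calls`, `key`, `renewal_period`) follow the Databricks serving rate-limit schema as understood here; treat the exact field names and allowed values as assumptions to verify against the API reference:

```python
# Hypothetical request body for DATABRICKS_SERVING_SERVING_ENDPOINTS_PUT.
# Endpoint name and limit values are placeholders.

request = {
    "name": "my-serving-endpoint",  # placeholder endpoint name
    "rate_limits": [
        # assumed schema: calls allowed per renewal period, scoped by key
        {"calls": 100, "key": "user", "renewal_period": "minute"},
    ],
}
```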

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Update Serving Endpoint AI Gateway

**Slug:** `DATABRICKS_SERVING_SERVING_ENDPOINTS_PUT_AI_GATEWAY`

Tool to update AI Gateway configuration of a Databricks serving endpoint. Use when you need to configure traffic fallback, AI guardrails, payload logging, rate limits, or usage tracking. Supports external model, provisioned throughput, and pay-per-token endpoints; agent endpoints currently only support inference tables.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `id` | string | Yes | The name of the serving endpoint whose AI Gateway is being updated |
| `guardrails` | object | No | Configuration for AI Guardrails to prevent unwanted data in requests and responses. |
| `rate_limits` | array | No | Configuration for rate limits to control endpoint traffic and prevent overuse |
| `fallback_config` | object | No | Configuration for traffic fallback, which automatically falls back to other served entities. |
| `usage_tracking_config` | object | No | Configuration to enable usage tracking using system tables. |
| `inference_table_config` | object | No | Configuration for payload logging using inference tables. |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Delete AI/BI Dashboard Embedding Access Policy

**Slug:** `DATABRICKS_SETTINGS_AIBI_DASH_EMBEDDING_ACCESS_POLICY_DELETE`

Tool to delete AI/BI dashboard embedding access policy, reverting to default. Use when you need to remove the workspace-level policy for AI/BI published dashboard embedding. Upon deletion, the workspace reverts to the default setting (ALLOW_APPROVED_DOMAINS), conditionally permitting AI/BI dashboards to be embedded on approved domains.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `etag` | string | No | Used for versioning and optimistic concurrency control. The etag helps prevent simultaneous writes from overwriting each other. It is strongly recommended to use the etag in a read-delete pattern: first make a GET request to retrieve the current etag, then pass it with the DELETE request to identify the specific rule set version being deleted. If the setting is updated or deleted concurrently, the DELETE fails with a 409 status code, requiring retry with the fresh etag from the 409 response |
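The read-delete etag pattern described above can be sketched as a small retry loop. The `get_setting`/`delete_setting` helpers below are hypothetical stand-ins for the actual GET and DELETE calls, injected as parameters so the flow is self-contained:

```python
# Sketch of the recommended read-delete pattern with 409 retry, assuming
# hypothetical get_setting()/delete_setting() callables for the API requests.

def delete_with_etag_retry(get_setting, delete_setting, max_attempts=3):
    """GET the current etag, DELETE with it, and retry on a 409 conflict."""
    etag = get_setting()["etag"]
    for _ in range(max_attempts):
        status, body = delete_setting(etag=etag)
        if status != 409:
            return status
        etag = body["etag"]  # the 409 response carries the fresh etag
    raise RuntimeError("could not delete setting: repeated concurrent updates")
```

The same pattern applies to the other etag-guarded delete and update tools in this section.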

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get AI/BI Dashboard Embedding Access Policy

**Slug:** `DATABRICKS_SETTINGS_AIBI_DASH_EMBEDDING_ACCESS_POLICY_GET`

Tool to retrieve workspace AI/BI dashboard embedding access policy setting. Use when you need to check whether AI/BI published dashboard embedding is enabled, conditionally enabled, or disabled. The default setting is ALLOW_APPROVED_DOMAINS which permits AI/BI dashboards to be embedded on approved domains.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `etag` | string | No | Version identifier for optimistic concurrency control. The response will be at least as fresh as the provided etag. Used to help prevent simultaneous writes from overwriting each other. |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Update AI/BI Dashboard Embedding Access Policy

**Slug:** `DATABRICKS_SETTINGS_AIBI_DASH_EMBEDDING_ACCESS_POLICY_UPDATE`

Tool to update AI/BI dashboard embedding workspace access policy at the workspace level. Use when you need to control whether AI/BI published dashboard embedding is enabled, conditionally enabled, or disabled. Follows read-modify-write workflow with etag-based optimistic concurrency control to prevent race conditions.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `setting` | object | Yes | The configuration object containing the policy settings to apply. Must include access_policy_type and optionally etag for concurrency control. |
| `field_mask` | string | Yes | Specifies which fields are being updated in the request. Must be a single string with multiple fields separated by commas (no spaces). Supports dot notation to navigate nested fields (e.g., 'aibi_dashboard_embedding_access_policy.access_policy_type'). |
| `allow_missing` | boolean | No | Should always be set to true for Settings API. Added for AIP (Google API Improvement Proposals) compliance. |
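Combining these parameters, a read-modify-write update request might look like the sketch below. The nested setting structure is inferred from the `field_mask` example in the table above; the etag value is a placeholder:

```python
# Illustrative request body for
# DATABRICKS_SETTINGS_AIBI_DASH_EMBEDDING_ACCESS_POLICY_UPDATE.

request = {
    "allow_missing": True,  # should always be true for the Settings API
    "field_mask": "aibi_dashboard_embedding_access_policy.access_policy_type",
    "setting": {
        "aibi_dashboard_embedding_access_policy": {
            "access_policy_type": "ALLOW_APPROVED_DOMAINS",  # the documented default
        },
        "etag": "etag-from-a-prior-GET",  # placeholder concurrency token
    },
}
```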

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get AI/BI Dashboard Embedding Approved Domains

**Slug:** `DATABRICKS_SETTINGS_AIBI_DASH_EMBEDDING_APPROVED_DOMAINS_GET`

Tool to retrieve the list of domains approved to host embedded AI/BI dashboards. Use when you need to check which external domains are permitted to embed AI/BI dashboards. The approved domains list cannot be modified unless the workspace access policy is set to ALLOW_APPROVED_DOMAINS.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `etag` | string | No | Version identifier for optimistic concurrency control. The response will be at least as fresh as the provided etag. Used to help prevent simultaneous writes from overwriting each other. |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Delete AI/BI Dashboard Embedding Approved Domains

**Slug:** `DATABRICKS_SETTINGS_AIBI_DASH_EMBEDDING_DOMAINS_DELETE`

Tool to delete the list of approved domains for AI/BI dashboard embedding, reverting to default. Use when you need to remove the workspace-level approved domains list for hosting embedded AI/BI dashboards. Upon deletion, the workspace reverts to an empty approved domains list. The approved domains list cannot be modified when the current access policy is not configured to ALLOW_APPROVED_DOMAINS.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `etag` | string | Yes | Version identifier for optimistic concurrency control. The etag prevents simultaneous writes from overwriting each other. Required: You must use the etag in a read-delete pattern: first make a GET request to retrieve the current etag, then pass it with the DELETE request to identify the specific version being deleted. If the setting is updated or deleted concurrently, the DELETE fails with HTTP 409 Conflict status, requiring retry with the fresh etag from the 409 response |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Update AI/BI Dashboard Embedding Approved Domains

**Slug:** `DATABRICKS_SETTINGS_AIBI_DASH_EMBEDDING_DOMAINS_UPDATE`

Tool to update the list of domains approved to host embedded AI/BI dashboards at the workspace level. Use when you need to modify the approved domains list. The approved domains list can only be modified when the current access policy is set to ALLOW_APPROVED_DOMAINS.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `setting` | object | Yes | The setting configuration object containing the values to update |
| `field_mask` | string | Yes | Comma-separated list of field paths specifying which fields to update. Uses dot notation (.) to navigate sub-fields. Example: 'aibi_dashboard_embedding_approved_domains.approved_domains'. Using '*' indicates full replacement, though explicit field listing is recommended |
| `allow_missing` | boolean | No | Should always be set to true for Settings API. Added for AIP compliance. Determines behavior when the setting doesn't exist - if true, creates the setting (upsert behavior); if false, returns an error |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get Automatic Cluster Update Setting

**Slug:** `DATABRICKS_SETTINGS_AUTOMATIC_CLUSTER_UPDATE_GET`

Tool to retrieve automatic cluster update setting for the workspace. Use when you need to check whether automatic cluster updates are enabled, view maintenance window configuration, or get restart behavior settings. This setting controls whether clusters automatically update during maintenance windows. Currently in Public Preview.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `etag` | string | No | Used for versioning to ensure the response is at least as fresh as the eTag provided. Used for optimistic concurrency control to help prevent simultaneous writes from overwriting each other. Must be a valid etag value returned from a previous API response |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Update Automatic Cluster Update Setting

**Slug:** `DATABRICKS_SETTINGS_AUTOMATIC_CLUSTER_UPDATE_UPDATE`

Tool to update workspace automatic cluster update configuration with etag-based concurrency control. Use when you need to enable/disable automatic cluster updates, configure maintenance windows, or adjust restart behavior. Requires the Enhanced Security and Compliance SKU entitlement and admin access. If the setting is updated concurrently, the PATCH request fails with HTTP 409, requiring a retry with a fresh etag. Note: The enabled field can only be modified if the workspace has the necessary entitlement; other fields like maintenance_window and restart_even_if_no_updates_available can be configured regardless.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `setting` | object | Yes | The setting configuration object containing automatic cluster update values to update |
| `field_mask` | string | Yes | Comma-separated list of field names to update. Use dot notation for nested fields (e.g., 'automatic_cluster_update_workspace.enabled'). Can use '*' for full replacement, but explicit field specification is recommended |
| `allow_missing` | boolean | No | Should always be set to true for Settings API. Added for AIP compliance. Allows the update to proceed even if the setting doesn't exist yet |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get Compliance Security Profile Setting

**Slug:** `DATABRICKS_SETTINGS_COMPL_SECURITY_PROFILE_GET`

Tool to retrieve workspace compliance security profile setting. Use when you need to check whether CSP is enabled or view configured compliance standards. The CSP enables additional monitoring, enforced instance types for inter-node encryption, hardened compute images, and other security controls. Once enabled, this setting represents a permanent workspace change that cannot be disabled.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `etag` | string | No | Version identifier for optimistic concurrency control. Ensures responses are at least as fresh as the provided value, helping prevent simultaneous overwrites |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Delete Dashboard Email Subscriptions Setting

**Slug:** `DATABRICKS_SETTINGS_DASH_EMAIL_SUBS_DELETE`

Tool to delete the dashboard email subscriptions setting, reverting to default value. Use when you need to revert the workspace setting that controls whether schedules or workload tasks for refreshing AI/BI Dashboards can send subscription emails. Upon deletion, the setting reverts to its default value (enabled/true). This is a workspace-level setting.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `etag` | string | No | Version identifier for optimistic concurrency control. The etag helps prevent simultaneous writes from overwriting each other. It is strongly recommended to use the etag in a read-delete pattern: first make a GET request to retrieve the current etag, then pass it with the DELETE request to identify the specific version being deleted. If the setting is updated or deleted concurrently, the DELETE fails with HTTP 409 Conflict status, requiring retry with the fresh etag from the 409 response |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
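The read-delete pattern described above can be sketched as follows. `execute_tool` is a hypothetical dispatcher standing in for however you invoke these tools (e.g. via an SDK), and the result shape (`successful`/`data`/`error` plus a `status` field on conflict) is an assumption for illustration only.

```python
def delete_with_etag(execute_tool, max_attempts=3):
    """Read-delete pattern: GET the current etag, then DELETE with it.

    On a concurrent modification the DELETE fails with HTTP 409; the 409
    response carries a fresh etag, so retry with that value.
    """
    # 1. Read the setting first to obtain the current etag.
    current = execute_tool("DATABRICKS_SETTINGS_DASH_EMAIL_SUBS_GET", {})
    etag = current["data"]["etag"]

    for _ in range(max_attempts):
        result = execute_tool(
            "DATABRICKS_SETTINGS_DASH_EMAIL_SUBS_DELETE", {"etag": etag}
        )
        if result["successful"]:
            return result
        # 2. On 409 the conflict response includes a fresh etag; retry with it.
        if result.get("status") == 409:
            etag = result["error"]["etag"]
            continue
        raise RuntimeError(result["error"])
    raise RuntimeError("gave up after repeated 409 conflicts")
```

The same GET-before-write flow applies to every delete tool in this section that accepts an `etag` parameter.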

### Get Dashboard Email Subscriptions Setting

**Slug:** `DATABRICKS_SETTINGS_DASH_EMAIL_SUBS_GET`

Tool to retrieve dashboard email subscriptions setting for the workspace. Use when you need to check whether schedules or workload tasks for refreshing AI/BI Dashboards can send subscription emails. By default, this setting is enabled.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `etag` | string | No | Version identifier used for optimistic concurrency control. The response is at least as fresh as the provided etag, which helps prevent simultaneous writes of a setting from overwriting each other. It is strongly recommended to obtain the etag via a GET request before any write in order to get current versioning information. |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Update Dashboard Email Subscriptions Setting

**Slug:** `DATABRICKS_SETTINGS_DASH_EMAIL_SUBS_UPDATE`

Tool to update the Dashboard Email Subscriptions setting for the workspace with etag-based concurrency control. Use when you need to enable or disable whether dashboard schedules can send subscription emails. If the setting is updated concurrently, the PATCH request fails with HTTP 409, requiring a retry with a fresh etag.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `setting` | object | Yes | The setting configuration object containing dashboard email subscriptions values to update |
| `field_mask` | string | Yes | Comma-separated list of field names to update (no spaces). Use dot notation for nested fields. Example: 'boolean_val.value' |
| `allow_missing` | boolean | No | Should always be set to true for Settings API. Allows the update to proceed even if the setting doesn't exist yet |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
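A minimal sketch of assembling the update inputs. The `setting` shape (a `boolean_val.value` field) is inferred from the `field_mask` example in the table above; the exact field names come from the Databricks Settings API and should be verified against a GET response for this setting.

```python
def build_dash_email_update(enabled, etag=None):
    """Assemble input parameters for DATABRICKS_SETTINGS_DASH_EMAIL_SUBS_UPDATE.

    The setting shape (boolean_val.value) follows the field_mask example in
    the table above; include the etag from a prior GET when you have one.
    """
    setting = {"boolean_val": {"value": enabled}}
    if etag is not None:
        setting["etag"] = etag
    return {
        "setting": setting,
        # Dot notation, comma-separated, no spaces.
        "field_mask": "boolean_val.value",
        # Always true for the Settings API.
        "allow_missing": True,
    }
```

The same three-parameter shape (`setting`, `field_mask`, `allow_missing`) recurs in every update tool in this section.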

### Delete Default Namespace Setting

**Slug:** `DATABRICKS_SETTINGS_DEFAULT_NAMESPACE_DELETE`

Tool to delete the default namespace setting for the workspace, removing the default catalog configuration. Use when you need to remove the default catalog used for queries without fully qualified names. It is strongly recommended to use the etag in a read-delete pattern to prevent concurrent-modification conflicts (HTTP 409).

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `etag` | string | No | Version identifier used for optimistic concurrency control. Helps prevent simultaneous writes from overwriting each other. Should be obtained from a preceding GET request. If concurrent modification occurs, the request fails with HTTP 409 Conflict status, requiring retry with the fresh etag from the 409 response |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get Default Namespace Setting

**Slug:** `DATABRICKS_SETTINGS_DEFAULT_NAMESPACE_GET`

Tool to retrieve the default catalog namespace setting for the workspace. Use when you need to check which catalog is used for unqualified table references in Unity Catalog-enabled compute. Changes to this setting require restart of clusters and SQL warehouses to take effect.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `etag` | string | No | Version identifier used for optimistic concurrency control. The response will be at least as fresh as the provided eTag value. Used to ensure you're working with the latest version of the setting |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Update Default Namespace Setting

**Slug:** `DATABRICKS_SETTINGS_DEFAULT_NAMESPACE_UPDATE`

Tool to update the default catalog namespace configuration for workspace queries with etag-based concurrency control. Use when you need to configure the default catalog used for queries without fully qualified three-level names. Requires a restart of clusters and SQL warehouses to take effect, and applies only to Unity Catalog-enabled compute. If concurrent updates occur, the request fails with HTTP 409, requiring a retry with a fresh etag.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `setting` | object | Yes | The setting configuration object containing default namespace values to update. Must include etag obtained from a GET request |
| `field_mask` | string | Yes | Field mask in dot notation specifying which fields to update. Supports comma-separated values (no spaces). Use '*' for full replacement. Field names must exactly match the resource field names |
| `allow_missing` | boolean | No | Should always be set to true for Settings API. Added for AIP compliance. Allows the update to proceed even if the setting doesn't exist yet |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Delete Default Warehouse ID Setting

**Slug:** `DATABRICKS_SETTINGS_DEFAULT_WAREHOUSE_ID_DELETE`

Tool to delete the default warehouse ID setting for the workspace, reverting it to the default state. Use when you need to remove the default SQL warehouse configuration. It is strongly recommended to use the etag in a read-delete pattern to prevent concurrent-modification conflicts (HTTP 409).

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `etag` | string | No | Version identifier used for optimistic concurrency control. Helps prevent simultaneous writes from overwriting each other. Should be obtained from a preceding GET request. If concurrent modification occurs, the request fails with HTTP 409 Conflict status, requiring retry with the fresh etag from the 409 response |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get Default Warehouse ID Setting

**Slug:** `DATABRICKS_SETTINGS_DEFAULT_WAREHOUSE_ID_GET`

Tool to retrieve the default SQL warehouse ID setting for the workspace. Use when you need to check which warehouse is configured as the default for SQL authoring surfaces, AI/BI dashboards, Genie, Alerts, and Catalog Explorer.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `etag` | string | No | Version identifier used for optimistic concurrency control. The response will be at least as fresh as the provided eTag value. Used to ensure you're working with the latest version of the setting |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Update Default Warehouse ID Setting

**Slug:** `DATABRICKS_SETTINGS_DEFAULT_WAREHOUSE_ID_UPDATE`

Tool to update the default SQL warehouse configuration for the workspace with etag-based concurrency control. Use when you need to configure which warehouse is used as the default for SQL operations and queries in the workspace. If concurrent updates occur, the request fails with HTTP 409, requiring a retry with a fresh etag.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `setting` | object | Yes | The setting configuration object containing default warehouse ID values to update. Must include etag obtained from a GET request |
| `field_mask` | string | Yes | Field mask in dot notation specifying which fields to update. Supports comma-separated values (no spaces). Use '*' for full replacement. Field names must exactly match the resource field names |
| `allow_missing` | boolean | No | Should always be set to true for Settings API. Added for AIP compliance. Allows the update to proceed even if the setting doesn't exist yet |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Delete Disable Legacy Access Setting

**Slug:** `DATABRICKS_SETTINGS_DISABLE_LEGACY_ACCESS_DELETE`

Tool to delete the disable legacy access workspace setting, re-enabling legacy features. Use when you need to revert to allowing legacy Databricks features. It is strongly recommended to use the etag in a read-delete pattern to prevent concurrent-modification conflicts (HTTP 409). Changes take up to 5 minutes to apply and require a cluster/warehouse restart.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `etag` | string | No | Version identifier used for optimistic concurrency control. Helps prevent simultaneous writes from overwriting each other. Should be obtained from a preceding GET request. If concurrent modification occurs, the request fails with HTTP 409 Conflict status, requiring retry with the fresh etag from the 409 response |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get Disable Legacy Access Setting

**Slug:** `DATABRICKS_SETTINGS_DISABLE_LEGACY_ACCESS_GET`

Tool to retrieve the disable legacy access workspace setting. Use when you need to check whether legacy feature access is disabled, including direct Hive Metastore access, Fallback Mode on external locations, and Databricks Runtime versions prior to 13.3 LTS.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `etag` | string | No | Version identifier used for optimistic concurrency control. The response will be at least as fresh as the provided eTag value. Used to ensure you're working with the latest version of the setting |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Update Disable Legacy Access Setting

**Slug:** `DATABRICKS_SETTINGS_DISABLE_LEGACY_ACCESS_UPDATE`

Tool to enable the workspace disable legacy access setting with optional etag-based concurrency control. Use when you need to enable (not disable) restrictions on legacy features, including direct Hive Metastore access, external location fallback mode, and Databricks Runtime versions prior to 13.3 LTS. Note: this setting can only be set to true (enabled), not false. The etag parameter is optional but recommended to prevent concurrent-modification conflicts.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `setting` | object | Yes | The setting configuration object containing disable legacy access values to update. The etag field is optional but recommended for concurrency control |
| `field_mask` | string | Yes | Comma-separated list of field names to update (no spaces). Use dot notation for nested fields (e.g., 'disable_legacy_access.value'). Explicitly list fields instead of using '*' wildcards |
| `allow_missing` | boolean | No | Must always be set to true for Settings API. Added for AIP compliance. Allows the update to proceed even if the setting doesn't exist yet |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Delete Disable Legacy DBFS Setting

**Slug:** `DATABRICKS_SETTINGS_DISABLE_LEGACY_DBFS_DELETE`

Tool to delete the disable legacy DBFS workspace setting, reverting to the default DBFS access behavior. Use when you need to re-enable access to the legacy DBFS root and mounts. It is strongly recommended to use the etag in a read-delete pattern to prevent concurrent-modification conflicts (HTTP 409).

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `etag` | string | No | Version identifier used for optimistic concurrency control. Helps prevent simultaneous writes from overwriting each other. Should be obtained from a preceding GET request. If concurrent modification occurs, the request fails with HTTP 409 Conflict status, requiring retry with the fresh etag from the 409 response |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get Disable Legacy DBFS Setting

**Slug:** `DATABRICKS_SETTINGS_DISABLE_LEGACY_DBFS_GET`

Tool to retrieve the disable legacy DBFS workspace setting. Use when you need to check whether legacy DBFS root and mount access is disabled across all interfaces (UI, APIs, CLI, FUSE). When enabled, this setting also disables Databricks Runtime versions prior to 13.3 LTS and requires manual restart of compute clusters and SQL warehouses to take effect.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `etag` | string | No | Version identifier used for optimistic concurrency control. The response will be at least as fresh as the provided eTag value. Used to ensure you're working with the latest version of the setting |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Update Disable Legacy DBFS Setting

**Slug:** `DATABRICKS_SETTINGS_DISABLE_LEGACY_DBFS_UPDATE`

Tool to update the workspace disable legacy DBFS setting with etag-based concurrency control. Use when you need to enable or disable legacy DBFS features, including DBFS root access, mounts, and legacy Databricks Runtime versions prior to 13.3 LTS. Changes can take up to 20 minutes to apply and require a manual restart of compute clusters and SQL warehouses.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `setting` | object | Yes | The setting configuration object containing disable legacy DBFS values to update. Include etag from GET request for concurrency control. The bool_val field contains the DisableLegacyDbfs configuration |
| `field_mask` | string | Yes | Comma-separated list of field names to update (no spaces). Use dot notation for nested fields (e.g., 'disable_legacy_dbfs.value'). Explicitly list fields instead of using '*' wildcards |
| `allow_missing` | boolean | No | Must always be set to true for Settings API. Added for AIP compliance. Allows the update to proceed even if the setting doesn't exist yet |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get Enable Export Notebook Setting

**Slug:** `DATABRICKS_SETTINGS_ENABLE_EXPORT_NOTEBOOK_GET_ENABLE`

Tool to retrieve workspace setting controlling notebook export functionality. Use when you need to check whether users can export notebooks and files from the Workspace UI. Administrators use this setting to manage data exfiltration controls.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `etag` | string | No | Version identifier used for optimistic concurrency control. The response will be at least as fresh as the provided etag value. |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Update Enable Export Notebook

**Slug:** `DATABRICKS_SETTINGS_ENABLE_EXPORT_NOTEBOOK_PATCH_ENABLE`

Tool to update workspace notebook and file export setting. Use when you need to enable or disable users' ability to export notebooks and files from the Workspace UI. Requires admin access.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `setting` | object | Yes | The setting configuration object containing enable export notebook values to update |
| `field_mask` | string | Yes | Comma-separated list of field names to update. Use dot notation for nested fields (e.g., 'boolean_val.value'). Can use '*' for full replacement, but explicit field specification is recommended |
| `allow_missing` | boolean | No | Should always be set to true for Settings API. Added for AIP compliance. Allows the update to proceed even if the setting doesn't exist yet |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get Enable Notebook Table Clipboard Setting

**Slug:** `DATABRICKS_SETTINGS_ENABLE_NOTEBOOK_TABLE_CLIPBOARD_GET`

Tool to retrieve notebook table clipboard setting for the workspace. Use when you need to check whether notebook table clipboard functionality is enabled. This setting controls whether users can copy data from tables in notebooks to their clipboard.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `etag` | string | No | Version identifier used for optimistic concurrency control. The response will be at least as fresh as the provided etag value. Used to help prevent simultaneous writes from overwriting each other. |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Update Enable Notebook Table Clipboard

**Slug:** `DATABRICKS_SETTINGS_ENABLE_NOTEBOOK_TABLE_CLIPBOARD_PATCH`

Tool to update workspace setting for notebook table clipboard. Use when you need to enable or disable users' ability to copy tabular data from notebook result tables to clipboard. Requires workspace admin privileges.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `setting` | object | Yes | The setting configuration object containing enable notebook table clipboard values to update |
| `field_mask` | string | Yes | Comma-separated list of field names to update. Use dot notation for nested fields. Must be 'boolean_val.value' to update the setting value. Can use '*' for full replacement, but explicit field specification is recommended |
| `allow_missing` | boolean | No | Should always be set to true for Settings API. Added for AIP compliance. Allows the update to proceed even if the setting doesn't exist yet |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get Enable Results Downloading Setting

**Slug:** `DATABRICKS_SETTINGS_ENABLE_RESULTS_DOWNLOADING_GET_ENABLE`

Tool to retrieve workspace setting controlling notebook results download functionality. Use when you need to check whether users can download notebook query results. Requires workspace administrator privileges to access.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `etag` | string | No | Version identifier used for optimistic concurrency control. The response will be at least as fresh as the provided etag value. |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Update Enable Results Downloading

**Slug:** `DATABRICKS_SETTINGS_ENABLE_RESULTS_DOWNLOADING_PATCH`

Tool to update workspace notebook results download setting. Use when you need to enable or disable users' ability to download notebook results. Requires admin access.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `setting` | object | Yes | The setting configuration object containing enable results downloading values to update |
| `field_mask` | string | Yes | Comma-separated list of field names to update. Use dot notation for nested fields. Use 'boolean_val.value' to update only the boolean value, or 'boolean_val' to update the entire boolean_val object. The 'setting.' prefix should NOT be included in the field mask |
| `allow_missing` | boolean | No | Should always be set to true for Settings API. Added for AIP compliance. Allows the update to proceed even if the setting doesn't exist yet |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get Enhanced Security Monitoring Setting

**Slug:** `DATABRICKS_SETTINGS_ENHANCED_SECURITY_MONITORING_GET`

Tool to retrieve enhanced security monitoring workspace setting. Use when you need to check whether Enhanced Security Monitoring is enabled for the workspace. Enhanced Security Monitoring provides a hardened disk image and additional security monitoring agents. It is automatically enabled when compliance security profile is active, and can be manually toggled when compliance security profile is disabled.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `etag` | string | No | Used for versioning to ensure the response is at least as fresh as the eTag provided. Used for optimistic concurrency control to help prevent simultaneous writes from overwriting each other. Must be a valid base64-encoded etag string returned from a previous GET request |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Update Enhanced Security Monitoring

**Slug:** `DATABRICKS_SETTINGS_ENHANCED_SECURITY_MONITORING_UPDATE`

Tool to update enhanced security monitoring workspace settings. Use when you need to enable or disable Enhanced Security Monitoring (ESM) for the workspace. Requires the etag from a previous GET request for optimistic concurrency control.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `setting` | object | Yes | The enhanced security monitoring setting configuration object. |
| `field_mask` | string | Yes | Comma-separated list of field paths to update (no spaces). Supports dot notation for nested fields (e.g., 'enhanced_security_monitoring_workspace.is_enabled'). Cannot specify individual sequence or map elements. Use '*' for full replacement. |
| `allow_missing` | boolean | Yes | This should always be set to true for Settings API. Added for AIP compliance. |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Create IP Access List

**Slug:** `DATABRICKS_SETTINGS_IP_ACCESS_LISTS_CREATE`

Tool to create a new IP access list for workspace access control. Use when you need to allow or block specific IP addresses and CIDR ranges from accessing the Databricks workspace. The API will reject creation if the resulting list would block the caller's current IP address. Changes may take a few minutes to take effect.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `label` | string | Yes | Label for the IP access list. This cannot be empty. A descriptive name for the IP access list (e.g., 'office', 'vpn-users') |
| `enabled` | boolean | No | Specifies whether this list is enabled. Pass true or false. Defaults to true if not specified |
| `list_type` | string ("ALLOW" \| "BLOCK") | Yes | Type of IP access list. 'ALLOW' creates an allow list to include IP addresses/ranges. 'BLOCK' creates a block list to exclude IP addresses/ranges (IPs in a block list are excluded even if included in an allow list) |
| `ip_addresses` | array | No | A JSON array of IP addresses and CIDR ranges as string values. Example: ['1.1.1.0/24', '2.2.2.2/32']. Maximum of 1000 IP/CIDR values combined across all allow and block lists. Only IPv4 addresses are supported |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
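The documented constraints on `ip_addresses` (IPv4 only, at most 1000 combined values) can be checked client-side before calling the tool. This sketch uses Python's stdlib `ipaddress` module; the helper name is illustrative.

```python
import ipaddress

def validate_ip_addresses(values, max_entries=1000):
    """Client-side check mirroring the documented constraints:
    only IPv4 addresses/CIDR ranges, at most 1000 values in total.
    Returns the normalized list or raises ValueError.
    """
    if len(values) > max_entries:
        raise ValueError(f"at most {max_entries} IP/CIDR values allowed")
    normalized = []
    for value in values:
        # strict=False also accepts host addresses such as '2.2.2.2/32'.
        net = ipaddress.ip_network(value, strict=False)
        if net.version != 4:
            raise ValueError(f"only IPv4 is supported: {value!r}")
        normalized.append(str(net))
    return normalized
```

Note that bare addresses are normalized to /32 networks by `ip_network`, which matches how single IPs are conventionally expressed in CIDR lists.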

### Get IP Access List

**Slug:** `DATABRICKS_SETTINGS_IP_ACCESS_LISTS_GET`

Tool to retrieve details of a specific IP access list by its ID. Use when you need to view the configuration of allowed or blocked IP addresses and subnets for accessing the workspace or workspace-level APIs. Requires workspace admin privileges.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `ip_access_list_id` | string | Yes | The unique identifier (UUID) for the IP access list to retrieve |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get LLM Proxy Partner Powered Setting

**Slug:** `DATABRICKS_SETTINGS_LLM_PROXY_PARTNER_POWERED_WORKSPACE_GET`

Tool to retrieve workspace-level setting that controls whether partner-powered AI features are enabled. Use when you need to check if features like Databricks Assistant, Genie, and Data Science Agent can use models hosted by partner providers (Azure OpenAI or Anthropic). By default, this setting is enabled for non-CSP workspaces.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `etag` | string | No | Version identifier for optimistic concurrency control to prevent simultaneous writes from overwriting each other. Should be retrieved from GET and passed with subsequent PATCH requests |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Delete LLM Proxy Partner Powered Setting

**Slug:** `DATABRICKS_SETTINGS_LLM_PROXY_PARTNER_WORKSPACE_DELETE`

Tool to delete (revert to default) the partner-powered AI features workspace setting. Use when you need to revert the workspace to the default configuration for AI features powered by partner providers. By default, this setting is enabled for workspaces without a compliance security profile. It is strongly recommended to use the etag in a read-delete pattern to prevent concurrent-modification conflicts (HTTP 409).

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `etag` | string | No | Version identifier used for optimistic concurrency control. Helps prevent simultaneous writes from overwriting each other. Should be obtained from a preceding GET request. If concurrent modification occurs, the request fails with HTTP 409 Conflict status, requiring retry with the fresh etag from the 409 response |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Update LLM Proxy Partner Powered Setting

**Slug:** `DATABRICKS_SETTINGS_LLM_PROXY_PARTNER_WORKSPACE_UPDATE`

Tool to update the workspace-level setting that controls whether AI features are powered by partner-hosted models. Use to enable or disable partner-powered AI features (Azure OpenAI or Anthropic on Databricks). When disabled, Databricks-hosted models are used instead. IMPORTANT: first call DATABRICKS_SETTINGS_LLM_PROXY_PARTNER_POWERED_WORKSPACE_GET to obtain the current etag. If concurrent updates occur, the request fails with HTTP 409; retry with the fresh etag from the 409 response.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `setting` | object | Yes | The setting configuration object containing LLM proxy partner powered values to update. Must include etag obtained from a GET request |
| `field_mask` | string | Yes | Comma-separated list of fields to update (no spaces). Field paths are relative to the setting object using dot notation for sub-fields. Use 'boolean_val.value' to update just the boolean value (recommended), or 'boolean_val' to replace the entire object. A field mask of '*' indicates full replacement |
| `allow_missing` | boolean | No | Should always be set to true for Settings API. Added for AIP compliance. Allows the setting to be created if it doesn't exist yet |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
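
The read-then-update flow above can be sketched as a payload. This is a minimal illustration based on the parameter table; the nested `boolean_val` shape inside `setting` is an assumption drawn from the Databricks Settings API convention, and the etag value is a placeholder.

```python
# Etag obtained from a preceding GET call (hypothetical placeholder value).
current_etag = "example-etag-0001"

# Payload sketch for the update tool. The 'boolean_val' wrapper is an
# assumed shape; 'field_mask' and 'allow_missing' follow the table above.
payload = {
    "setting": {
        "etag": current_etag,               # required for concurrency control
        "boolean_val": {"value": False},    # disable partner-powered AI features
    },
    "field_mask": "boolean_val.value",      # update only the boolean value
    "allow_missing": True,                  # always true for the Settings API
}
```

On a 409 response, re-run the GET, take the fresh etag, and resubmit the same payload with it.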

### Create Notification Destination

**Slug:** `DATABRICKS_SETTINGS_NOTIF_DESTS_CREATE`

Tool to create a notification destination for alerts and jobs. Use when you need to set up destinations for sending notifications outside of Databricks (email, Slack, PagerDuty, Microsoft Teams, or webhooks). Only workspace admins can create notification destinations. Requires HTTPS for webhooks with SSL certificates signed by a trusted certificate authority.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `config` | object | Yes | The configuration for the notification destination. Must contain exactly one destination type (email, slack, microsoft_teams, pagerduty, or generic_webhook) |
| `display_name` | string | No | The display name for the notification destination |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
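
A request sketch for a Slack destination, assuming the `config.slack.url` field naming used by the Databricks notification-destinations API; the webhook URL and display name are placeholders. The key constraint from the table above is that `config` contains exactly one destination type.

```python
# Sketch of a create request: 'config' must contain exactly one
# destination type (email, slack, microsoft_teams, pagerduty, or
# generic_webhook). The Slack webhook URL here is a placeholder.
payload = {
    "display_name": "Data team alerts",
    "config": {
        "slack": {"url": "https://hooks.slack.com/services/T000/B000/XXXX"},
    },
}

# Enforce the exactly-one-destination-type rule before sending.
assert len(payload["config"]) == 1
```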

### Delete Notification Destination

**Slug:** `DATABRICKS_SETTINGS_NOTIF_DESTS_DELETE`

Tool to delete a notification destination from the Databricks workspace. Use when you need to permanently remove a notification destination. Only workspace administrators have permission to perform this delete operation.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `id` | string | Yes | The UUID identifying the notification destination to delete |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Update Notification Destination

**Slug:** `DATABRICKS_SETTINGS_NOTIF_DESTS_UPDATE`

Tool to update an existing notification destination configuration. Use when you need to modify display name or configuration settings for email, Slack, PagerDuty, Microsoft Teams, or webhook destinations. Requires workspace admin permissions. At least one field (display_name or config) must be provided.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `id` | string | Yes | UUID identifying the notification destination to update |
| `config` | object | No | Configuration for notification destination update. Must contain exactly one destination type. |
| `display_name` | string | No | The display name for the notification destination |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get Notification Destination

**Slug:** `DATABRICKS_SETTINGS_NOTIFICATION_DESTINATIONS_GET`

Tool to retrieve details of a notification destination by its UUID identifier. Use when you need to get configuration details, display name, and type information for a specific notification destination. Only users with workspace admin permissions will see the full configuration details.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `id` | string | Yes | The UUID identifying the notification destination to retrieve |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Delete Restrict Workspace Admins Setting

**Slug:** `DATABRICKS_SETTINGS_RESTRICT_WORKSPACE_ADMINS_DELETE`

Tool to delete/revert the restrict workspace admins setting to its default state. Use when you need to restore default workspace administrator capabilities for service principal token creation and job ownership settings. It is strongly recommended to use an etag in a read-delete pattern to prevent concurrent modification conflicts (HTTP 409).

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `etag` | string | No | Version identifier used for optimistic concurrency control. Helps prevent simultaneous writes from overwriting each other. Should be obtained from a preceding GET request. If concurrent modification occurs, the request fails with HTTP 409 Conflict status, requiring retry with the fresh etag from the 409 response |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get Restrict Workspace Admins Setting

**Slug:** `DATABRICKS_SETTINGS_RESTRICT_WORKSPACE_ADMINS_GET`

Tool to retrieve the restrict workspace admins setting for the workspace. Use when you need to check whether workspace administrators are restricted in their ability to create service principal tokens, change job owners, or modify job run_as settings. This setting controls security boundaries for admin privileges.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `etag` | string | No | Used for versioning to ensure the response is at least as fresh as the eTag provided. Used for optimistic concurrency control to help prevent simultaneous writes from overwriting each other |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Update Restrict Workspace Admins Setting

**Slug:** `DATABRICKS_SETTINGS_RESTRICT_WORKSPACE_ADMINS_UPDATE`

Tool to update the restrict workspace admins setting with etag-based concurrency control. Use when you need to modify workspace administrator capabilities for service principal token creation and job ownership/run-as settings. Requires account admin permissions and workspace membership. If concurrent updates occur, the request fails with HTTP 409 requiring retry with fresh etag.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `setting` | object | Yes | The setting configuration object containing restrict workspace admins values to update |
| `field_mask` | string | Yes | Comma-separated list of field names to update. Use dot notation for nested fields. No spaces between fields |
| `allow_missing` | boolean | No | Should always be set to true for Settings API. Added for AIP compliance |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
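
A payload sketch for this update, following the etag/field_mask pattern from the parameter table. The `restrict_workspace_admins.status` field and its `RESTRICT_TOKENS_AND_JOB_RUN_AS` value follow the Databricks Settings API; treat both names as assumptions here, and the etag as a placeholder.

```python
# Payload sketch: restrict admins' ability to create service principal
# tokens and change job run_as settings. Field and enum names are
# assumed from the Databricks Settings API; etag is a placeholder.
payload = {
    "setting": {
        "etag": "example-etag-0002",  # from a preceding GET request
        "restrict_workspace_admins": {"status": "RESTRICT_TOKENS_AND_JOB_RUN_AS"},
    },
    "field_mask": "restrict_workspace_admins.status",  # dot notation, no spaces
    "allow_missing": True,
}
```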

### Delete SQL Results Download Setting

**Slug:** `DATABRICKS_SETTINGS_SQL_RESULTS_DOWNLOAD_DELETE`

Tool to delete the SQL results download workspace setting, reverting to the default state where users are permitted to download results. Use when you need to restore the factory default configuration. It is strongly recommended to use an etag in a read-delete pattern to prevent concurrent modification conflicts (HTTP 409).

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `etag` | string | No | Version identifier used for optimistic concurrency control. Helps prevent simultaneous writes from overwriting each other. Should be obtained from a preceding GET request. If concurrent modification occurs, the request fails with HTTP 409 Conflict status, requiring retry with the fresh etag from the 409 response |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get SQL Results Download Setting

**Slug:** `DATABRICKS_SETTINGS_SQL_RESULTS_DOWNLOAD_GET`

Tool to retrieve SQL results download workspace setting. Use when you need to check whether users within the workspace are allowed to download results from the SQL Editor and AI/BI Dashboards UIs. By default, this setting is enabled (set to true). Returns etag for use in subsequent update/delete operations.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `etag` | string | No | Version identifier for optimistic concurrency control. Used to ensure the response is at least as fresh as the provided eTag. Recommended for read-then-delete patterns to prevent race conditions |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Update SQL Results Download Setting

**Slug:** `DATABRICKS_SETTINGS_SQL_RESULTS_DOWNLOAD_UPDATE`

Tool to update workspace SQL results download setting controlling whether users can download results from SQL Editor and AI/BI Dashboards. Use when you need to enable or disable SQL query results download capability. Requires workspace admin access and uses etag-based optimistic concurrency control to prevent conflicting updates.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `setting` | object | Yes | The setting configuration object containing SQL results download values to update |
| `field_mask` | string | Yes | Comma-separated list of field names to update. Use dot notation for nested fields (e.g., 'boolean_val.value'). Can use '*' for full replacement, but explicit field specification is recommended |
| `allow_missing` | boolean | No | Should always be set to true for Settings API. Added for AIP compliance. Allows the update to proceed even if the setting doesn't exist yet |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Delete Token via Token Management

**Slug:** `DATABRICKS_SETTINGS_TOKEN_MANAGEMENT_DELETE`

Tool to delete a token specified by ID via token management. Use when you need to revoke or remove access tokens. Admins can delete tokens for any user.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `token_id` | string | Yes | The ID of the token to delete/revoke. Admins can delete tokens for any user |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get Token Information

**Slug:** `DATABRICKS_SETTINGS_TOKEN_MANAGEMENT_GET`

Tool to retrieve detailed information about a specific token by its ID from the token management system. Use when you need to get token metadata including creation time, expiry, owner, and usage information. Requires appropriate permissions to access token information.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `token_id` | string | Yes | The ID of the token to retrieve. This is the unique identifier for the token in the token management system |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get Token Management Permission Levels

**Slug:** `DATABRICKS_SETTINGS_TOKEN_MGMT_GET_PERM_LEVELS`

Tool to retrieve available permission levels for personal access token management. Use when you need to understand what permission levels can be assigned for managing tokens in the workspace. Returns permission levels like CAN_USE and CAN_MANAGE with their descriptions.

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get Token Management Permissions

**Slug:** `DATABRICKS_SETTINGS_TOKEN_MGMT_GET_PERMS`

Tool to retrieve permissions for workspace token management. Use when you need to check which users, groups, and service principals have permissions to create and manage personal access tokens. Requires workspace admin privileges and is available only in Databricks Premium plan.

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Set Token Management Permissions

**Slug:** `DATABRICKS_SETTINGS_TOKEN_MGMT_SET_PERMS`

Tool to set permissions for personal access token management, replacing all existing permissions. Use when configuring which users, groups, and service principals can create and use tokens. This operation replaces ALL existing permissions; if you need to add or modify permissions without replacing existing ones, use the update_permissions method instead. Workspace admins always retain CAN_MANAGE permissions.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `access_control_list` | array | Yes | Array of access control entries. This replaces ALL existing permissions. For each entry, only one of user_name, group_name, or service_principal_name should be specified. If none are specified, all direct permissions are deleted |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
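
A sketch of the `access_control_list` this tool expects, per the description above: the list replaces ALL existing permissions, and each entry names exactly one principal. The user and group names are examples; the `permission_level` values `CAN_USE` and `CAN_MANAGE` come from the permission-levels tool documented earlier.

```python
# Sketch of an access control list that REPLACES all existing token
# permissions. Principal names are illustrative examples.
access_control_list = [
    {"user_name": "analyst@example.com", "permission_level": "CAN_USE"},
    {"group_name": "platform-admins", "permission_level": "CAN_MANAGE"},
]

# Each entry must name exactly one of user_name / group_name /
# service_principal_name.
for entry in access_control_list:
    principals = [k for k in entry
                  if k in ("user_name", "group_name", "service_principal_name")]
    assert len(principals) == 1
```

Remember that workspace admins always retain CAN_MANAGE regardless of what this list contains; to modify permissions without replacing the rest, use the incremental update tool instead.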

### Update Token Management Permissions

**Slug:** `DATABRICKS_SETTINGS_TOKEN_MGMT_UPDATE_PERMS`

Tool to incrementally update permissions for personal access token management. Use when you need to modify who can create and use personal access tokens. This PATCH operation updates only the specified permissions, preserving existing permissions not included in the request.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `access_control_list` | array | Yes | List of access control entries to update. This operation updates only the specified permissions incrementally, preserving existing permissions not included in the request. For each entry, exactly one of user_name, group_name, or service_principal_name must be specified |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Create Personal Access Token

**Slug:** `DATABRICKS_SETTINGS_TOKENS_CREATE`

Tool to create a personal access token (PAT) for Databricks API authentication. Use when you need to generate a new token for REST API requests. Each PAT is valid for only one workspace. Users can create up to 600 PATs per workspace. Databricks automatically revokes PATs that haven't been used for 90 days.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `comment` | string | No | A comment or description to attach to the token for identification purposes. This will appear on the user's settings page |
| `lifetime_seconds` | integer | No | The lifetime of the token in seconds. If not specified or set to 0, the token will not expire (or will use the maximum allowed by workspace configuration). Example: 7776000 for 90 days |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
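
The `lifetime_seconds` arithmetic from the parameter table, as a quick sketch; the comment text is an example.

```python
# The documented example lifetime: 90 days expressed in seconds.
days = 90
lifetime_seconds = days * 24 * 60 * 60   # 90 * 86400 = 7,776,000

payload = {
    "comment": "CI pipeline token",       # appears on the user's settings page
    "lifetime_seconds": lifetime_seconds,
}
```

Note that even a non-expiring token (lifetime omitted or 0) is still auto-revoked by Databricks after 90 days of inactivity.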

### Get Public Workspace Setting

**Slug:** `DATABRICKS_SETTINGSV2_WORKSPACE_SETTINGS_V2_GET_PUBLIC`

Retrieves the current configuration of a workspace-level setting. Use this to get workspace settings such as admin restrictions, automatic cluster updates, default SQL namespaces, security profiles, and monitoring configurations. The response includes an etag for version control, enabling safe concurrent updates. Common use cases include checking current security settings, verifying cluster update policies, or retrieving namespace defaults before making changes.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `id` | string | Yes | The setting identifier path in the format 'types/{setting_type}/names/{setting_name}'. Common setting types: 'restrict_workspace_admins' (admin restrictions), 'automatic_cluster_update' (cluster auto-updates), 'default_namespace' (SQL default catalog/schema), 'compliance_security_profile' (security settings), 'enhanced_security_monitoring' (monitoring settings). The setting name is typically 'default' |
| `etag` | string | No | Optional version identifier for cache freshness. When provided, guarantees the response is at least as fresh as this etag. Use this to ensure you're not reading stale data |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
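
Constructing the `id` path from the format given in the parameter table is mechanical; the setting type below is one of the common types listed above, and the name is the typical `default`.

```python
# Build the setting identifier path:
# 'types/{setting_type}/names/{setting_name}'.
setting_type = "restrict_workspace_admins"  # one of the documented common types
setting_name = "default"                    # the setting name is typically 'default'
setting_id = f"types/{setting_type}/names/{setting_name}"
```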

### Set Workspace Configuration Status

**Slug:** `DATABRICKS_SETTINGS_WORKSPACE_CONF_SET_STATUS`

Set workspace-level configuration settings for a Databricks workspace. Commonly used to configure the maximum token lifetime for personal access tokens (maxTokenLifetimeDays). Updates are applied immediately and affect workspace behavior. Requires workspace admin permissions. Invalid or unsupported configuration keys will return a 400 Bad Request error.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `configurations` | object | Yes | Dictionary of workspace configuration key-value pairs to set. All values must be strings. The most commonly used key is 'maxTokenLifetimeDays' (e.g., '90', '120', '60') which sets the maximum lifetime in days for new personal access tokens. Other workspace configuration keys depend on the workspace tier and features enabled. Invalid keys will result in a 400 Bad Request error. Note: This action updates workspace-level settings and requires admin permissions. |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
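
The one subtlety in the `configurations` parameter is that every value must be a string, even when it represents a number:

```python
# Workspace configuration sketch: values are strings, not integers.
configurations = {"maxTokenLifetimeDays": "90"}

# Sending an integer instead of a string would be rejected.
assert all(isinstance(v, str) for v in configurations.values())
```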

### Create Sharing Provider

**Slug:** `DATABRICKS_SHARING_PROVIDERS_CREATE`

Tool to create a new authentication provider in Unity Catalog for Delta Sharing. Use when establishing a provider object for receiving data from external sources that aren't Unity Catalog-enabled. Requires metastore admin privileges or CREATE_PROVIDER permission on the metastore. Most recipients should not need to create provider objects manually as they are typically auto-created during Delta Sharing.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `name` | string | Yes | The name of the provider. Must be unique within the metastore |
| `comment` | string | No | Description about the provider for documentation purposes |
| `authentication_type` | string ("TOKEN" \| "DATABRICKS" \| "OAUTH_CLIENT_CREDENTIALS" \| "OIDC_FEDERATION") | Yes | Authentication mechanism for the provider. TOKEN for Delta Sharing open protocol, DATABRICKS for Databricks-to-Databricks sharing, OAUTH_CLIENT_CREDENTIALS for OAuth, OIDC_FEDERATION for OpenID Connect |
| `recipient_profile_str` | string | No | JSON string containing Delta Sharing credentials (required for TOKEN, OAUTH_CLIENT_CREDENTIALS authentication types). Should include shareCredentialsVersion, endpoint, and bearerToken. For DATABRICKS authentication, this is the sharing identifier in format `<cloud>:<region>:<metastore-uuid>` |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
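
A common stumbling point with this tool is that `recipient_profile_str` is a serialized JSON *string*, not a nested object. A sketch for TOKEN authentication, with the field names from the parameter table and placeholder endpoint/token values:

```python
import json

# Delta Sharing profile fields per the parameter table above;
# endpoint and bearer token are placeholders.
profile = {
    "shareCredentialsVersion": 1,
    "endpoint": "https://sharing.example.com/delta-sharing/",
    "bearerToken": "dapiXXXXXXXX",
}

payload = {
    "name": "external_supplier",            # must be unique in the metastore
    "authentication_type": "TOKEN",
    "recipient_profile_str": json.dumps(profile),  # serialized string, not a dict
}
```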

### Get Sharing Provider

**Slug:** `DATABRICKS_SHARING_PROVIDERS_GET`

Tool to retrieve information about a specific Delta Sharing provider in Unity Catalog. Use when you need to get provider details including authentication type, ownership, and connection information. Requires metastore admin privileges or provider ownership.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `name` | string | Yes | The name of the provider to retrieve |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Update Sharing Provider

**Slug:** `DATABRICKS_SHARING_PROVIDERS_UPDATE`

Tool to update an existing Delta Sharing authentication provider in Unity Catalog. Use when you need to modify provider properties like comment, owner, or name. The caller must be either a metastore admin or the owner of the provider. To rename the provider, the caller must be BOTH a metastore admin AND the owner.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `name` | string | Yes | The current name of the provider to update. This is used in the path parameter to identify the provider |
| `owner` | string | No | Username of the new provider owner. Must be the current provider owner to change this |
| `comment` | string | No | Description or comment about the provider |
| `new_name` | string | No | New name for the provider if renaming. Requires both metastore admin and owner privileges |
| `recipient_profile_str` | string | No | JSON string containing recipient profile configuration. Required when authentication_type is TOKEN, OAUTH_CLIENT_CREDENTIALS, or not provided. Format: {"shareCredentialsVersion":1,"bearerToken":"<token>","endpoint":"<delta-sharing-endpoint-url>"} |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Create Sharing Recipient

**Slug:** `DATABRICKS_SHARING_RECIPIENTS_CREATE`

Tool to create a Delta Sharing recipient in Unity Catalog metastore. Use when you need to create a recipient object representing an identity who will consume shared data. Recipients can be configured for Databricks-to-Databricks sharing or open sharing with token authentication. Requires metastore admin or CREATE_RECIPIENT privilege.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `name` | string | Yes | Name of the recipient. Must be unique within the metastore |
| `owner` | string | No | Username/groupname/service principal application_id of the recipient owner |
| `comment` | string | No | Description about the recipient |
| `sharing_code` | string | No | The one-time sharing code provided by the data recipient. Used with DATABRICKS authentication |
| `ip_access_list` | object | No | IP access list configuration for recipient. |
| `expiration_time` | integer | No | Expiration timestamp of the token in epoch milliseconds. Used with TOKEN authentication |
| `properties_kvpairs` | object | No | Recipient properties object. |
| `authentication_type` | string ("DATABRICKS" \| "OAUTH_CLIENT_CREDENTIALS" \| "OIDC_FEDERATION" \| "TOKEN") | Yes | The Delta Sharing authentication type |
| `data_recipient_global_metastore_id` | string | No | Required when authentication_type is DATABRICKS. The global Unity Catalog metastore id provided by the data recipient |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
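
For Databricks-to-Databricks sharing, the parameter table requires `data_recipient_global_metastore_id`, which the provider tool above describes as having the form cloud:region:metastore-uuid. A minimal sketch with placeholder values:

```python
# Databricks-to-Databricks recipient sketch. The global metastore id
# follows the cloud:region:metastore-uuid format; this one is a
# placeholder, as is the recipient name.
payload = {
    "name": "partner_workspace",
    "authentication_type": "DATABRICKS",
    "data_recipient_global_metastore_id":
        "aws:us-west-2:00000000-0000-0000-0000-000000000000",
}

# Required whenever authentication_type is DATABRICKS.
assert "data_recipient_global_metastore_id" in payload
```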

### Delete Sharing Recipient

**Slug:** `DATABRICKS_SHARING_RECIPIENTS_DELETE`

Tool to delete a Delta Sharing recipient from Unity Catalog metastore. Use when you need to permanently remove a recipient object. Deletion invalidates all access tokens and immediately revokes access to shared data for users represented by the recipient. Requires recipient owner privileges.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `name` | string | Yes | The name of the recipient to delete. The caller must be the owner of the recipient |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get Sharing Recipient

**Slug:** `DATABRICKS_SHARING_RECIPIENTS_GET`

Tool to retrieve a Delta Sharing recipient from Unity Catalog metastore by name. Use when you need to get information about a recipient object representing an entity that receives shared data. Requires recipient ownership or metastore admin privileges.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `name` | string | Yes | Name of the recipient to retrieve |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Create Share

**Slug:** `DATABRICKS_SHARING_SHARES_CREATE`

Tool to create a new share for data objects in Unity Catalog. Use when you need to establish a share for distributing data assets via Delta Sharing protocol. Data objects can be added after creation with update. Requires metastore admin or CREATE_SHARE privilege on the metastore.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `name` | string | Yes | Name of the share. Must be unique within the metastore |
| `comment` | string | No | User-provided free-form text description of the share |
| `storage_root` | string | No | Storage root URL for the share (e.g., s3://bucket/path, abfss://container@storage.dfs.core.windows.net/path) |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
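The create call takes a flat JSON body built from the parameters above. A minimal sketch of assembling that body, omitting unset optional fields (the bucket path and share name below are hypothetical):

```python
def build_create_share_payload(name, comment=None, storage_root=None):
    """Build the JSON body for DATABRICKS_SHARING_SHARES_CREATE,
    omitting optional fields that are not set."""
    if not name:
        raise ValueError("name is required and must be unique within the metastore")
    payload = {"name": name}
    if comment is not None:
        payload["comment"] = comment
    if storage_root is not None:
        payload["storage_root"] = storage_root
    return payload

body = build_create_share_payload(
    "quarterly_metrics",
    comment="Shared metrics tables for partners",
    storage_root="s3://example-bucket/shares/quarterly_metrics",  # hypothetical bucket
)
```

Remember that data objects cannot be passed here; add them afterwards with the update tool.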

### Delete Share

**Slug:** `DATABRICKS_SHARING_SHARES_DELETE`

Tool to delete a Unity Catalog share from the metastore. Use when you need to permanently remove a share object. Deletion immediately revokes recipient access to the shared data. This operation is permanent and requires share owner privileges.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `name` | string | Yes | The name of the share to delete. The caller must be an owner of the share. This is the unique identifier for the share object in the Unity Catalog metastore |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get Share Details

**Slug:** `DATABRICKS_SHARING_SHARES_GET`

Tool to retrieve details of a specific share from Unity Catalog. Use when you need to get information about a share including its metadata, owner, and optionally the list of shared data objects. Requires metastore admin privileges or share ownership.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `name` | string | Yes | The name of the share to retrieve. Must be unique within the metastore |
| `include_shared_data` | boolean | No | When set to true, the response includes the list of shared data objects within the share |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get Share Permissions

**Slug:** `DATABRICKS_SHARING_SHARES_SHARE_PERMISSIONS`

Tool to retrieve permissions for a Delta Sharing share from Unity Catalog. Use when you need to check which principals have been granted privileges on a share. Requires metastore admin privileges or share ownership.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `name` | string | Yes | The name of the share. This is the unique identifier for the Delta Sharing share in the Unity Catalog metastore |
| `page_token` | string | No | Opaque pagination token to retrieve the next page of results based on a previous query |
| `max_results` | integer | No | Maximum number of permissions to return. When set to 0, uses server-configured page length; when greater than 0, uses minimum of this value and server-configured value |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
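The `page_token`/`max_results` pair drives standard token-based pagination. A sketch of a loop that collects every page, with one call to this tool stubbed out as a `fetch_page` callable (the `privilege_assignments` and `next_page_token` response field names are assumptions):

```python
def list_all_permissions(fetch_page, max_results=100):
    """Collect all permission entries by following page tokens.

    `fetch_page(page_token, max_results)` stands in for one call to
    DATABRICKS_SHARING_SHARES_SHARE_PERMISSIONS; it is assumed to return
    a dict with `privilege_assignments` and an optional `next_page_token`.
    """
    results, token = [], None
    while True:
        page = fetch_page(page_token=token, max_results=max_results)
        results.extend(page.get("privilege_assignments", []))
        token = page.get("next_page_token")
        if not token:
            return results

# Stubbed two-page response for illustration.
_pages = {
    None: {"privilege_assignments": [{"principal": "recipient_a"}],
           "next_page_token": "p2"},
    "p2": {"privilege_assignments": [{"principal": "recipient_b"}]},
}
all_grants = list_all_permissions(lambda page_token, max_results: _pages[page_token])
```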

### Update Share

**Slug:** `DATABRICKS_SHARING_SHARES_UPDATE`

Tool to update an existing share in Unity Catalog with changes to metadata or data objects. Use when you need to modify share properties (comment, owner, name) or manage shared data objects (add, remove, or update tables/views/volumes). The caller must be a metastore admin or the owner of the share. For table additions, the owner must have SELECT privilege on the table.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `name` | string | Yes | The name of the share to update. This is used in the path parameter to identify the share |
| `owner` | string | No | Username of the current owner of the share. Only metastore admins can update this field on its own |
| `comment` | string | No | User-provided free-form text description of the share |
| `updates` | array | No | Array of shared data object updates (ADD, REMOVE, UPDATE). Maximum 100 update data objects allowed. For ADD operations, the share owner must have SELECT privilege on the table. For UPDATE operations, specify the action and the data object with the changes |
| `new_name` | string | No | New name for the share (for renaming). Requires CREATE_SHARE privilege |
| `storage_root` | string | No | Storage root URL for the share. Cannot be updated if there are notebook files in the share |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
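The `updates` array carries one entry per ADD/REMOVE/UPDATE operation. A sketch of building that array for table additions and removals; the `ADD`/`REMOVE` action names come from the parameter description, while the `data_object` shape is an assumption modeled on the Delta Sharing API:

```python
MAX_UPDATES = 100  # maximum update objects per call, per the parameter table

def build_share_updates(add_tables=(), remove_tables=()):
    """Assemble the `updates` array for DATABRICKS_SHARING_SHARES_UPDATE."""
    updates = []
    for full_name in add_tables:
        # The share owner must hold SELECT on each table being added.
        updates.append({"action": "ADD",
                        "data_object": {"name": full_name,
                                        "data_object_type": "TABLE"}})
    for full_name in remove_tables:
        updates.append({"action": "REMOVE",
                        "data_object": {"name": full_name,
                                        "data_object_type": "TABLE"}})
    if len(updates) > MAX_UPDATES:
        raise ValueError(f"at most {MAX_UPDATES} update objects are allowed")
    return updates

payload = {"comment": "Now includes orders",
           "updates": build_share_updates(add_tables=["main.sales.orders"])}
```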

### Create SQL Alert

**Slug:** `DATABRICKS_SQL_ALERTS_CREATE`

Tool to create a new Databricks SQL alert for query monitoring. Use when you need to set up alerts that monitor query results and trigger notifications when specified conditions are met. The alert will evaluate the query results and send notifications when the condition threshold is crossed.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `alert` | object | Yes | The alert configuration object containing all alert settings |
| `auto_resolve_display_name` | boolean | No | If true, automatically resolve alert display name conflicts. Otherwise, fail the request if the display name conflicts with an existing alert |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
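The required `alert` object is not expanded in the table above. A minimal sketch of one, assuming the nested `condition` shape of the `/api/2.0/sql/alerts` request (the operator set shown is an illustrative subset, and the query UUID is a placeholder):

```python
def build_alert(display_name, query_id, column, op, threshold):
    """Assemble a minimal `alert` object; the field shapes are assumptions."""
    allowed_ops = {"GREATER_THAN", "LESS_THAN", "EQUAL", "NOT_EQUAL"}
    if op not in allowed_ops:
        raise ValueError(f"unsupported comparison operator: {op}")
    return {
        "display_name": display_name,
        "query_id": query_id,
        "condition": {
            "op": op,
            "operand": {"column": {"name": column}},
            "threshold": {"value": {"double_value": threshold}},
        },
    }

request = {
    "alert": build_alert("Daily error spike",
                         "00000000-0000-0000-0000-000000000000",  # hypothetical query UUID
                         "error_count", "GREATER_THAN", 100.0),
    "auto_resolve_display_name": True,
}
```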

### Delete SQL Alert

**Slug:** `DATABRICKS_SQL_ALERTS_DELETE`

Tool to delete a Databricks SQL alert (soft delete to trash). Use when you need to remove an alert from active monitoring. The alert is moved to trash and can be restored through the UI. Trashed alerts are automatically cleaned up after 30 days. Note: Deleting an already-deleted alert will return an error (not idempotent).

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `id` | string | Yes | The unique identifier (UUID) of the SQL alert to delete. The alert will be moved to trash (soft delete) and can be restored through the UI. Trashed alerts are automatically cleaned up after 30 days |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get SQL Alert Details

**Slug:** `DATABRICKS_SQL_ALERTS_GET`

Tool to retrieve details of a specific Databricks SQL alert by its UUID. Use when you need to get information about an alert including its configuration, trigger conditions, state, and notification settings.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `id` | string | Yes | The UUID identifying the alert to retrieve |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Create Legacy SQL Alert

**Slug:** `DATABRICKS_SQL_ALERTS_LEGACY_CREATE`

Tool to create a legacy SQL alert that periodically runs a query and notifies when conditions are met. Use when you need to create alerts using the legacy API endpoint. Note: This is a legacy endpoint that has been replaced by /api/2.0/sql/alerts and is deprecated.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `name` | string | Yes | Name of the alert |
| `rearm` | integer | No | Number of seconds after being triggered before the alert rearms itself and can be triggered again. If not set, the alert will never be triggered again after its first trigger |
| `parent` | string | No | The identifier of the workspace folder containing the alert. Default is the user's home folder. Format: folders/<folder_id> |
| `options` | object | Yes | Alert configuration options including column, operator, and comparison value |
| `query_id` | string | Yes | ID of the query evaluated by the alert |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
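The legacy `options` object carries the trigger condition as a flat column/operator/value triple. A sketch of assembling the request body, assuming symbolic comparison operators (the exact accepted set is an assumption, and the query UUID is a placeholder):

```python
def build_legacy_alert_options(column, op, value):
    """Build the `options` object for the legacy alert endpoint."""
    if op not in {">", ">=", "<", "<=", "==", "!="}:
        raise ValueError(f"unexpected operator: {op}")
    return {"column": column, "op": op, "value": value}

payload = {
    "name": "Row count drop",
    "query_id": "00000000-0000-0000-0000-000000000000",  # hypothetical query UUID
    "options": build_legacy_alert_options("row_count", "<", 1000),
    "rearm": 3600,  # re-arm one hour after triggering
}
```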

### Delete Legacy SQL Alert

**Slug:** `DATABRICKS_SQL_ALERTS_LEGACY_DELETE`

Tool to permanently delete a legacy SQL alert. Use when you need to remove an alert using the legacy API endpoint. Note: This legacy endpoint permanently deletes alerts; unlike the newer /api/2.0/sql/alerts endpoint, deleted alerts cannot be restored from trash.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `id` | string | Yes | The unique identifier (UUID or numeric ID) of the SQL alert to permanently delete. This is a legacy API endpoint that permanently deletes alerts without moving them to trash |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get Legacy SQL Alert

**Slug:** `DATABRICKS_SQL_ALERTS_LEGACY_GET`

Tool to retrieve details of a specific legacy SQL alert by its ID. Use when you need to get information about a legacy alert including its configuration, state, query details, and notification settings. Note: This is a legacy endpoint (/api/2.0/preview/sql/alerts) that is deprecated and being replaced by /api/2.0/sql/alerts.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `id` | string | Yes | The unique identifier (UUID) of the SQL alert to retrieve |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### List Legacy SQL Alerts

**Slug:** `DATABRICKS_SQL_ALERTS_LEGACY_LIST`

Tool to list all legacy SQL alerts accessible to the authenticated user. Use when you need to retrieve a list of all legacy alerts in the workspace. Note: This is a legacy endpoint (/api/2.0/preview/sql/alerts) that is deprecated and being replaced by /api/2.0/sql/alerts.

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Update Legacy SQL Alert

**Slug:** `DATABRICKS_SQL_ALERTS_LEGACY_UPDATE`

Tool to update a legacy SQL alert configuration including name, query reference, trigger conditions, and notification settings. Use when you need to modify existing alerts using the legacy API endpoint. Note: This is a legacy endpoint that has been replaced by /api/2.0/sql/alerts and is deprecated.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `id` | string | Yes | The unique identifier of the alert to update |
| `name` | string | No | Name of the alert |
| `rearm` | integer | No | Number of seconds after being triggered before the alert rearms itself and can be triggered again. If not set, the alert will never be triggered again after its first trigger |
| `parent` | string | No | The identifier of the workspace folder containing the alert. Format: folders/<folder_id> |
| `options` | object | Yes | Alert configuration options including column, operator, and comparison value (required for update) |
| `query_id` | string | Yes | ID of the query evaluated by the alert (required for update) |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Update SQL Alert

**Slug:** `DATABRICKS_SQL_ALERTS_UPDATE`

Tool to update an existing Databricks SQL alert using partial update with field mask. Use when you need to modify alert properties including display name, query reference, trigger conditions, notification settings, or ownership.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `id` | string | Yes | UUID identifying the alert to update |
| `alert` | object | No | Alert object containing fields to update. |
| `update_mask` | string | Yes | Comma-separated field names to update (no spaces). Use dot notation for nested fields (e.g., 'condition.op'). Field names must match resource field names exactly |
| `auto_resolve_display_name` | boolean | No | When true, automatically resolves display name conflicts; when false, fails if conflict exists |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
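The `update_mask` format (comma-separated, no spaces, dot notation for nested fields) is easy to get wrong by hand. A small helper that assembles and validates the mask, with a hypothetical alert UUID and illustrative field names:

```python
def build_update_mask(*fields):
    """Join field names into an update_mask string: comma-separated,
    no spaces, dot notation for nested fields."""
    for f in fields:
        if not f or " " in f:
            raise ValueError(f"invalid field name: {f!r}")
    return ",".join(fields)

request = {
    "id": "00000000-0000-0000-0000-000000000000",  # hypothetical alert UUID
    "alert": {"display_name": "Renamed alert",
              "condition": {"op": "LESS_THAN"}},
    "update_mask": build_update_mask("display_name", "condition.op"),
}
```

Only the fields named in the mask are touched; everything else on the alert is left as-is.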

### Delete SQL Dashboard

**Slug:** `DATABRICKS_SQL_DASHBOARDS_DELETE`

Tool to delete a legacy Databricks SQL dashboard by moving it to trash (soft delete). Use when you need to remove a dashboard from active use. The dashboard is moved to trash and can be restored later through the UI. Trashed dashboards do not appear in searches and cannot be shared.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `id` | string | Yes | The unique identifier (UUID) of the legacy SQL dashboard to delete. The dashboard will be moved to trash (soft delete) and can be restored later. Trashed dashboards do not appear in list views or searches and cannot be shared until restored |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get SQL Dashboard

**Slug:** `DATABRICKS_SQL_DASHBOARDS_GET`

Tool to retrieve complete legacy dashboard definition with metadata, widgets, and queries. Use when you need to get detailed information about a SQL dashboard. Note: Legacy dashboards API deprecated as of January 12, 2026. Databricks recommends using AI/BI dashboards (Lakeview API) for new implementations.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `id` | string | Yes | The unique identifier (UUID) of the legacy SQL dashboard to retrieve. This ID identifies the dashboard across Databricks REST API services |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Update SQL Dashboard

**Slug:** `DATABRICKS_SQL_DASHBOARDS_UPDATE`

Tool to update legacy Databricks SQL dashboard attributes (name, run_as_role, tags). Use when you need to modify dashboard metadata. Note: This operation only affects dashboard object attributes and does NOT add, modify, or remove widgets.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `id` | string | Yes | The unique identifier (UUID) of the legacy SQL dashboard to update. This operation updates dashboard attributes only and does not modify widgets |
| `name` | string | No | The title of the dashboard that appears in list views and at the top of the dashboard page |
| `tags` | array | No | Tag labels to associate with the dashboard for organization and filtering |
| `run_as_role` | string | No | Sets the Run as role for the dashboard. Must be 'viewer' (run as viewer) or 'owner' (run as owner). This determines whose permissions are used when executing queries |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get SQL Object Permissions

**Slug:** `DATABRICKS_SQL_DBSQL_PERMISSIONS_GET`

Tool to retrieve the access control list for legacy DBSQL (Redash-based) objects including alerts, dashboards, and queries. Use when you need to check who has access to these legacy SQL objects and their permission levels. IMPORTANT: This API is deprecated as of January 2026. For permissions on modern Lakeview dashboards or other workspace objects, use the IAM Permissions API (DATABRICKS_IAM_PERMISSIONS_GET) instead. Legacy DBSQL objects are no longer directly accessible and should be migrated to AI/BI dashboards.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `object_id` | string | Yes | The UUID identifying the specific object |
| `object_type` | string | Yes | The type of legacy DBSQL object for which to retrieve permissions. Valid values: alerts, dashboards, queries (for legacy Redash-based DBSQL objects only) |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Set SQL Object Permissions

**Slug:** `DATABRICKS_SQL_DBSQL_PERMISSIONS_SET`

Tool to set the access control list for legacy SQL objects (alerts, dashboards, queries, or data_sources). Use when you need to configure permissions for legacy SQL objects created via the /preview/sql endpoints. IMPORTANT:

- This operation REPLACES ALL existing permissions. To retain existing permissions, include them in the access_control_list.
- This is a DEPRECATED API; Databricks recommends using the Workspace API (/api/2.0/permissions) instead.
- Only works with LEGACY SQL objects: legacy SQL queries (/preview/sql/queries), legacy Redash-based SQL dashboards (NOT Lakeview dashboards), legacy SQL alerts, and SQL warehouses (data_sources).
- Does NOT work with modern Lakeview dashboards (/lakeview/dashboards) or modern SQL queries (/sql/queries).

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `object_id` | string | Yes | Unique identifier of the SQL object for which to set permissions |
| `object_type` | string | Yes | Type of legacy SQL object. Valid values: `alerts` (legacy SQL alerts), `dashboards` (legacy Redash-based SQL dashboards, NOT Lakeview dashboards), `queries` (legacy SQL queries), `data_sources` (SQL warehouses). Note: This API only works with objects created via the legacy /preview/sql endpoints |
| `access_control_list` | array | Yes | Array of access control entries. This operation REPLACES ALL existing permissions with the specified list. To retain existing permissions, include them in this array. Each entry must specify exactly one of user_name, group_name, or service_principal_name |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
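Because this call replaces the full ACL, a safe grant reads the existing list first and merges the new entry into it before submitting. A sketch of that merge, assuming each entry names exactly one principal via `user_name`, `group_name`, or `service_principal_name` (the `permission_level` values shown are assumptions):

```python
def merge_acl(existing, new_entry):
    """Return a new ACL: `existing` with `new_entry` added, replacing any
    prior entry for the same principal. Needed because the set operation
    replaces ALL permissions rather than appending."""
    principal_keys = ("user_name", "group_name", "service_principal_name")

    def principal(entry):
        return next(((k, entry[k]) for k in principal_keys if k in entry), None)

    merged = [e for e in existing if principal(e) != principal(new_entry)]
    merged.append(new_entry)
    return merged

existing_acl = [{"user_name": "owner@example.com", "permission_level": "CAN_MANAGE"}]
new_acl = merge_acl(existing_acl,
                    {"group_name": "analysts", "permission_level": "CAN_RUN"})
```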

### Create SQL Query

**Slug:** `DATABRICKS_SQL_QUERIES_CREATE`

Tool to create a saved SQL query object in Databricks. Use when you need to create a new saved query definition that includes the target SQL warehouse, query text, name, description, tags, and parameters. Note: This creates a saved query object, not an immediate execution. Use Statement Execution API for immediate query execution.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `query` | object | No | Query configuration object. |
| `auto_resolve_display_name` | boolean | No | If true, automatically resolve query display name conflicts. Otherwise, fail the request if the query's display name conflicts with an existing query's display name |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
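The `query` object is left opaque in the table above. A minimal sketch, assuming the field names (`display_name`, `query_text`, `warehouse_id`, `tags`) of the `/api/2.0/sql/queries` request shape; the warehouse ID below is a placeholder:

```python
def build_query_object(display_name, query_text, warehouse_id,
                       description=None, tags=None):
    """Assemble a minimal `query` object; field names are assumptions."""
    query = {
        "display_name": display_name,
        "query_text": query_text,
        "warehouse_id": warehouse_id,
    }
    if description:
        query["description"] = description
    if tags:
        query["tags"] = list(tags)
    return query

request = {
    "query": build_query_object(
        "Daily revenue",
        "SELECT order_date, SUM(amount) FROM main.sales.orders GROUP BY 1",
        "abc123",  # hypothetical SQL warehouse ID
        tags=["finance"]),
    "auto_resolve_display_name": True,
}
```

Note that this only saves the query definition; run it separately via the Statement Execution API.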

### Delete SQL Query

**Slug:** `DATABRICKS_SQL_QUERIES_DELETE`

Tool to delete a Databricks SQL query (soft delete to trash). Use when you need to remove a query from searches and list views. The query is moved to trash and can be restored through the UI within 30 days, after which it is permanently deleted.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `id` | string | Yes | The unique identifier (UUID) of the SQL query to delete. The query will be moved to trash (soft delete) and can be restored through the UI within 30 days |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get SQL Query Details

**Slug:** `DATABRICKS_SQL_QUERIES_GET`

Tool to retrieve detailed information about a specific SQL query by its UUID. Use when you need to get query configuration including SQL text, warehouse ID, parameters, ownership, and metadata.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `id` | string | Yes | The unique identifier (UUID) of the SQL query to retrieve |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Create Legacy SQL Query

**Slug:** `DATABRICKS_SQL_QUERIES_LEGACY_CREATE`

Tool to create a new SQL query definition using the legacy API. Use when you need to create queries with the legacy /preview/sql/queries endpoint that uses data_source_id. Note: This is a legacy endpoint. The API has been replaced by /api/2.0/sql/queries which uses warehouse_id instead of data_source_id.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `name` | string | Yes | The title/display name of the query that appears in list views, widget headings, and on the query page. This is a required field and cannot be empty |
| `query` | string | Yes | The text of the SQL query to be executed |
| `options` | object | No | Query configuration options for legacy SQL queries. |
| `is_draft` | boolean | No | Whether this query is a draft. Queries are created as drafts by default (true) |
| `schedule` | string | No | Execution schedule interval in seconds, if the query should run on a schedule |
| `description` | string | No | General description that conveys additional information about this query such as usage notes |
| `is_archived` | boolean | No | Whether this query is archived/hidden from indexes and search results |
| `data_source_id` | string | Yes | The ID of the data source (SQL warehouse) used by the query. Must be a valid UUID format (36 characters). You can obtain this from the LIST_SQL_WAREHOUSES action or the legacy Data Sources API endpoint |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
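Since `data_source_id` must be a valid 36-character UUID, it is worth checking the ID locally before sending the request. A small validation sketch (the ID below is a hypothetical placeholder, not a real warehouse):

```python
import uuid

def validate_data_source_id(data_source_id):
    """Return True when the ID parses as a canonical 36-character UUID."""
    try:
        uuid.UUID(data_source_id)
    except (ValueError, AttributeError, TypeError):
        return False
    return len(data_source_id) == 36

payload = {
    "name": "Weekly signups",
    "query": "SELECT COUNT(*) FROM main.growth.signups",
    "data_source_id": "11111111-2222-3333-4444-555555555555",  # hypothetical
}
```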

### Delete Legacy SQL Query

**Slug:** `DATABRICKS_SQL_QUERIES_LEGACY_DELETE`

Tool to delete a legacy SQL query (soft delete to trash). Use when you need to remove a legacy query from searches and list views. The query is moved to trash and permanently deleted after 30 days. Note: This is a deprecated legacy API that will be phased out; use the non-legacy endpoint instead.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `id` | string | Yes | The unique identifier (UUID) of the SQL query to delete. The query will be moved to trash (soft delete) and permanently deleted after 30 days |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get Legacy SQL Query

**Slug:** `DATABRICKS_SQL_QUERIES_LEGACY_GET`

Tool to retrieve details of a specific legacy SQL query by its UUID. Use when you need to get information about a legacy query including its SQL text, parameters, configuration, and metadata. Note: This is a legacy endpoint (/api/2.0/preview/sql/queries) that has been replaced by /api/2.0/sql/queries and will be supported for six months to allow migration time.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `id` | string | Yes | The unique identifier (UUID) of the SQL query to retrieve |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Restore SQL Query (Legacy)

**Slug:** `DATABRICKS_SQL_QUERIES_LEGACY_RESTORE`

Tool to restore a trashed SQL query to active state. Use when you need to recover a deleted query within 30 days of deletion. Once restored, the query reappears in list views and searches and can be used for alerts again. This is a legacy/deprecated API endpoint.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `id` | string | Yes | The unique identifier of the SQL query to restore from trash. The query must be in trash state and not permanently deleted |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Update Legacy SQL Query

**Slug:** `DATABRICKS_SQL_QUERIES_LEGACY_UPDATE`

Tool to update an existing SQL query definition using the legacy API. Use when you need to modify queries with the legacy /preview/sql/queries endpoint. Note: This is a legacy/deprecated endpoint. The newer API uses PATCH /api/2.0/sql/queries/{id} instead.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `id` | string | Yes | The unique identifier (UUID) of the SQL query to update |
| `name` | string | No | The title/display name of the query that appears in list views, widget headings, and on the query page |
| `tags` | array | No | Tags for the query |
| `query` | string | No | The text of the SQL query to be executed |
| `parent` | string | No | The identifier of the workspace folder containing the object |
| `options` | object | No | Query configuration options for legacy SQL queries. |
| `description` | string | No | General description that conveys additional information about this query such as usage notes |
| `run_as_role` | string | No | Sets the Run as role for the object. Must be set to one of 'viewer' (run as viewer) or 'owner' (run as owner) |
| `data_source_id` | string | No | Data source ID maps to the ID of the data source used by the resource and is distinct from the warehouse ID |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Update SQL Query

**Slug:** `DATABRICKS_SQL_QUERIES_UPDATE`

Tool to update a saved SQL query object in Databricks using partial field updates. Use when you need to modify specific fields of an existing query without replacing the entire object. Requires update_mask parameter to specify which fields to update. Supports updating query text, configuration, parameters, and metadata.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `id` | string | Yes | The unique identifier (UUID) of the SQL query to update |
| `query` | object | No | Query configuration object for update operation. |
| `update_mask` | string | Yes | Comma-separated list of fields to update with NO spaces. Use dot notation for nested fields (e.g., 'display_name,description,query_text'). Specify only the fields you want to update |
| `auto_resolve_display_name` | boolean | No | If true, automatically resolve query display name conflicts. If false or not specified, the request fails on naming conflicts |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
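
The `update_mask`/`query` pairing is easy to get wrong. A minimal sketch of building the partial-update body client-side — field names such as `display_name` and `query_text` follow the Databricks queries API, but verify them against your workspace's API version:

```python
def build_query_update(fields: dict) -> dict:
    """Build a partial-update body: update_mask must list exactly the
    fields being changed, comma-separated with no spaces."""
    if not fields:
        raise ValueError("nothing to update")
    return {"update_mask": ",".join(sorted(fields)), "query": fields}

payload = build_query_update({
    "display_name": "daily-revenue",
    "query_text": "SELECT sum(amount) FROM sales WHERE day = current_date()",
})
# update_mask is "display_name,query_text"
```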

### List SQL Query History

**Slug:** `DATABRICKS_SQL_QUERY_HISTORY_LIST`

Tool to retrieve the history of SQL queries executed against SQL warehouses and serverless compute. Use when you need to list queries by time range, status, user, or warehouse. Returns most recently started queries first (up to max_results). Supports filtering and pagination.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `filter_by` | object | No | Filter object to limit query history results. |
| `page_token` | string | No | Token to get the next page of results. Obtained from previous response for pagination |
| `max_results` | integer | No | Limit the number of results returned in one page. Must be less than 1000. Defaults to 100 |
| `include_metrics` | boolean | No | Whether to include query metrics with each query. Recommended for small result sets only due to response size. Defaults to false |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
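
A sketch of assembling the `filter_by` object for a time-bounded, status-filtered history request. Timestamps are epoch milliseconds; the nested field names follow the Query History API but should be checked against your API version:

```python
import time

def history_request(hours_back: int, statuses=None, max_results: int = 100) -> dict:
    """Build a query-history request body. Timestamps are epoch
    milliseconds and max_results must stay below 1000."""
    if not max_results < 1000:
        raise ValueError("max_results must be less than 1000")
    now_ms = int(time.time() * 1000)
    filter_by = {
        "query_start_time_range": {
            "start_time_ms": now_ms - hours_back * 3_600_000,
            "end_time_ms": now_ms,
        }
    }
    if statuses:
        filter_by["statuses"] = statuses  # e.g. ["FAILED", "CANCELED"]
    return {"filter_by": filter_by, "max_results": max_results}

req = history_request(24, statuses=["FAILED"], max_results=50)
```

Pass the `next_page_token` from each response back as `page_token` to continue paginating.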

### Create SQL Query Visualization

**Slug:** `DATABRICKS_SQL_QUERY_VISUALIZATIONS_CREATE`

Tool to create a new visualization for a Databricks SQL query. Use when you need to add a visual representation (table, chart, counter, funnel, or pivot table) to an existing saved query. The visualization will be attached to the specified query and can be added to dashboards.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `visualization` | object | Yes | The visualization configuration object containing all visualization settings |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Create Legacy SQL Query Visualization

**Slug:** `DATABRICKS_SQL_QUERY_VISUALIZATIONS_LEGACY_CREATE`

Tool to create a visualization in a SQL query using the legacy API. Use when you need to add a visual representation (table, chart, counter, pivot, etc.) to an existing saved query. Note: This is a deprecated endpoint; users should migrate to the current /api/2.0/sql/visualizations API. Databricks does not recommend modifying visualization settings in JSON.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `name` | string | No | The name of the visualization that appears on dashboards and the query screen |
| `type` | string | Yes | The type/category of visualization (e.g., CHART, TABLE, PIVOT, COUNTER, etc.) |
| `options` | object | Yes | Visualization-specific configuration object that varies widely depending on the visualization type. Databricks does not recommend modifying this object directly in JSON as its structure is unsupported and varies by visualization type |
| `query_id` | string | Yes | The identifier of the query to which the visualization will be added. This identifier is obtained from the queries/create endpoint |
| `description` | string | No | A short description of the visualization. This is not displayed in the UI |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Delete Legacy SQL Query Visualization

**Slug:** `DATABRICKS_SQL_QUERY_VISUALIZATIONS_LEGACY_DELETE`

Tool to permanently delete a legacy SQL query visualization. Use when you need to remove a visualization from a SQL query using the legacy API endpoint. Note: This is a deprecated legacy endpoint. Databricks recommends migrating to /api/2.0/sql/visualizations/{id} instead.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `id` | string | Yes | The unique identifier (UUID) of the visualization to permanently delete. This is the widget/visualization ID returned by the queryvisualizations/create endpoint |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Update Legacy SQL Query Visualization

**Slug:** `DATABRICKS_SQL_QUERY_VISUALIZATIONS_LEGACY_UPDATE`

Tool to update a visualization in a SQL query using the legacy API. Use when you need to modify visualization properties such as name, description, type, and options. Note: This is a deprecated endpoint; users should migrate to the current queryvisualizations/update method. Databricks does not recommend modifying visualization settings in JSON.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `id` | string | Yes | The UUID identifier of the visualization to update |
| `name` | string | No | The name of the visualization that appears on dashboards and the query screen |
| `type` | string | No | The type of visualization: chart, table, pivot table, and so on |
| `options` | object | No | The options object varies widely from one visualization type to the next and its structure is unsupported. Databricks does not recommend modifying this |
| `description` | string | No | A short description of this visualization. This is not displayed in the UI |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Update SQL Query Visualization

**Slug:** `DATABRICKS_SQL_QUERY_VISUALIZATIONS_UPDATE`

Tool to update an existing Databricks SQL query visualization using partial update with field mask. Use when you need to modify visualization properties. Only two fields are updatable: display_name (the name shown in UI) and type (TABLE, CHART, COUNTER, FUNNEL, or PIVOT). The visualization type can be changed to reorganize how query results are displayed.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `id` | string | Yes | UUID identifying the visualization to update |
| `update_mask` | string | Yes | Comma-separated field names to update (no spaces). Field names must match resource field names exactly. Only these fields are updatable: display_name, type. Do not use wildcards (*) |
| `visualization` | object | No | Visualization object containing fields to update. Only display_name and type are updatable via this API. |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
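
Since only two fields are updatable, the request body can be validated before it is sent. A sketch, assuming the `display_name`/`type` field names documented above:

```python
VALID_TYPES = {"TABLE", "CHART", "COUNTER", "FUNNEL", "PIVOT"}

def build_viz_update(display_name=None, viz_type=None) -> dict:
    """Body for the visualization update call: only display_name and
    type are updatable, and the mask must name them without spaces."""
    fields = {}
    if display_name is not None:
        fields["display_name"] = display_name
    if viz_type is not None:
        if viz_type not in VALID_TYPES:
            raise ValueError(f"unsupported type: {viz_type}")
        fields["type"] = viz_type
    if not fields:
        raise ValueError("nothing to update")
    return {"update_mask": ",".join(sorted(fields)), "visualization": fields}

payload = build_viz_update(display_name="Revenue by region", viz_type="CHART")
# update_mask is "display_name,type"
```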

### Get Redash V2 Config

**Slug:** `DATABRICKS_SQL_REDASH_CONFIG_GET_CONFIG`

Tool to retrieve workspace configuration for Redash V2 in Databricks SQL. Use when you need to get Redash configuration settings for the current workspace.

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Cancel SQL Statement Execution

**Slug:** `DATABRICKS_SQL_STATEMENT_EXEC_CANCEL_EXEC`

Tool to cancel an executing SQL statement on a Databricks warehouse. Use when you need to terminate a running SQL query. The response indicates successful receipt of the cancel request, but does not guarantee cancellation. Callers must poll the statement status to confirm the terminal state (CANCELED, SUCCEEDED, FAILED, or CLOSED).

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `statement_id` | string | Yes | The statement ID returned upon successfully submitting a SQL statement. Required reference for canceling the execution. |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Execute SQL Statement

**Slug:** `DATABRICKS_SQL_STATEMENT_EXEC_EXECUTE_STATEMENT`

Execute a SQL statement on a Databricks SQL warehouse. Returns results inline if the query completes within the wait timeout, otherwise returns a statement_id to poll for results. Use this to run SQL queries against your Databricks warehouse. For large result sets, use disposition=EXTERNAL_LINKS and fetch chunks separately.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `format` | string | No | The output format for result data. JSON_ARRAY (default), ARROW_STREAM, or CSV. |
| `schema` | string | No | Sets the default schema for the statement execution. |
| `catalog` | string | No | Sets the default catalog for the statement execution. |
| `row_limit` | integer | No | Maximum number of rows to return. Alternative to adding a LIMIT clause to your SQL. |
| `statement` | string | Yes | The SQL statement to execute. Maximum size is 16 MiB. Can include named parameters referenced as :param_name. |
| `byte_limit` | integer | No | Byte limit applied to the result size. Based on internal data representations and may not match the final format size. |
| `parameters` | array | No | A list of named parameters for parameterized SQL. Reference them in the SQL as :name. Positional parameters (?) are not supported. |
| `query_tags` | array | No | An array of query tags to annotate the SQL statement. Maximum 20 tags. Each tag has a key and an optional value. |
| `disposition` | string | No | How result data is fetched. INLINE (default, <=25 MiB in response) or EXTERNAL_LINKS (URLs to fetch chunks, <=100 GiB). |
| `wait_timeout` | string | No | Timeout for waiting on execution results. '0s' returns immediately (async). '5s' to '50s' for sync wait. Default is '10s'. |
| `warehouse_id` | string | Yes | The ID of the SQL warehouse to execute the statement on. Found in the HTTP path field of your warehouse as the string after /sql/1.0/warehouses/. |
| `on_wait_timeout` | string | No | Action when wait_timeout is reached. CONTINUE (default, keeps running in background) or CANCEL (terminates the statement). |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
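
A sketch of the underlying REST call this tool wraps (`POST /api/2.0/sql/statements` on the Statement Execution API); the host, token, and warehouse ID below are placeholders:

```python
import json
import urllib.request

def build_statement_body(warehouse_id, statement, parameters=None):
    """Request body for POST /api/2.0/sql/statements."""
    body = {
        "warehouse_id": warehouse_id,
        "statement": statement,          # reference named parameters as :name
        "wait_timeout": "30s",           # '5s'-'50s' sync wait; '0s' = fully async
        "on_wait_timeout": "CONTINUE",   # or CANCEL to terminate on timeout
        "format": "JSON_ARRAY",
        "disposition": "INLINE",         # use EXTERNAL_LINKS for results over 25 MiB
    }
    if parameters:
        body["parameters"] = [{"name": k, "value": str(v)}
                              for k, v in parameters.items()]
    return body

def execute_statement(host, token, body):
    """Submit the statement; the JSON response carries either inline
    results or a statement_id to poll."""
    req = urllib.request.Request(
        f"{host}/api/2.0/sql/statements",
        data=json.dumps(body).encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

body = build_statement_body(
    "4fc35c5d38e6ef6c",  # placeholder warehouse ID (from the HTTP path)
    "SELECT * FROM sales WHERE region = :region LIMIT 100",
    parameters={"region": "EMEA"},
)
```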

### Get SQL Statement Result Chunk

**Slug:** `DATABRICKS_SQL_STATEMENT_EXEC_GET_RESULT_CHUNK`

Get a specific chunk of results from a SQL statement execution. Use this to paginate through large result sets. The chunk_index is zero-based. Use the manifest from the execute_statement or get_statement response to determine the total number of chunks available.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `chunk_index` | integer | Yes | The zero-based index of the chunk to retrieve. Use the manifest from the execute or get_statement response to determine available chunks. |
| `statement_id` | string | Yes | The statement ID returned from executing a SQL statement. |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
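
A sketch of walking the manifest to enumerate chunks. The manifest shape below is illustrative (real manifests carry more fields); each index is then fetched via `GET /api/2.0/sql/statements/{statement_id}/result/chunks/{index}`:

```python
def chunk_indexes(manifest: dict) -> list:
    """Zero-based chunk indexes listed in a statement result manifest."""
    return [c["chunk_index"] for c in manifest.get("chunks", [])]

# Illustrative manifest as returned alongside execute/get-statement results.
manifest = {
    "total_chunk_count": 3,
    "chunks": [
        {"chunk_index": 0, "row_count": 10_000},
        {"chunk_index": 1, "row_count": 10_000},
        {"chunk_index": 2, "row_count": 420},
    ],
}
```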

### Get SQL Statement Status

**Slug:** `DATABRICKS_SQL_STATEMENT_EXEC_GET_STATEMENT`

Get the status, manifest, and first chunk of results for a SQL statement execution. Use this to poll for completion after executing a statement asynchronously. The statement result is available for one hour after completion. Returns HTTP 404 if the statement has been in a terminal state for more than 12 hours.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `statement_id` | string | Yes | The statement ID returned from executing a SQL statement. Used to poll for status and retrieve results. |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
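
Polling for a terminal state can be sketched as below; the transport is stubbed, and a real `get_statement` would issue `GET /api/2.0/sql/statements/{statement_id}`:

```python
import time

TERMINAL_STATES = {"SUCCEEDED", "FAILED", "CANCELED", "CLOSED"}

def poll_until_done(get_statement, statement_id, interval=2.0, max_wait=600.0):
    """Poll get_statement(statement_id) until the state is terminal.
    get_statement is any callable returning the parsed JSON response."""
    deadline = time.monotonic() + max_wait
    while time.monotonic() < deadline:
        resp = get_statement(statement_id)
        if resp["status"]["state"] in TERMINAL_STATES:
            return resp
        time.sleep(interval)
    raise TimeoutError(f"{statement_id} not terminal within {max_wait}s")

# Stubbed transport for illustration only.
states = iter(["PENDING", "RUNNING", "SUCCEEDED"])
fake = lambda sid: {"statement_id": sid, "status": {"state": next(states)}}
result = poll_until_done(fake, "stmt-123", interval=0)
```

The same loop confirms cancellation after a cancel request, since CANCELED is just another terminal state.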

### Delete SQL Warehouse

**Slug:** `DATABRICKS_SQL_WAREHOUSES_DELETE`

Deletes a SQL warehouse from the Databricks workspace. Use this tool to permanently remove a SQL warehouse (compute resource) that is no longer needed. The warehouse must exist and you must have appropriate permissions to delete it. Important notes:

- Deleted warehouses may be restored within 14 days by contacting Databricks support
- The operation is idempotent: deleting an already deleted warehouse will succeed
- This is a destructive operation and cannot be undone through the API

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `id` | string | Yes | Required. The unique identifier (endpoint ID) of the SQL warehouse to delete. Must be a valid warehouse ID obtained from listing warehouses or warehouse creation. The ID is typically a hexadecimal string. |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Edit SQL Warehouse

**Slug:** `DATABRICKS_SQL_WAREHOUSES_EDIT`

Tool to update the configuration of an existing SQL warehouse. Use when you need to modify warehouse settings like cluster size, scaling parameters, auto-stop behavior, or enable features like Photon acceleration and serverless compute. The warehouse is identified by its ID, and you can update various properties including resource allocation and performance optimizations.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `id` | string | Yes | Required. The warehouse ID to edit/configure |
| `name` | string | No | Logical name for the warehouse. Must be unique within the workspace and less than 100 characters |
| `tags` | object | No | Tags for the warehouse. |
| `channel` | object | No | Channel information for the warehouse. |
| `cluster_size` | string | No | Size of the clusters allocated for this warehouse. Available sizes: 2X-Small, X-Small, Small, Medium, Large, X-Large, 2X-Large, 3X-Large, 4X-Large |
| `enable_photon` | boolean | No | Configures whether the warehouse should use Photon optimized clusters. Defaults to false |
| `auto_stop_mins` | integer | No | The amount of time in minutes that a SQL warehouse must be idle (no RUNNING queries) before it is automatically stopped. Supports values == 0 or >= 10 mins. Defaults to 120 mins. Note: Using the API, you can set this as low as 1 minute for serverless warehouses |
| `warehouse_type` | string ("PRO" \| "CLASSIC" \| "TYPE_UNSPECIFIED") | No | SQL warehouse type designation. PRO is required for serverless compute |
| `max_num_clusters` | integer | No | Maximum number of clusters that the autoscaler will create to handle concurrent queries. Must be <= 40 and must exceed min_num_clusters |
| `min_num_clusters` | integer | No | Minimum number of available clusters that will be maintained for this SQL warehouse. Must be > 0 and <= min(max_num_clusters, 30). Defaults to 1 |
| `instance_profile_arn` | string | No | (Deprecated) AWS only - The instance profile used to pass an IAM role to the SQL warehouse |
| `spot_instance_policy` | string ("COST_OPTIMIZED" \| "RELIABILITY_OPTIMIZED" \| "POLICY_UNSPECIFIED") | No | Configurations for spot instance usage. COST_OPTIMIZED uses spot instances when possible, RELIABILITY_OPTIMIZED uses on-demand instances |
| `enable_serverless_compute` | boolean | No | Configures whether the warehouse should use serverless compute |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
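
The autoscaling and auto-stop constraints above can be enforced before the request ever leaves the client. A sketch of building the edit body under those documented rules:

```python
def build_warehouse_edit(**fields):
    """Body for the warehouse edit call; drops None values and checks the
    documented autoscaling and auto-stop constraints client-side."""
    body = {k: v for k, v in fields.items() if v is not None}
    lo = body.get("min_num_clusters", 1)
    hi = body.get("max_num_clusters")
    if hi is not None and not (1 <= lo <= min(hi, 30) and hi <= 40):
        raise ValueError("need 0 < min <= min(max, 30) and max <= 40")
    stop = body.get("auto_stop_mins")
    if (stop is not None and stop != 0 and stop < 10
            and not body.get("enable_serverless_compute")):
        # serverless warehouses may go as low as 1 minute via the API
        raise ValueError("auto_stop_mins must be 0 or >= 10")
    if body.get("enable_serverless_compute") and body.get("warehouse_type") == "CLASSIC":
        raise ValueError("serverless compute requires a PRO warehouse")
    return body

edit = build_warehouse_edit(
    cluster_size="Small",
    min_num_clusters=1,
    max_num_clusters=4,
    auto_stop_mins=30,
    enable_photon=True,
)
```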

### Get SQL Warehouse Details

**Slug:** `DATABRICKS_SQL_WAREHOUSES_GET`

Tool to retrieve detailed information about a specific SQL warehouse by its ID. Use when you need to get configuration, state, connection details, and resource allocation for a SQL warehouse. Returns comprehensive warehouse information including cluster settings, JDBC/ODBC connection strings, and health status.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `id` | string | Yes | The unique identifier of the SQL warehouse to retrieve |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get SQL Warehouse Permission Levels

**Slug:** `DATABRICKS_SQL_WAREHOUSES_GET_PERMISSION_LEVELS`

Tool to retrieve available permission levels for a Databricks SQL warehouse. Use when you need to understand what permission levels can be assigned to users, groups, or service principals for a specific SQL warehouse. Returns permission levels like CAN_USE, CAN_MANAGE, IS_OWNER, CAN_VIEW, and CAN_MONITOR with their descriptions.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `warehouse_id` | string | Yes | The unique identifier of the SQL warehouse for which to retrieve available permission levels |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get SQL Warehouse Permissions

**Slug:** `DATABRICKS_SQL_WAREHOUSES_GET_PERMISSIONS`

Tool to retrieve permissions for a Databricks SQL warehouse. Use when you need to check who has access to a specific SQL warehouse and their permission levels. Returns the access control list with user, group, and service principal permissions, including inherited permissions from parent objects.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `warehouse_id` | string | Yes | The unique identifier (UUID) of the SQL warehouse for which to retrieve permissions |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get Workspace Warehouse Config

**Slug:** `DATABRICKS_SQL_WAREHOUSES_GET_WORKSPACE_WAREHOUSE_CONFIG`

Tool to retrieve workspace-level SQL warehouse configuration settings. Use when you need to check security policies, serverless compute settings, channel versions, or warehouse type restrictions that apply to all SQL warehouses in the workspace.

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Set SQL Warehouse Permissions

**Slug:** `DATABRICKS_SQL_WAREHOUSES_SET_PERMISSIONS`

Tool to set permissions for a Databricks SQL warehouse, replacing all existing permissions. Use when you need to configure access control for a SQL warehouse. This operation is authoritative and overwrites all existing permissions. Exactly one IS_OWNER must be specified. Groups cannot have IS_OWNER permission.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `warehouse_id` | string | Yes | The unique identifier (UUID) of the SQL warehouse for which to set permissions |
| `access_control_list` | array | Yes | Array of access control entries. This operation is authoritative and replaces ALL existing permissions. Must include exactly one IS_OWNER permission. For each entry, specify only one of user_name, group_name, or service_principal_name |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
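
Because this call is authoritative, a bad ACL silently wipes permissions. A sketch that checks the documented invariants (exactly one IS_OWNER, never held by a group, one principal field per entry) before sending:

```python
PRINCIPAL_KEYS = ("user_name", "group_name", "service_principal_name")

def build_acl_body(entries):
    """Validate and wrap an access control list for the authoritative
    set-permissions call."""
    owners = 0
    for e in entries:
        principals = [k for k in PRINCIPAL_KEYS if k in e]
        if len(principals) != 1:
            raise ValueError(f"entry must name exactly one principal: {e}")
        if e.get("permission_level") == "IS_OWNER":
            owners += 1
            if principals[0] == "group_name":
                raise ValueError("groups cannot hold IS_OWNER")
    if owners != 1:
        raise ValueError(f"exactly one IS_OWNER required, found {owners}")
    return {"access_control_list": entries}

payload = build_acl_body([
    {"user_name": "admin@example.com", "permission_level": "IS_OWNER"},
    {"group_name": "analysts", "permission_level": "CAN_USE"},
])
```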

### Set Workspace Warehouse Config

**Slug:** `DATABRICKS_SQL_WAREHOUSES_SET_WORKSPACE_WAREHOUSE_CONFIG`

Tool to configure workspace-level SQL warehouse settings shared by all SQL warehouses. Use when you need to set security policies, enable serverless compute, configure channel versions, or manage warehouse type restrictions across the workspace.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `channel` | object | No | Channel information for SQL warehouses. |
| `security_policy` | string ("DATA_ACCESS_CONTROL" \| "NONE" \| "PASSTHROUGH") | No | Security policy for warehouses. DATA_ACCESS_CONTROL enforces role-based access restrictions, NONE applies no restrictions, PASSTHROUGH allows credentials to pass through to underlying data sources |
| `data_access_config` | array | No | Spark confs for external hive metastore configuration. JSON serialized size must be less than or equal to 512K |
| `instance_profile_arn` | string | No | AWS Only - The instance profile used to pass an IAM role to the SQL warehouses |
| `google_service_account` | string | No | GCP only - Google Service Account used to pass to cluster to access Google Cloud Storage |
| `enabled_warehouse_types` | array | No | List of warehouse types allowed in this workspace. Limits allowed values for the type field in CreateWarehouse and EditWarehouse. Disabling a type may cause existing warehouses to be converted to another type |
| `enable_serverless_compute` | boolean | No | Enable Serverless compute for SQL warehouses |
| `sql_configuration_parameters` | object | No | SQL configuration parameters for warehouses. |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Start SQL Warehouse

**Slug:** `DATABRICKS_SQL_WAREHOUSES_START`

Tool to start a stopped Databricks SQL warehouse asynchronously. Use when you need to restart a stopped warehouse. The warehouse transitions through STARTING state before reaching RUNNING. Requires CAN MONITOR permissions or higher.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `id` | string | Yes | The unique identifier of the SQL warehouse to start. This is the string of letters and numbers in the warehouse's HTTP path. If the warehouse is already running, the request has no effect. |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Update SQL Warehouse Permissions

**Slug:** `DATABRICKS_SQL_WAREHOUSES_UPDATE_PERMISSIONS`

Tool to incrementally update permissions for a Databricks SQL warehouse. Use when you need to modify specific permissions without replacing the entire permission set. This PATCH operation updates only the specified permissions, preserving existing permissions not included in the request. For replacing all permissions, use SetPermissions instead.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `warehouse_id` | string | Yes | The unique identifier of the SQL warehouse for which to update permissions. This is typically an alphanumeric string (e.g., '4fc35c5d38e6ef6c') that can be obtained from the List SQL Warehouses endpoint |
| `access_control_list` | array | Yes | Array of access control entries to update. This operation updates only the specified permissions incrementally, preserving existing permissions not included in the request. For each entry, exactly one of user_name, group_name, or service_principal_name must be specified. Note: PATCH requests containing a warehouse owner (IS_OWNER) will be rejected with a NOT_IMPLEMENTED error |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Submit One-Time Run

**Slug:** `DATABRICKS_SUBMIT_RUN`

Tool to submit a one-time run without creating a job. Use when you need to execute a task directly without saving it as a job definition. After submission, use the jobs/runs/get API with the returned run_id to check the run state and monitor progress.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `tasks` | array | Yes | List of tasks to execute in the run. At least one task is required |
| `run_name` | string | No | Name for the run (default: 'Untitled') |
| `libraries` | array | No | Libraries to install on cluster (default: empty list) |
| `new_cluster` | object | No | Cluster specification to create for each run. |
| `environments` | array | No | List of environment configurations for the run |
| `timeout_seconds` | integer | No | Timeout per run (default: no timeout) |
| `idempotency_token` | string | No | Token (max 64 characters) to guarantee idempotency |
| `existing_cluster_id` | string | No | ID of an existing cluster to use for all tasks |
| `notification_settings` | object | No | Email and webhook notification preferences. |
| `webhook_notifications` | object | No | System destinations for run state changes. |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
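
A sketch of assembling a one-time run body (task contents and cluster ID below are hypothetical). Each task needs a unique `task_key`, and the idempotency token is capped at 64 characters:

```python
def build_submit_run(tasks, run_name="Untitled", idempotency_token=None, **extra):
    """Body for the one-time run submit call."""
    if idempotency_token is not None and len(idempotency_token) > 64:
        raise ValueError("idempotency_token must be at most 64 characters")
    keys = [t["task_key"] for t in tasks]
    if len(set(keys)) != len(keys):
        raise ValueError("task_key values must be unique")
    body = {"run_name": run_name, "tasks": tasks, **extra}
    if idempotency_token is not None:
        body["idempotency_token"] = idempotency_token
    return body

run = build_submit_run(
    tasks=[{
        "task_key": "etl",  # hypothetical task
        "notebook_task": {"notebook_path": "/Repos/team/etl"},
        "existing_cluster_id": "0123-456789-abcde",
    }],
    run_name="ad-hoc-etl",
    idempotency_token="etl-2024-06-01",
    timeout_seconds=3600,
)
```

The response's `run_id` then feeds the jobs/runs/get API to monitor progress.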

### Create Tag Policy

**Slug:** `DATABRICKS_TAGS_TAG_POLICIES_CREATE_TAG_POLICY`

Tool to create a new tag policy (governed tag) in Databricks with built-in rules for consistency and control. Use when you need to establish governed tags with restricted values and define who can assign them. Maximum of 1,000 governed tags per account. Each governed tag can have up to 50 allowed values. Requires appropriate account-level permissions.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `values` | array | No | One or more allowed values for the tag. Only these values can be assigned to this tag key. Maximum of 50 allowed values per governed tag. Each value must be a TagValue object with a 'name' field |
| `tag_key` | string | Yes | The tag key name (case sensitive). This is the governed tag identifier. No special characters are allowed. Tag keys must be unique within the account |
| `description` | string | No | A description or explanation for the governed tag policy |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |
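
A sketch of building the create body, wrapping each allowed value as a TagValue object and enforcing the 50-value cap; the tag key and values below are placeholders:

```python
def build_tag_policy(tag_key, values=None, description=None):
    """Body for the create-tag-policy call: each governed tag allows at
    most 50 values, each wrapped as a TagValue object with a 'name' field."""
    if values and len(values) > 50:
        raise ValueError("a governed tag allows at most 50 values")
    body = {"tag_key": tag_key}
    if values:
        body["values"] = [{"name": v} for v in values]
    if description:
        body["description"] = description
    return body

policy = build_tag_policy(
    "cost-center",
    values=["engineering", "marketing"],
    description="Approved cost centers",
)
```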

### Delete Tag Policy

**Slug:** `DATABRICKS_TAGS_TAG_POLICIES_DELETE_TAG_POLICY`

Tool to delete a tag policy by its key, making the tag ungoverned. Use when you need to remove governance from a tag without deleting the tag itself. Requires MANAGE permission on the governed tag. System governed tags cannot be deleted.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `tag_key` | string | Yes | The key of the governed tag whose policy should be removed. This is the unique identifier for the tag policy to delete. When deleted, the tag becomes ungoverned but remains on objects |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether the action execution was successful |

### Get Tag Policy

**Slug:** `DATABRICKS_TAGS_TAG_POLICIES_GET_TAG_POLICY`

Tool to retrieve a specific tag policy by its associated governed tag's key. Use when you need to get details about tag governance policies including allowed values and metadata.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `tag_key` | string | Yes | The unique key identifier of the governed tag whose policy you want to retrieve |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether or not the action execution was successful |

### Update Tag Policy

**Slug:** `DATABRICKS_TAGS_TAG_POLICIES_UPDATE_TAG_POLICY`

Tool to update an existing tag policy (governed tag) with specified fields. Use when you need to modify tag policy properties like description, tag key, or allowed values. Users must have MANAGE permission on the governed tag to edit it.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `values` | array | No | A collection of allowed values for the tag. Each governed tag can have a maximum of 50 allowed values |
| `tag_key` | string | Yes | The tag key name (identifier) of the tag policy to update. Tag keys are case sensitive (e.g., 'Sales' and 'sales' are distinct). Tag keys can only contain letters, spaces, numbers, or the characters +, -, =, ., _, :, /, @. Note: tag_key is immutable and cannot be changed after creation |
| `description` | string | No | A text field describing the policy |
| `update_mask` | string | Yes | A field mask specifying which fields to update. Must be a single string with multiple fields separated by commas (no spaces). Uses dot notation for nested fields (e.g., 'description,values'). Valid updatable fields are 'description' and 'values'. A field mask of '*' indicates full replacement, though explicitly listing fields is recommended |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether or not the action execution was successful |
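
Since `update_mask` must be a single comma-separated string with no spaces, a small helper (hypothetical name, not part of the API) can build it while rejecting fields that are not updatable:

```python
# Hypothetical helper for DATABRICKS_TAGS_TAG_POLICIES_UPDATE_TAG_POLICY.
# Per the docs, only "description" and "values" can be updated.
VALID_FIELDS = {"description", "values"}

def build_update_mask(fields):
    unknown = set(fields) - VALID_FIELDS
    if unknown:
        raise ValueError(f"not updatable: {sorted(unknown)}")
    return ",".join(fields)  # comma-separated, no spaces

mask = build_update_mask(["description", "values"])
```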

### Update Databricks Job By ID

**Slug:** `DATABRICKS_UPDATE_JOB_BY_ID`

Tool to completely reset all settings for a Databricks job. Use when you need to overwrite all job configuration at once. Changes to timeout_seconds apply immediately to active runs; other changes apply to future runs only. Consider using the update endpoint for partial updates instead of reset to minimize disruption.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `job_id` | integer | Yes | The canonical identifier of the job to reset. This field is required. |
| `new_settings` | object | Yes | The new settings that completely replace all existing settings for the job. Changes to timeout_seconds apply to active runs; other changes apply to future runs only. Must include at least a cluster specification (existing_cluster_id or new_cluster) and a task type (notebook_task, spark_jar_task, etc.) or tasks array. |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether or not the action execution was successful |
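
Because this reset replaces the entire job specification, `new_settings` must restate every field you want to keep. A minimal hypothetical input (job id, cluster id, and notebook path are placeholders) might look like:

```python
# Hypothetical input for DATABRICKS_UPDATE_JOB_BY_ID. Reset replaces the
# ENTIRE job spec; fields omitted from new_settings revert to defaults.
request = {
    "job_id": 1234,  # hypothetical job id
    "new_settings": {
        "name": "nightly-etl",
        "timeout_seconds": 3600,  # applies immediately to active runs
        "tasks": [
            {
                "task_key": "ingest",
                "existing_cluster_id": "0701-000000-abcd1234",  # hypothetical
                "notebook_task": {"notebook_path": "/Jobs/ingest"},
            }
        ],
    },
}
assert "tasks" in request["new_settings"]
```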

### Create Vector Search Endpoint

**Slug:** `DATABRICKS_VECTORSEARCH_VECTOR_SEARCH_ENDPOINTS_CREATE`

Tool to create a new vector search endpoint to host indexes in Databricks Mosaic AI Vector Search. Use when you need to provision compute resources for hosting vector search indexes. The endpoint will be in PROVISIONING state initially and transition to ONLINE when ready.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `name` | string | Yes | Name of the vector search endpoint. Must be unique within the workspace |
| `endpoint_type` | string ("STANDARD" \| "STORAGE_OPTIMIZED") | No | Type of endpoint. STANDARD: Up to 320M vectors, lower latency. STORAGE_OPTIMIZED: Over 1B vectors, 10-20x faster indexing, slightly higher latency (~250ms), up to 7x cheaper per vector. Choose STORAGE_OPTIMIZED for 10M+ vectors when cost efficiency is important |
| `usage_policy_id` | string | No | The id of the usage policy to assign to the endpoint |
| `budget_policy_id` | string | No | The id of the budget policy to assign to the endpoint. Deprecated, use usage_policy_id instead |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether or not the action execution was successful |
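
The sizing guidance above can be captured in a small local helper (hypothetical, not part of the API) for deciding which `endpoint_type` to request:

```python
# Hypothetical helper encoding the documented guidance for
# DATABRICKS_VECTORSEARCH_VECTOR_SEARCH_ENDPOINTS_CREATE.
def pick_endpoint_type(num_vectors: int, cost_sensitive: bool = True) -> str:
    # STORAGE_OPTIMIZED is suggested for 10M+ vectors when cost matters;
    # STANDARD offers lower latency and supports up to ~320M vectors.
    if num_vectors >= 10_000_000 and cost_sensitive:
        return "STORAGE_OPTIMIZED"
    return "STANDARD"
```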

### Delete Vector Search Index

**Slug:** `DATABRICKS_VECTORSEARCH_VECTOR_SEARCH_INDEXES_DELETE_INDEX`

Tool to delete a vector search index from Databricks workspace. Use when you need to remove unused or obsolete vector search indexes. When an index is deleted, any associated writeback tables are automatically removed. This operation is irreversible.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `index_name` | string | Yes | The fully qualified name of the vector search index to delete. Must be in the format: catalog.schema.index_name (e.g., 'test_catalog.test_schema.test_index'). Deleting an index will also automatically delete any associated writeback tables. This operation is irreversible. |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether or not the action execution was successful |

### Query Vector Search Index

**Slug:** `DATABRICKS_VECTORSEARCH_VECTOR_SEARCH_INDEXES_QUERY_INDEX`

Tool to query a vector search index to find similar vectors and return associated documents. Use when performing similarity search, hybrid keyword-similarity search, or full-text search on Databricks Vector Search indexes. Supports filtering, reranking, and returns configurable columns from matched documents with similarity scores. Must provide either query_vector or query_text.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `columns` | array | Yes | List of column names to include in the response. These columns will be returned along with the similarity scores for matched documents |
| `filters` | string | No | Filter conditions to apply to results. Use dictionary format for standard endpoints (e.g., {'field': value}), or SQL string format for storage-optimized endpoints (e.g., "field = 'value'") |
| `reranker` | object | No | Reranking configuration object. |
| `index_name` | string | Yes | Name of the vector index to query. Use the full three-level name format (catalog.schema.index_name) |
| `query_text` | string | No | Text query to search with. Required for Delta Sync Index using model endpoint. Automatically converted to embeddings. Must provide either query_vector or query_text |
| `query_type` | string ("ANN" \| "HYBRID" \| "FULL_TEXT") | No | Type of search to perform. ANN (default): Approximate nearest neighbor search. HYBRID: Combines keyword and similarity search. FULL_TEXT: Full-text search mode |
| `debug_level` | integer | No | Set to ≥1 to get latency information in response. Required for reranker timing information |
| `num_results` | integer | Yes | Maximum number of results to return. Maximum is 200 for hybrid search, 10,000 total retrievable with pagination |
| `filters_json` | string | No | JSON string representing query filters (alternative format) |
| `query_vector` | array | No | Vector embeddings to search with. Required for Direct Vector Access Index and Delta Sync Index using self-managed vectors. Must provide either query_vector or query_text |
| `score_threshold` | number | No | Minimum similarity score threshold for results. Defaults to 0.0. Only results with similarity scores above this threshold will be returned |
| `columns_to_rerank` | array | No | Column names to retrieve data for sending to the reranker. This is used when reranking is desired but not using the reranker object |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether or not the action execution was successful |
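
A hypothetical hybrid-search request (index name, columns, and query text are placeholders), together with a local check of the "exactly one of query_vector or query_text" constraint:

```python
# Hypothetical input for DATABRICKS_VECTORSEARCH_VECTOR_SEARCH_INDEXES_QUERY_INDEX.
query = {
    "index_name": "main.default.docs_index",  # three-level name, hypothetical
    "columns": ["id", "title", "chunk"],
    "query_text": "how do I rotate credentials?",
    "query_type": "HYBRID",
    "num_results": 25,  # hybrid search caps out at 200 per request
}

def has_exactly_one_query_input(q):
    # Exactly one of query_vector / query_text must be present.
    return ("query_vector" in q) != ("query_text" in q)

assert has_exactly_one_query_input(query)
```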

### Upsert Data Vector Index

**Slug:** `DATABRICKS_VECTORSEARCH_VECTOR_SEARCH_INDEXES_UPSERT_DATA`

Tool to upsert (insert or update) data into a Direct Vector Access Index. Use when you need to manually add or update vectors in a Databricks vector search index. The index must be a Direct Vector Access Index type where updates are managed via REST API or SDK calls. Data structure must match the schema defined when the index was created, including the primary key field.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `vectors` | array | Yes | Array of vector objects to upsert into the index. Each entry must contain an id (matching the primary key), vector array, and optional metadata fields that match the index schema |
| `index_name` | string | Yes | The fully qualified name of the Direct Vector Access Index where data will be upserted. Must be in the format: catalog.schema.index_name (e.g., 'main.default.test_index'). The index must be a Direct Vector Access Index type, not a Delta Sync Index |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether or not the action execution was successful |
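
A hypothetical upsert payload (index name, ids, vectors, and the `title` metadata field are placeholders that would need to match your index schema):

```python
# Hypothetical input for DATABRICKS_VECTORSEARCH_VECTOR_SEARCH_INDEXES_UPSERT_DATA.
# Each entry carries the primary key, the embedding, and schema metadata.
payload = {
    "index_name": "main.default.test_index",
    "vectors": [
        {"id": "doc-001", "vector": [0.12, -0.48, 0.31], "title": "intro"},
        {"id": "doc-002", "vector": [0.05, 0.77, -0.10], "title": "setup"},
    ],
}

# All embeddings must share the index's dimensionality.
dims = {len(v["vector"]) for v in payload["vectors"]}
assert len(dims) == 1
```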

### Create Workspace Git Credentials

**Slug:** `DATABRICKS_WORKSPACE_GIT_CREDENTIALS_CREATE`

Tool to create Git credentials for authenticating with remote Git repositories in Databricks. Use when you need to set up Git integration for version control operations. Multiple credentials can be created, typically one per Git provider. Use the is_default_for_provider flag to set a credential as the default for its provider.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `name` | string | No | Identifier used for looking up the git credential. Useful for organizing multiple credentials |
| `git_email` | string | No | The authenticating email associated with your Git provider user account. Used for authentication with the remote repository and sets the author & committer identity for commits. Either git_username or git_email must be provided for GitHub Enterprise. Recommended for all providers |
| `git_provider` | string ("gitHub" \| "bitbucketCloud" \| "gitLab" \| "azureDevOpsServices" \| "gitHubEnterprise" \| "bitbucketServer" \| "gitLabEnterpriseEdition" \| "awsCodeCommit") | Yes | The Git provider name. Case-insensitive. Valid values include gitHub, bitbucketCloud, gitLab, azureDevOpsServices, gitHubEnterprise, bitbucketServer, gitLabEnterpriseEdition, and awsCodeCommit |
| `git_username` | string | No | The username provided with your Git provider account. Used to set the Git committer & author names for commits. Required for Bitbucket Cloud and AWS CodeCommit. Either git_username or git_email must be provided for GitHub Enterprise |
| `personal_access_token` | string | No | The personal access token used to authenticate to the corresponding Git provider. This is required for authenticating with the Git provider |
| `is_default_for_provider` | boolean | No | If true, this credential serves as the default for the given provider. Only one credential can be default per provider |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether or not the action execution was successful |
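
A hypothetical input for creating a GitHub credential (email is a placeholder, and the token is deliberately left as a placeholder rather than a real value):

```python
# Hypothetical input for DATABRICKS_WORKSPACE_GIT_CREDENTIALS_CREATE.
# Note: Bitbucket Cloud and AWS CodeCommit additionally require git_username.
payload = {
    "git_provider": "gitHub",
    "git_email": "dev@example.com",      # hypothetical
    "personal_access_token": "<token>",  # placeholder; never hard-code real tokens
    "is_default_for_provider": True,     # at most one default per provider
}
assert "personal_access_token" in payload
```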

### Delete Workspace Git Credentials

**Slug:** `DATABRICKS_WORKSPACE_GIT_CREDENTIALS_DELETE`

Tool to delete Git credentials for remote repository authentication in Databricks. Use when you need to remove a Git credential entry from the workspace, such as when a credential has been revoked or replaced as part of credential lifecycle management.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `credential_id` | integer | Yes | The ID for the corresponding credential to delete. This is the unique identifier of the Git credential object in the workspace |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether or not the action execution was successful |

### Get Workspace Git Credentials

**Slug:** `DATABRICKS_WORKSPACE_GIT_CREDENTIALS_GET`

Tool to retrieve Git credentials for authenticating with remote Git repositories in Databricks. Use when you need to get details of existing Git integration credentials by credential ID.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `credential_id` | integer | Yes | The ID for the corresponding credential to access. This is the unique identifier of the credential object in the workspace |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether or not the action execution was successful |

### Update Workspace Git Credentials

**Slug:** `DATABRICKS_WORKSPACE_GIT_CREDENTIALS_UPDATE`

Tool to update existing Git credentials for authenticating with remote Git repositories in Databricks. Use when you need to modify Git provider credentials, email, username, or access tokens. Note that the git_provider field cannot be changed after initial creation. For azureDevOpsServicesAad provider, do not specify personal_access_token or git_username.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `name` | string | No | The name of the git credential, used for identification and ease of lookup |
| `git_email` | string | No | The authenticating email associated with your Git provider user account. Used for authentication with the remote repository and sets the author & committer identity for commits. Required for most Git providers except AWS CodeCommit |
| `git_provider` | string ("gitHub" \| "bitbucketCloud" \| "gitLab" \| "azureDevOpsServices" \| "gitHubEnterprise" \| "bitbucketServer" \| "gitLabEnterpriseEdition" \| "awsCodeCommit" \| "azureDevOpsServicesAad") | Yes | The Git provider name. Case-insensitive. This field cannot be changed after initial creation. Valid values include gitHub, bitbucketCloud, gitLab, azureDevOpsServices, gitHubEnterprise, bitbucketServer, gitLabEnterpriseEdition, awsCodeCommit, and azureDevOpsServicesAad |
| `git_username` | string | No | The username provided with your Git provider account. Used to set the Git committer & author names for commits. Required for AWS CodeCommit. For azureDevOpsServicesAad provider, do NOT specify this field |
| `credential_id` | integer | Yes | The ID for the corresponding credential to access. This is the unique identifier of the credential object in the workspace |
| `personal_access_token` | string | No | The personal access token used to authenticate to the corresponding Git provider. For azureDevOpsServicesAad provider, do NOT specify this field. The token is stored securely and not returned in responses |
| `is_default_for_provider` | boolean | No | Flag indicating if the credential is the default for the given provider. Only one credential can be default per provider |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether or not the action execution was successful |

### List Workspace Directory

**Slug:** `DATABRICKS_WORKSPACE_LIST`

Tool to list the contents of a directory in Databricks workspace. Use when you need to view notebooks, files, directories, libraries, or repos at a specific path. Returns object information including paths, types, and metadata. Use object_id for setting permissions via the Permissions API.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `path` | string | Yes | The absolute path of the notebook or directory to list. Must be an absolute path starting with '/' |
| `recursive` | boolean | No | Whether to recursively traverse subdirectories. Defaults to false |
| `notebooks_modified_after` | integer | No | UTC timestamp in milliseconds to filter notebooks modified after this time |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether or not the action execution was successful |

### Create Workspace Repo

**Slug:** `DATABRICKS_WORKSPACE_REPOS_CREATE`

Tool to create and optionally checkout a Databricks Repo linking a Git repository to the workspace. Use when you need to connect a Git repository to Databricks for collaborative development. Can optionally specify branch or tag to checkout after creation and configure sparse checkout for performance.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `tag` | string | No | Name of existing Git tag to checkout after creation |
| `url` | string | Yes | HTTPS URL of a Git repository to be linked |
| `path` | string | No | Desired path for the repo in the workspace. Almost any path in the workspace can be chosen. If repo is created in /Repos, path must be in the format /Repos/{folder}/{repo-name}. If not specified, it will be created in the user's directory |
| `branch` | string | No | Name of existing Git branch to checkout after creation |
| `provider` | string ("gitHub" \| "gitHubEnterprise" \| "bitbucketCloud" \| "bitbucketServer" \| "azureDevOpsServices" \| "gitLab" \| "gitLabEnterpriseEdition" \| "awsCodeCommit") | No | Git provider name. Case-insensitive. Must be provided if cannot be guessed from URL. Supported values: gitHub, gitHubEnterprise, bitbucketCloud, bitbucketServer, azureDevOpsServices, gitLab, gitLabEnterpriseEdition, awsCodeCommit |
| `sparse_checkout` | object | No | Configuration for sparse checkout. |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether or not the action execution was successful |
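
A hypothetical input linking a GitHub repository under `/Repos` (URL, folder, and repo name are placeholders), with a local check of the documented `/Repos/{folder}/{repo-name}` path shape:

```python
# Hypothetical input for DATABRICKS_WORKSPACE_REPOS_CREATE.
payload = {
    "url": "https://github.com/example-org/etl-pipelines.git",  # hypothetical
    "provider": "gitHub",
    "path": "/Repos/data-eng/etl-pipelines",  # /Repos/{folder}/{repo-name}
    "branch": "main",
}

# Paths under /Repos must have exactly the folder/repo-name structure.
assert payload["path"].startswith("/Repos/")
assert payload["path"].count("/") == 3
```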

### Delete Workspace Repo

**Slug:** `DATABRICKS_WORKSPACE_REPOS_DELETE`

Tool to delete a Git repository from Databricks workspace. Use when you need to permanently remove a repository. The repository cannot be recovered after deletion completes successfully.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `repo_id` | string | Yes | The unique numeric identifier of the repository to delete. This ID can be obtained by listing repos or getting repo details |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether or not the action execution was successful |

### Get Workspace Repo Permission Levels

**Slug:** `DATABRICKS_WORKSPACE_REPOS_GET_PERMISSION_LEVELS`

Tool to retrieve available permission levels for a Databricks workspace repository. Use when you need to understand what permission levels can be assigned to users, groups, or service principals for a specific Git repository. Returns permission levels like CAN_READ, CAN_RUN, CAN_EDIT, and CAN_MANAGE with their descriptions.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `repo_id` | string | Yes | The unique identifier of the repository for which to retrieve available permission levels |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether or not the action execution was successful |

### Set Workspace Repo Permissions

**Slug:** `DATABRICKS_WORKSPACE_REPOS_SET_PERMISSIONS`

Tool to set permissions for a workspace repository, replacing all existing permissions. Use when you need to configure access control for a workspace repo. This operation replaces ALL existing permissions; admin users cannot have their permissions lowered. Repos can inherit permissions from their root object.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `repo_id` | string | Yes | The identifier of the repository for which to set permissions |
| `access_control_list` | array | No | Array of access control entries. This replaces ALL existing permissions. For each entry, exactly one of user_name, group_name, or service_principal_name must be specified. When omitted or empty, all direct permissions are removed |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether or not the action execution was successful |
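
A hypothetical `access_control_list` (user, group, and levels are placeholders), with a local check of the "exactly one principal field per entry" rule; remember this call replaces all existing direct permissions:

```python
# Hypothetical input for DATABRICKS_WORKSPACE_REPOS_SET_PERMISSIONS.
acl = [
    {"user_name": "alice@example.com", "permission_level": "CAN_MANAGE"},
    {"group_name": "data-scientists", "permission_level": "CAN_EDIT"},
]

# Each entry must name exactly one of user_name / group_name /
# service_principal_name.
for entry in acl:
    principals = [k for k in ("user_name", "group_name",
                              "service_principal_name") if k in entry]
    assert len(principals) == 1
```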

### Update Workspace Repo

**Slug:** `DATABRICKS_WORKSPACE_REPOS_UPDATE`

Tool to update a workspace repo to a different branch or tag. Use when you need to switch branches, pull latest changes, or update sparse checkout settings. When updating to a tag, the repo enters a detached HEAD state and must be switched back to a branch before committing.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `id` | integer | Yes | The unique identifier of the Git folder (repo) object in the workspace |
| `tag` | string | No | Tag name that the local version of the repo should be checked out to. Updating the repo to a tag puts the repo in a detached HEAD state. Before committing new changes, you must update the repo to a branch. Exactly one of 'branch' or 'tag' must be specified |
| `branch` | string | No | Branch name that the local version of the repo should be checked out to. If this refers to the currently checked out branch, Databricks performs a pull operation to update to the latest commit. Exactly one of 'branch' or 'tag' must be specified |
| `sparse_checkout` | object | No | Configuration for sparse checkout. |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether or not the action execution was successful |
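
A hypothetical input switching a repo to a branch (id and branch name are placeholders), with a local check of the "exactly one of branch or tag" rule:

```python
# Hypothetical input for DATABRICKS_WORKSPACE_REPOS_UPDATE. Checking out
# a tag instead would leave the repo in a detached HEAD state.
payload = {"id": 5678, "branch": "release/2024.06"}  # hypothetical values
assert ("branch" in payload) != ("tag" in payload)
```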

### Update Workspace Repo Permissions

**Slug:** `DATABRICKS_WORKSPACE_REPOS_UPDATE_PERMISSIONS`

Tool to incrementally update permissions on a Databricks workspace repository. Use when you need to modify specific permissions for users, groups, or service principals without replacing the entire permission set. This PATCH operation only modifies the permissions specified while leaving other existing permissions intact. Repos can inherit permissions from their root object.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `repo_id` | string | Yes | The unique identifier of the repository to update permissions for |
| `access_control_list` | array | Yes | List of access control entries to update. This operation updates permissions incrementally without replacing existing permissions. At least one access control entry is required. For each entry, exactly one of user_name, group_name, or service_principal_name must be specified |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether or not the action execution was successful |

### Create Secret Scope

**Slug:** `DATABRICKS_WORKSPACE_SECRETS_CREATE_SCOPE`

Tool to create a new secret scope in Databricks workspace. Use when you need to establish a secure location to store credentials and sensitive information. Scope names must be unique, case-insensitive, and cannot exceed 128 characters. By default, the scope is Databricks-backed with MANAGE permission for the creator.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `scope` | string | Yes | The name of the secret scope to create. Must be unique within the workspace, consist of alphanumeric characters, dashes, underscores, @, and periods, and cannot exceed 128 characters. Case insensitive |
| `scope_backend_type` | string ("DATABRICKS" \| "AZURE_KEYVAULT") | No | The backend type for the scope. DATABRICKS (default) stores secrets in Databricks' own storage. AZURE_KEYVAULT is for Azure deployments. If not specified, defaults to DATABRICKS |
| `backend_azure_keyvault` | object | No | Azure Key Vault backend configuration for secret scope. |
| `initial_manage_principal` | string | No | The principal that is initially granted MANAGE permission to the created scope. The only supported value is 'users', which contains all users in the workspace. If not specified, the API request issuer's user identity receives MANAGE permission |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether or not the action execution was successful |
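
The documented scope-name rules can be checked locally before calling the tool; this validator is a sketch based on the character set and length limit stated above:

```python
import re

# Hypothetical pre-flight check for DATABRICKS_WORKSPACE_SECRETS_CREATE_SCOPE:
# alphanumerics, dashes, underscores, @, and periods; at most 128 characters.
def is_valid_scope_name(name: str) -> bool:
    return bool(re.fullmatch(r"[A-Za-z0-9@._-]{1,128}", name))

assert is_valid_scope_name("prod-db.credentials")
assert not is_valid_scope_name("bad scope!")  # space and ! are not allowed
```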

### Delete Secrets ACL

**Slug:** `DATABRICKS_WORKSPACE_SECRETS_DELETE_ACL`

Tool to delete an access control list from a Databricks secret scope. Use when you need to revoke permissions for a principal on a secret scope. Requires MANAGE permission on the scope. Fails if the ACL does not exist.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `scope` | string | Yes | The name of the secret scope from which to revoke access. This is the scope that contains the ACL to be removed |
| `principal` | string | Yes | The principal to remove an existing ACL from. Specify a user by email (e.g., someone@example.com), a service principal by applicationId, or a group by group name (e.g., data-scientists) |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether or not the action execution was successful |

### Delete Secret Scope

**Slug:** `DATABRICKS_WORKSPACE_SECRETS_DELETE_SCOPE`

Tool to delete a secret scope and all associated secrets and ACLs. Use when you need to permanently remove a secret scope. This operation cannot be undone. The API throws errors if the scope does not exist or the user lacks authorization.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `scope` | string | Yes | Name of the scope to delete. This operation permanently removes the scope along with all secrets stored within it and any associated access control lists (ACLs). The operation cannot be undone |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether or not the action execution was successful |

### Delete Workspace Secret

**Slug:** `DATABRICKS_WORKSPACE_SECRETS_DELETE_SECRET`

Tool to delete a secret from a Databricks secret scope. Use when you need to remove a secret stored in a scope. Requires WRITE or MANAGE permission on the scope. Not supported for Azure KeyVault-backed scopes.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `key` | string | Yes | Name of the secret to delete. This is the unique identifier for the secret within the specified scope |
| `scope` | string | Yes | The name of the scope that contains the secret to delete. This identifies the secret scope where the secret is stored |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether or not the action execution was successful |

### Get Secrets ACL

**Slug:** `DATABRICKS_WORKSPACE_SECRETS_GET_ACL`

Tool to retrieve ACL details for a principal on a Databricks secret scope. Use when you need to check the permission level granted to a specific user, service principal, or group. Requires MANAGE permission on the scope. Each permission level is hierarchical - WRITE includes READ, and MANAGE includes both WRITE and READ.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `scope` | string | Yes | The name of the secret scope to fetch ACL information from |
| `principal` | string | Yes | The principal to fetch ACL information for. Specify a user by email address (e.g., someone@example.com), a service principal by its applicationId value, or a group by its name (e.g., data-scientists) |
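The hierarchy described above (MANAGE includes WRITE and READ; WRITE includes READ) can be sketched as a small lookup helper. `implied_permissions` is a hypothetical name for illustration, not part of any Databricks SDK:

```python
# Hypothetical helper expanding a hierarchical ACL permission level into the
# full set of access it implies, per the documented hierarchy.
PERMISSION_HIERARCHY = {
    "READ": {"READ"},
    "WRITE": {"WRITE", "READ"},
    "MANAGE": {"MANAGE", "WRITE", "READ"},
}

def implied_permissions(permission: str) -> set:
    """Return the set of permissions implied by a given ACL level."""
    try:
        return PERMISSION_HIERARCHY[permission]
    except KeyError:
        raise ValueError("unknown permission level: %r" % permission) from None
```

This is useful when deciding whether an ACL returned by this tool already covers the access a caller needs.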

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether or not the action execution was successful |

### Get Secret Value

**Slug:** `DATABRICKS_WORKSPACE_SECRETS_GET_SECRET`

Tool to get a secret value from a Databricks secret scope. Use when you need to retrieve the actual value of a secret stored in a scope. Important: This API can only be called from the DBUtils interface (from within a cluster/notebook). There is no API to read the actual secret value outside of a cluster. Requires READ permission on the scope.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `key` | string | Yes | Name of the secret whose value to fetch. This is the key that identifies the secret within the specified scope |
| `scope` | string | Yes | The name of the secret scope that contains the secret to fetch |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether or not the action execution was successful |

### Put Secrets ACL

**Slug:** `DATABRICKS_WORKSPACE_SECRETS_PUT_ACL`

Tool to create or overwrite access control list for a principal on a Databricks secret scope. Use when you need to grant or modify permissions for a user, group, or service principal on a secret scope. Requires MANAGE permission on the scope. Overwrites existing permission level for the principal if one already exists.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `scope` | string | Yes | The name of the secret scope to apply permissions to. Must reference an existing secret scope |
| `principal` | string | Yes | The principal (user or group) to receive the permission. Specify a user by email address (e.g., someone@example.com), a service principal by its applicationId value, or a group by its group name (e.g., users, data-scientists). Must correspond to an existing Databricks principal |
| `permission` | string ("MANAGE" \| "WRITE" \| "READ") | Yes | The permission level to apply to the principal. MANAGE: allowed to change ACLs, read and write to this secret scope. WRITE: allowed to read and write to this secret scope. READ: allowed to read this secret scope and list available secrets |
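A minimal client-side sketch of assembling this tool's input, assuming the field names documented above; `build_put_acl_payload` is a hypothetical helper and the validation set comes from the documented permission levels:

```python
# Hypothetical payload builder for the put-ACL tool. Rejects permission
# levels outside the documented MANAGE/WRITE/READ set before any call is made.
VALID_PERMISSIONS = ("MANAGE", "WRITE", "READ")

def build_put_acl_payload(scope: str, principal: str, permission: str) -> dict:
    """Validate the permission level and assemble the tool's input payload."""
    if permission not in VALID_PERMISSIONS:
        raise ValueError(
            "permission must be one of %s, got %r" % (VALID_PERMISSIONS, permission)
        )
    return {"scope": scope, "principal": principal, "permission": permission}
```

Validating before the call surfaces typos locally instead of as an API error; note the tool overwrites any existing permission level for the principal.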

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether or not the action execution was successful |

### Put Secret in Scope

**Slug:** `DATABRICKS_WORKSPACE_SECRETS_PUT_SECRET`

Tool to insert or update a secret in a Databricks secret scope. Use when you need to store sensitive information like passwords, API keys, or credentials. Overwrites existing secrets with the same key. Requires WRITE or MANAGE permission on the scope. Maximum 1,000 secrets per scope with 128 KB limit per secret.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `key` | string | Yes | A unique name to identify the secret. Must consist of alphanumeric characters, dashes, underscores, and periods, and cannot exceed 128 characters |
| `scope` | string | Yes | The name of the scope with which the secret will be associated. Must consist of alphanumeric characters, dashes, underscores, and periods, and cannot exceed 128 characters |
| `bytes_value` | string | No | If specified, value will be stored as bytes. Maximum size 128 KB. Exactly one of string_value or bytes_value must be specified |
| `string_value` | string | No | If specified, the value will be stored in UTF-8 (MB4) form. Maximum size 128 KB. Exactly one of string_value or bytes_value must be specified |
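The naming, size, and exactly-one-of constraints above can be checked client-side before invoking the tool. The sketch below is a hypothetical validator (not part of any Databricks SDK); the name pattern and limits mirror the documented constraints:

```python
import re

# Hypothetical validation mirroring the documented constraints: names are
# alphanumeric plus dashes, underscores, and periods (max 128 characters),
# exactly one of string_value/bytes_value is set, and values are capped at 128 KB.
_NAME_RE = re.compile(r"^[A-Za-z0-9._-]{1,128}$")
_MAX_VALUE_BYTES = 128 * 1024  # 128 KB

def build_put_secret_payload(scope, key, string_value=None, bytes_value=None):
    """Validate inputs and assemble the put-secret payload."""
    for label, name in (("scope", scope), ("key", key)):
        if not _NAME_RE.match(name):
            raise ValueError("invalid %s: %r" % (label, name))
    if (string_value is None) == (bytes_value is None):
        raise ValueError("exactly one of string_value or bytes_value must be set")
    payload = {"scope": scope, "key": key}
    if string_value is not None:
        if len(string_value.encode("utf-8")) > _MAX_VALUE_BYTES:
            raise ValueError("string_value exceeds 128 KB")
        payload["string_value"] = string_value
    else:
        if len(bytes_value) > _MAX_VALUE_BYTES:
            raise ValueError("bytes_value exceeds 128 KB")
        payload["bytes_value"] = bytes_value
    return payload
```

Remember that the tool overwrites an existing secret with the same key, so a successful call is not proof the key was previously unused.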

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether or not the action execution was successful |

### Delete Workspace Object

**Slug:** `DATABRICKS_WORKSPACE_WORKSPACE_DELETE`

Tool to permanently delete a workspace object or directory. Use when you need to remove notebooks, files, or directories from the workspace. This is a hard delete operation that cannot be undone. Recursive deletion of non-empty directories is not atomic and may partially complete if it fails.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `path` | string | Yes | The absolute path of the notebook or directory to delete. Must be an absolute path starting with '/' (e.g., '/Users/user@example.com/project') |
| `recursive` | boolean | No | Whether to delete the object recursively. Must be set to true when deleting non-empty directories. Defaults to false. WARNING: Recursive directory deletion is not atomic - if it fails partway through, some objects may be deleted and cannot be recovered |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether or not the action execution was successful |

### Export Workspace Object

**Slug:** `DATABRICKS_WORKSPACE_WORKSPACE_EXPORT`

Tool to export a workspace object (notebook, dashboard, or file) as file content or base64-encoded string. Use when you need to retrieve the content of workspace objects for backup, migration, or analysis. By default, returns base64-encoded content with file type information. Set direct_download=true to get raw file content directly.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `path` | string | Yes | The absolute path of the workspace object to export. Must be a valid workspace path |
| `format` | string ("SOURCE" \| "HTML" \| "JUPYTER" \| "DBC" \| "R_MARKDOWN" \| "AUTO") | No | Specifies the exported file format (case-sensitive). SOURCE exports notebooks as source code, HTML exports as HTML files, JUPYTER exports as Jupyter/IPython Notebook format, DBC exports in Databricks archive format, R_MARKDOWN exports to R Markdown format, AUTO exports based on automatic object type detection. Defaults to SOURCE if not specified |
| `direct_download` | boolean | No | When true, response is the exported file content directly. When false (default), content is returned as base64-encoded string with file_type field |
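When `direct_download` is left false, the caller must decode the base64 content before writing it to disk. A minimal sketch, assuming a response shaped like the documented default (a `content` field with base64 text alongside `file_type`); the helper name and the sample response are illustrative:

```python
import base64

# Hypothetical post-processing of an export response when direct_download is
# false: the default returns base64-encoded content, so decode it before use.
def decode_export(response: dict) -> bytes:
    """Decode the base64-encoded `content` field of an export response."""
    return base64.b64decode(response["content"])

# Fabricated response body for illustration (shape assumed, not from the docs):
resp = {"content": base64.b64encode(b"print('hello')").decode("ascii"), "file_type": "py"}
source = decode_export(resp)
```

With `direct_download=true` this step is unnecessary, since the response body is the raw file content.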

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether or not the action execution was successful |

### Get Workspace Object Status

**Slug:** `DATABRICKS_WORKSPACE_WORKSPACE_GET_STATUS`

Tool to retrieve status and metadata for any workspace object including notebooks, directories, dashboards, and files. Use when you need to get object type, path, identifier, and additional metadata fields. Returns error with code RESOURCE_DOES_NOT_EXIST if the specified path does not exist.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `path` | string | Yes | The absolute path of the notebook, directory, dashboard, or file in the workspace. Must be an absolute path starting with '/' (e.g., '/Users/me@example.com/MyFolder' or '/Users/first.last@example.com/examples_folder/myseconddashboard.lvdash.json') |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether or not the action execution was successful |

### Import Workspace Object

**Slug:** `DATABRICKS_WORKSPACE_WORKSPACE_IMPORT`

Import a notebook or file into the Databricks workspace from base64-encoded content. Supports multiple formats: SOURCE (requires language parameter), JUPYTER, R_MARKDOWN, HTML, DBC, or AUTO for automatic detection. For SOURCE format with single files, you must specify the language (PYTHON, R, SCALA, or SQL). Maximum content size: 10 MB. Use overwrite=true to replace existing objects at the same path.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `path` | string | Yes | The absolute path of the object or directory where content will be imported in the workspace. Must be an absolute path prefixed with '/' |
| `format` | string ("SOURCE" \| "HTML" \| "JUPYTER" \| "DBC" \| "R_MARKDOWN" \| "AUTO") | No | Specifies the format of the file to be imported. Use AUTO to automatically detect the asset type (recommended for AI/BI dashboards with .lvdash.json extension). For directories: use DBC or SOURCE format with language field unset. For single files as SOURCE: must set the language field. Defaults to SOURCE if not specified |
| `content` | string | No | Base64-encoded content to be imported. Maximum size: 10 MB. Required for most import operations. If this limit is exceeded, the API throws MAX_NOTEBOOK_SIZE_EXCEEDED exception. For notebooks, the content should be base64-encoded source code |
| `language` | string ("PYTHON" \| "SQL" \| "SCALA" \| "R") | No | The language of the object. This value is set only if the object type is NOTEBOOK. Required when importing a single file as SOURCE format. Should be unset when importing directories |
| `overwrite` | boolean | No | Flag that specifies whether to overwrite existing object. Default: false. Note: For DBC format, overwrite is not supported since it may contain a directory. If path already exists and overwrite is false, the API returns RESOURCE_ALREADY_EXISTS error |
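The encoding and validation rules above can be sketched client-side. This hypothetical helper covers the single-file case only (directory imports as SOURCE legitimately omit `language`); the function name is illustrative, and the limits mirror the documented constraints:

```python
import base64

# Hypothetical helper preparing a single-file import payload: source bytes are
# base64-encoded, the 10 MB limit is checked locally, and a language is
# required because a single file is being imported as SOURCE.
_MAX_CONTENT_BYTES = 10 * 1024 * 1024  # 10 MB

def build_single_file_import_payload(path, source, fmt="SOURCE", language=None,
                                     overwrite=False):
    """Validate inputs and assemble a single-file import payload."""
    if not path.startswith("/"):
        raise ValueError("path must be absolute (prefixed with '/')")
    if fmt == "SOURCE" and language is None:
        raise ValueError("language is required when importing a single file as SOURCE")
    if len(source) > _MAX_CONTENT_BYTES:
        raise ValueError("content exceeds the 10 MB limit")
    payload = {
        "path": path,
        "format": fmt,
        "overwrite": overwrite,
        "content": base64.b64encode(source).decode("ascii"),
    }
    if language is not None:
        payload["language"] = language
    return payload
```

Checking the 10 MB limit locally avoids a round trip that would end in the MAX_NOTEBOOK_SIZE_EXCEEDED exception.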

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether or not the action execution was successful |

### Create Workspace Directory

**Slug:** `DATABRICKS_WORKSPACE_WORKSPACE_MKDIRS`

Tool to create a directory and necessary parent directories in the workspace. Use when you need to create new directories. The operation is idempotent - if the directory already exists, the command succeeds without action. Returns error RESOURCE_ALREADY_EXISTS if a file (not directory) exists at any prefix of the path.

#### Input Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `path` | string | Yes | The absolute path of the directory to create. Must be an absolute path prefixed with '/'. If parent directories do not exist, they will be created automatically |

#### Output

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `data` | string | Yes | Data from the action execution |
| `error` | string | No | Error if any occurred during the execution of the action |
| `successful` | boolean | Yes | Whether or not the action execution was successful |
