2 changes: 1 addition & 1 deletion .codegen/_openapi_sha
@@ -1 +1 @@
-8685a7d0216270e9c2b1f66e5917ee272899a315
+b5283a925ecf8929263b63f41b182246745d81a0
3 changes: 3 additions & 0 deletions NEXT_CHANGELOG.md
@@ -36,3 +36,6 @@
* Add `access_modes` and `storage_location` fields for `databricks.sdk.service.sharing.Table`.
* [Breaking] Remove `current_state`, `default`, `effective_default`, `effective_is_protected`, `effective_source_branch`, `effective_source_branch_lsn`, `effective_source_branch_time`, `is_protected`, `logical_size_bytes`, `pending_state`, `source_branch`, `source_branch_lsn`, `source_branch_time` and `state_change_time` fields for `databricks.sdk.service.postgres.Branch`.
* [Breaking] Remove `branch_logical_size_limit_bytes`, `compute_last_active_time`, `default_endpoint_settings`, `display_name`, `effective_default_endpoint_settings`, `effective_display_name`, `effective_history_retention_duration`, `effective_pg_version`, `effective_settings`, `history_retention_duration`, `pg_version`, `settings` and `synthetic_storage_size_bytes` fields for `databricks.sdk.service.postgres.Project`.
+* Add `command` and `env_vars` fields for `databricks.sdk.service.apps.AppDeployment`.
+* Add `full_name` and `securable_type` fields for `databricks.sdk.service.catalog.AccessRequestDestinations`.
+* [Breaking] Change `delete_kafka_config()` method for [w.feature_engineering](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/ml/feature_engineering.html) workspace-level service. Method path has changed.
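The two `Add` entries above land in generated files whose diffs are collapsed below, so nothing on this page shows the new fields in use. A rough sketch of how the new `AppDeployment` fields might look, assuming `command` is an argv-style list of strings and the new `EnvVar` dataclass (documented in apps.rst below) takes `name` and `value`; neither shape is confirmed by the visible diff:

    from databricks.sdk import WorkspaceClient
    from databricks.sdk.service import apps

    w = WorkspaceClient()

    # Assumed shapes: `command` as a list of strings, `env_vars` as a
    # list of the new EnvVar dataclass (name/value pairs).
    deployment = apps.AppDeployment(
        source_code_path="/Workspace/Users/someone@example.com/my-app",
        command=["python", "app.py"],
        env_vars=[apps.EnvVar(name="LOG_LEVEL", value="debug")],
    )
    d = w.apps.deploy(app_name="my-app", app_deployment=deployment)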
58 changes: 58 additions & 0 deletions databricks/sdk/service/apps.py

Some generated files are not rendered by default.

18 changes: 18 additions & 0 deletions databricks/sdk/service/catalog.py

Some generated files are not rendered by default.

17 changes: 11 additions & 6 deletions databricks/sdk/service/ml.py

Some generated files are not rendered by default.

4 changes: 2 additions & 2 deletions docs/account/iam/workspace_assignment.rst
@@ -43,9 +43,9 @@

a = AccountClient()

workspace_id = os.environ["DUMMY_WORKSPACE_ID"]
workspace_id = os.environ["TEST_WORKSPACE_ID"]

-all = a.workspace_assignment.list(workspace_id=workspace_id)
+all = a.workspace_assignment.list(list=workspace_id)

Get the permission assignments for the specified Databricks account and Databricks workspace.

4 changes: 4 additions & 0 deletions docs/dbdataclasses/apps.rst
@@ -423,6 +423,10 @@
:members:
:undoc-members:

+.. autoclass:: EnvVar
+:members:
+:undoc-members:

.. autoclass:: GetAppPermissionLevelsResponse
:members:
:undoc-members:
3 changes: 2 additions & 1 deletion docs/workspace/catalog/catalogs.rst
@@ -155,12 +155,13 @@
import time

from databricks.sdk import WorkspaceClient
+from databricks.sdk.service import catalog

w = WorkspaceClient()

created = w.catalogs.create(name=f"sdk-{time.time_ns()}")

-_ = w.catalogs.update(name=created.name, comment="updated")
+_ = w.catalogs.update(name=created.name, isolation_mode=catalog.CatalogIsolationMode.ISOLATED)

# cleanup
w.catalogs.delete(name=created.name, force=True)
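A note on the updated example above: `isolation_mode` controls whether a catalog is reachable from every workspace attached to the metastore or only from workspaces explicitly bound to it. A minimal sketch that would slot in before the cleanup, using the enum's other value to revert the setting:

    # ISOLATED restricts access to bound workspaces; OPEN (the other
    # CatalogIsolationMode value) restores metastore-wide access.
    _ = w.catalogs.update(name=created.name, isolation_mode=catalog.CatalogIsolationMode.OPEN)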
10 changes: 5 additions & 5 deletions docs/workspace/catalog/external_locations.rst
@@ -107,20 +107,20 @@

credential = w.storage_credentials.create(
name=f"sdk-{time.time_ns()}",
-aws_iam_role=catalog.AwsIamRoleRequest(role_arn=os.environ["TEST_METASTORE_DATA_ACCESS_ARN"]),
+aws_iam_role=catalog.AwsIamRole(role_arn=os.environ["TEST_METASTORE_DATA_ACCESS_ARN"]),
)

created = w.external_locations.create(
name=f"sdk-{time.time_ns()}",
credential_name=credential.name,
url="s3://%s/%s" % (os.environ["TEST_BUCKET"], f"sdk-{time.time_ns()}"),
url=f's3://{os.environ["TEST_BUCKET"]}/sdk-{time.time_ns()}',
)

-_ = w.external_locations.get(name=created.name)
+_ = w.external_locations.get(get=created.name)

# cleanup
-w.storage_credentials.delete(name=credential.name)
-w.external_locations.delete(name=created.name)
+w.storage_credentials.delete(delete=credential.name)
+w.external_locations.delete(delete=created.name)

Gets an external location from the metastore. The caller must be either a metastore admin, the owner
of the external location, or a user that has some privilege on the external location.
11 changes: 5 additions & 6 deletions docs/workspace/catalog/storage_credentials.rst
@@ -30,14 +30,13 @@

w = WorkspaceClient()

-storage_credential = w.storage_credentials.create(
+created = w.storage_credentials.create(
name=f"sdk-{time.time_ns()}",
aws_iam_role=catalog.AwsIamRoleRequest(role_arn=os.environ["TEST_METASTORE_DATA_ACCESS_ARN"]),
comment="created via SDK",
)

# cleanup
-w.storage_credentials.delete(name=storage_credential.name)
+w.storage_credentials.delete(name=created.name)

Creates a new storage credential.

@@ -99,13 +98,13 @@

created = w.storage_credentials.create(
name=f"sdk-{time.time_ns()}",
-aws_iam_role=catalog.AwsIamRole(role_arn=os.environ["TEST_METASTORE_DATA_ACCESS_ARN"]),
+aws_iam_role=catalog.AwsIamRoleRequest(role_arn=os.environ["TEST_METASTORE_DATA_ACCESS_ARN"]),
)

-by_name = w.storage_credentials.get(get=created.name)
+by_name = w.storage_credentials.get(name=created.name)

# cleanup
-w.storage_credentials.delete(delete=created.name)
+w.storage_credentials.delete(name=created.name)

Gets a storage credential from the metastore. The caller must be a metastore admin, the owner of the
storage credential, or have some permission on the storage credential.
2 changes: 1 addition & 1 deletion docs/workspace/catalog/tables.rst
@@ -156,7 +156,7 @@

created_schema = w.schemas.create(name=f"sdk-{time.time_ns()}", catalog_name=created_catalog.name)

-all_tables = w.tables.list(catalog_name=created_catalog.name, schema_name=created_schema.name)
+summaries = w.tables.list_summaries(catalog_name=created_catalog.name, schema_name_pattern=created_schema.name)

# cleanup
w.schemas.delete(full_name=created_schema.full_name)
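The switch from `tables.list` to `tables.list_summaries` above also changes the schema argument from an exact name to a pattern. A short sketch of consuming the result before the cleanup, assuming `schema_name_pattern` accepts SQL LIKE-style patterns (so an exact name matches itself) and that each summary exposes `full_name` and `table_type`:

    # Each summary is a lightweight record; full table metadata still
    # requires a separate w.tables.get on the full name.
    for summary in summaries:
        print(summary.full_name, summary.table_type)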
3 changes: 2 additions & 1 deletion docs/workspace/compute/clusters.rst
@@ -645,10 +645,11 @@
.. code-block::

from databricks.sdk import WorkspaceClient
+from databricks.sdk.service import compute

w = WorkspaceClient()

-nodes = w.clusters.list_node_types()
+all = w.clusters.list(compute.ListClustersRequest())

Return information about all pinned and active clusters, and all clusters terminated within the last
30 days. Clusters terminated prior to this period are not included.
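As a consumption sketch for the updated example: `clusters.list` returns a paginated iterator rather than a plain list, so it can be looped over directly; `cluster_name` and `state` are assumed field names on the returned items:

    # Iterate lazily over the listing; each item describes one cluster.
    for c in w.clusters.list(compute.ListClustersRequest()):
        print(c.cluster_name, c.state)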
2 changes: 1 addition & 1 deletion docs/workspace/iam/current_user.rst
@@ -17,7 +17,7 @@

w = WorkspaceClient()

-me = w.current_user.me()
+me2 = w.current_user.me()

Get details about the current method caller's identity.
