Description
The databricks-sdk does not map the collaborator_alias field of the CleanRoomCollaborator class to the catalog_alias field required by the Databricks Clean Room API, so the serialized request body carries the wrong key and the API returns an error when attempting to create a clean room. This prevents the SDK from being used to create clean rooms without manual intervention or subclassing.
Reproduction
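A minimal reproduction sketch (the import paths and WorkspaceClient setup are my assumptions for databricks-sdk 0.40.0; authentication is assumed to come from the environment):

from databricks.sdk import WorkspaceClient
from databricks.sdk.service.cleanrooms import (
    CleanRoom,
    CleanRoomCollaborator,
    CleanRoomRemoteDetail,
)

client = WorkspaceClient()  # assumes credentials in the environment

collaborator = CleanRoomCollaborator(
    global_metastore_id="aws:us-west-2:creator-metastore-id",
    collaborator_alias="creator_catalog",
)

# The serialized body carries 'collaborator_alias' rather than the
# 'catalog_alias' key the Clean Room API requires:
print(collaborator.as_dict())

clean_room = CleanRoom(
    name="test_clean_room",
    remote_detailed_info=CleanRoomRemoteDetail(
        cloud_vendor="aws",
        region="us-west-2",
        collaborators=[collaborator],
    ),
)
response = client.clean_rooms.create(clean_room=clean_room)  # fails with an API error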
Expected behavior
The CleanRoomCollaborator class should map collaborator_alias to the API-required catalog_alias field during serialization.
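For illustration, a correctly serialized collaborator would look like this (key name per the Clean Room REST API, values from the example above):

collaborator.as_dict()
# -> {'global_metastore_id': 'aws:us-west-2:creator-metastore-id',
#     'catalog_alias': 'creator_catalog'}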
Is it a regression?
Did this work in a previous version of the SDK? If so, which versions did you try?
No. Clean room support is a new feature of the SDK, so there is no previous version in which this worked.
Debug Logs
The SDK logs helpful debugging information when debug logging is enabled. Set the log level to debug by adding logging.basicConfig(level=logging.DEBUG) to your program, and include the logs here.
I don't believe debug logs are necessary for this issue, but I can add more information if needed.
Other Information
SDK Version: 0.40.0
Additional context
Workaround:
Subclass the CleanRoomCollaborator class to correctly map the collaborator_alias field to catalog_alias. The workaround implementation looks like this:
# Imports shown for completeness; paths assume databricks-sdk 0.40.x
from databricks.sdk import WorkspaceClient
from databricks.sdk.service.cleanrooms import (
    CleanRoom,
    CleanRoomCollaborator,
    CleanRoomRemoteDetail,
)

class PatchedCleanRoomCollaborator(CleanRoomCollaborator):
    def as_dict(self):
        """Override to map 'collaborator_alias' to 'catalog_alias'."""
        body = super().as_dict()
        if 'collaborator_alias' in body:
            body['catalog_alias'] = body['collaborator_alias']
        return body

# Updated usage
client = WorkspaceClient()  # authenticated workspace client

creator = PatchedCleanRoomCollaborator(
    global_metastore_id="aws:us-west-2:creator-metastore-id",
    collaborator_alias="creator_catalog"
)
collaborators = [
    creator,
    PatchedCleanRoomCollaborator(
        global_metastore_id="aws:us-east-2:collaborator-metastore-id",
        collaborator_alias="collaborator_catalog"
    )
]
remote_detail = CleanRoomRemoteDetail(
    cloud_vendor="aws",
    region="us-west-2",
    collaborators=collaborators
)
clean_room = CleanRoom(
    owner="[email protected]",
    name="test_clean_room",
    remote_detailed_info=remote_detail
)
response = client.clean_rooms.create(clean_room=clean_room)
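Note that the override leaves the original collaborator_alias key in the body alongside catalog_alias; in my testing the API accepts the request with the extra key present.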
I'm currently using this workaround and have confirmed that it successfully creates a clean room through the SDK.
Proposed Fix:
Update the SDK to automatically map the collaborator_alias field to catalog_alias during serialization in the CleanRoomCollaborator class.
Ensure SDK tests verify compatibility with the latest API requirements for clean room collaborators; a rough test sketch follows.
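A sketch of the kind of test that could cover this (hypothetical test name; it assumes the fixed as_dict() emits catalog_alias):

from databricks.sdk.service.cleanrooms import CleanRoomCollaborator

def test_collaborator_serializes_catalog_alias():
    collaborator = CleanRoomCollaborator(
        global_metastore_id="aws:us-west-2:creator-metastore-id",
        collaborator_alias="creator_catalog",
    )
    body = collaborator.as_dict()
    # The Clean Room API requires 'catalog_alias' in the request body.
    assert body.get("catalog_alias") == "creator_catalog"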