
[Internal] Extract "before retry" handler, use it to rewind the stream #878

Merged: 1 commit merged into databricks:main on Jan 29, 2025

Conversation

ksafonov-db
Contributor

@ksafonov-db ksafonov-db commented Jan 29, 2025

What changes are proposed in this pull request?

  • Introduce a separate handler that is called before an API call is retried. This ensures the handler runs both (1) when we receive an error response we want to retry on and (2) when a low-level connection exception is thrown.
  • Rewind the stream to its initial position in this handler (if applicable); see the sketch after this list.
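
The idea can be illustrated with a minimal sketch. This is not the SDK's actual implementation; the names `make_rewind_handler` and `call_with_retries` are hypothetical and only show how a single "before retry" handler can cover both retry paths while rewinding a seekable request body:

```python
import io
from typing import Callable, Optional

def make_rewind_handler(body: Optional[io.IOBase]) -> Callable[[], None]:
    """Remember the stream's starting offset and return a handler that
    seeks back to it before each retry (a no-op for non-seekable bodies)."""
    start = body.tell() if body is not None and body.seekable() else None

    def before_retry() -> None:
        if start is not None:
            body.seek(start)

    return before_retry

def call_with_retries(send: Callable[[], object],
                      is_retryable: Callable[[object], bool],
                      before_retry: Callable[[], None],
                      attempts: int = 3):
    """Invoke `before_retry` on both retry paths: (1) a retryable error
    response and (2) a low-level connection exception."""
    for attempt in range(attempts):
        try:
            response = send()
        except ConnectionError:                 # (2) transport-level failure
            if attempt == attempts - 1:
                raise
            before_retry()
            continue
        if is_retryable(response) and attempt < attempts - 1:
            before_retry()                      # (1) retryable error response
            continue
        return response
```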

How is this tested?

Existing tests.



If integration tests don't run automatically, an authorized user can run them manually by following the instructions below:

Trigger:
go/deco-tests-run/sdk-py

Inputs:

  • PR number: 878
  • Commit SHA: 8c6d9a6f6066fc0fec517e1f52bde30a969e299f

Checks will be approved automatically on success.

@ksafonov-db ksafonov-db changed the title [Internal] Factor out stream rewind before retry [Internal] Extract "before retry" handler, is it to rewind the stream Jan 29, 2025
@ksafonov-db ksafonov-db changed the title [Internal] Extract "before retry" handler, is it to rewind the stream [Internal] Extract "before retry" handler, use it to rewind the stream Jan 29, 2025
@renaudhartert-db renaudhartert-db self-requested a review January 29, 2025 14:14
@renaudhartert-db renaudhartert-db added this pull request to the merge queue Jan 29, 2025
Merged via the queue into databricks:main with commit 762c57b Jan 29, 2025
22 checks passed
renaudhartert-db added a commit that referenced this pull request Jan 30, 2025
### Bug Fixes

 * Fix docs generation when two services have the same name ([#872](#872)).

### Internal Changes

 * Add CICD environment to the User Agent ([#866](#866)).
 * Add unit tests for retriable requests ([#879](#879)).
 * Extract "before retry" handler, use it to rewind the stream ([#878](#878)).
 * Update Model Serving `http_request` mixin to correctly use the underlying API ([#876](#876)).

### API Changes:

 * Added [a.budget_policy](https://databricks-sdk-py.readthedocs.io/en/latest/account/budget_policy.html) account-level service.
 * Added [a.enable_ip_access_lists](https://databricks-sdk-py.readthedocs.io/en/latest/account/account_settings/enable_ip_access_lists.html) account-level service.
 * Added `review_state`, `reviews` and `runner_collaborators` fields for `databricks.sdk.service.cleanrooms.CleanRoomAssetNotebook`.
 * Added `statement_id` field for `databricks.sdk.service.dashboards.QueryAttachment`.
 * Added `effective_performance_target` field for `databricks.sdk.service.jobs.BaseRun`.
 * Added `performance_target` field for `databricks.sdk.service.jobs.CreateJob`.
 * Added `performance_target` field for `databricks.sdk.service.jobs.JobSettings`.
 * Added `effective_performance_target` field for `databricks.sdk.service.jobs.Run`.
 * Added `performance_target` field for `databricks.sdk.service.jobs.RunNow`.
 * Added `effective_performance_target` field for `databricks.sdk.service.jobs.RunTask`.
 * Added `run_as_repl` field for `databricks.sdk.service.jobs.SparkJarTask`.
 * Added `user_authorized_scopes` field for `databricks.sdk.service.oauth2.CreateCustomAppIntegration`.
 * Added `user_authorized_scopes` field for `databricks.sdk.service.oauth2.GetCustomAppIntegrationOutput`.
 * Added `user_authorized_scopes` field for `databricks.sdk.service.oauth2.UpdateCustomAppIntegration`.
 * Added `contents` field for `databricks.sdk.service.serving.HttpRequestResponse`.
 * Added `clean_room` enum value for `databricks.sdk.service.catalog.SecurableType`.
 * Added `budget_policy_limit_exceeded` enum value for `databricks.sdk.service.jobs.TerminationCodeCode`.
 * Added `arclight_azure_exchange_token_with_user_delegation_key` enum value for `databricks.sdk.service.settings.TokenType`.
 * Changed `create()` method for [w.serving_endpoints](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/serving_endpoints.html) workspace-level service with new required argument order.
 * Changed `http_request()` method for [w.serving_endpoints](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/serving_endpoints.html) workspace-level service to type `http_request()` method for [w.serving_endpoints](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/serving_endpoints.html) workspace-level service.
 * Changed `http_request()` method for [w.serving_endpoints](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/serving_endpoints.html) workspace-level service to return `databricks.sdk.service.serving.HttpRequestResponse` dataclass.
 * Changed `config` field for `databricks.sdk.service.serving.CreateServingEndpoint` to no longer be required.
 * Removed `securable_kind` field for `databricks.sdk.service.catalog.CatalogInfo`.
 * Removed `securable_kind` field for `databricks.sdk.service.catalog.ConnectionInfo`.
 * Removed `status_code` and `text` fields for `databricks.sdk.service.serving.ExternalFunctionResponse`.

OpenAPI SHA: 840c660106f820a1a5dff931d51fa5f65cd9fdd9, Date: 2025-01-28
github-merge-queue bot pushed a commit that referenced this pull request Jan 30, 2025
### Bug Fixes

* Fix docs generation when two services have the same name
([#872](#872)).


### Internal Changes

* Add CICD environment to the User Agent
([#866](#866)).
* Add unit tests for retriable requests
([#879](#879)).
* Extract "before retry" handler, use it to rewind the stream
([#878](#878)).
* Update Model Serving `http_request` mixin to correctly use the
underlying API.
([#876](#876)).

### Backward Incompatible Changes

* Changed `create()` method for
[w.serving_endpoints](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/serving/serving_endpoints.html)
workspace-level service with new required argument order.
* Changed `http_request()` method for
[w.serving_endpoints](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/serving/serving_endpoints.html)
workspace-level service to type `http_request()` method for
[w.serving_endpoints](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/serving/serving_endpoints.html)
workspace-level service.
* Changed `http_request()` method for
[w.serving_endpoints](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/serving/serving_endpoints.html)
workspace-level service to return
`databricks.sdk.service.serving.HttpRequestResponse` dataclass.
* Changed `config` field for
`databricks.sdk.service.serving.CreateServingEndpoint` to no longer be
required.
* Removed `securable_kind` field for
`databricks.sdk.service.catalog.CatalogInfo`.
* Removed `securable_kind` field for
`databricks.sdk.service.catalog.ConnectionInfo`.
* Removed `status_code` and `text` fields for
`databricks.sdk.service.serving.ExternalFunctionResponse`.

### API Changes:

* Added
[a.budget_policy](https://databricks-sdk-py.readthedocs.io/en/latest/account/billing/budget_policy.html)
account-level service.
* Added
[a.enable_ip_access_lists](https://databricks-sdk-py.readthedocs.io/en/latest/account/settings/settings/enable_ip_access_lists.html)
account-level service.
* Added `review_state`, `reviews` and `runner_collaborators` fields for
`databricks.sdk.service.cleanrooms.CleanRoomAssetNotebook`.
* Added `statement_id` field for
`databricks.sdk.service.dashboards.QueryAttachment`.
* Added `effective_performance_target` field for
`databricks.sdk.service.jobs.BaseRun`.
* Added `performance_target` field for
`databricks.sdk.service.jobs.CreateJob`.
* Added `performance_target` field for
`databricks.sdk.service.jobs.JobSettings`.
* Added `effective_performance_target` field for
`databricks.sdk.service.jobs.Run`.
* Added `performance_target` field for
`databricks.sdk.service.jobs.RunNow`.
* Added `effective_performance_target` field for
`databricks.sdk.service.jobs.RunTask`.
* Added `run_as_repl` field for
`databricks.sdk.service.jobs.SparkJarTask`.
* Added `user_authorized_scopes` field for
`databricks.sdk.service.oauth2.CreateCustomAppIntegration`.
* Added `user_authorized_scopes` field for
`databricks.sdk.service.oauth2.GetCustomAppIntegrationOutput`.
* Added `user_authorized_scopes` field for
`databricks.sdk.service.oauth2.UpdateCustomAppIntegration`.
* Added `contents` field for
`databricks.sdk.service.serving.HttpRequestResponse`.
* Added `clean_room` enum value for
`databricks.sdk.service.catalog.SecurableType`.
* Added `budget_policy_limit_exceeded` enum value for
`databricks.sdk.service.jobs.TerminationCodeCode`.
* Added `arclight_azure_exchange_token_with_user_delegation_key` enum
value for `databricks.sdk.service.settings.TokenType`.

OpenAPI SHA: 840c660106f820a1a5dff931d51fa5f65cd9fdd9, Date: 2025-01-28

---------

Signed-off-by: Renaud Hartert <[email protected]>