Accessing storage with shared integration Databricks? #207
What is the recommended way to access the raw, enriched/curated, and workspace data lakes from the shared integration Databricks workspace? Should we use a service principal and mount points?
Replies: 1 comment 3 replies
Hi @baatch,
The recommended approach is to use Credential Passthrough (OAuth 2.0) for interactive access. For automated workflows, you should use a Service Principal and store its secrets (clientId, clientSecret) in a Key Vault-backed secret scope.
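As a minimal sketch of the service-principal path, the Spark configuration below wires up OAuth 2.0 (client-credentials flow) for an ADLS Gen2 account. The storage account name `examplelake`, the secret scope `integration-kv`, and the secret key names are placeholders, not values from this repo; substitute your own.

```python
def adls_oauth_conf(storage_account: str, tenant_id: str,
                    client_id: str, client_secret: str) -> dict:
    """Build the Spark conf entries for accessing an ADLS Gen2 account
    with a service principal via the OAuth 2.0 client-credentials flow."""
    suffix = f"{storage_account}.dfs.core.windows.net"
    return {
        f"fs.azure.account.auth.type.{suffix}": "OAuth",
        f"fs.azure.account.oauth.provider.type.{suffix}":
            "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
        f"fs.azure.account.oauth2.client.id.{suffix}": client_id,
        f"fs.azure.account.oauth2.client.secret.{suffix}": client_secret,
        f"fs.azure.account.oauth2.client.endpoint.{suffix}":
            f"https://login.microsoftonline.com/{tenant_id}/oauth2/token",
    }

# In a Databricks notebook you would pull the secrets from the
# Key Vault-backed scope and apply the conf (placeholder names):
#   client_id = dbutils.secrets.get(scope="integration-kv", key="sp-client-id")
#   client_secret = dbutils.secrets.get(scope="integration-kv", key="sp-client-secret")
#   for k, v in adls_oauth_conf("examplelake", tenant_id,
#                               client_id, client_secret).items():
#       spark.conf.set(k, v)
#   df = spark.read.load("abfss://raw@examplelake.dfs.core.windows.net/<path>")
```

With this in place, data can be read directly via `abfss://` URIs without creating mount points, which keeps the credentials scoped to the Spark session rather than shared workspace-wide.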
The respective integration and product teams can use their own Key Vault within their resource group to store the secrets.
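A team's Key Vault can be linked to the workspace as a Key Vault-backed secret scope, for example with the legacy Databricks CLI. The scope name, resource ID, and DNS name below are placeholders for illustration:

```shell
# Create a Key Vault-backed secret scope (placeholder names/IDs).
databricks secrets create-scope \
  --scope integration-kv \
  --scope-backend-type AZURE_KEYVAULT \
  --resource-id "/subscriptions/<sub-id>/resourceGroups/<team-rg>/providers/Microsoft.KeyVault/vaults/<vault-name>" \
  --dns-name "https://<vault-name>.vault.azure.net/"
```

Secrets added to the Key Vault then become readable in notebooks via `dbutils.secrets.get` without ever appearing in code or cluster configuration.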