It looks like you're encountering a 403 Forbidden error while trying to mount your Data Lake Storage in Databricks. This usually indicates a permissions or network access issue.
Here are a few troubleshooting steps that might help you:
Permissions
Make sure the identity Databricks is using (such as a Service Principal or managed identity) has the appropriate RBAC role assigned to the storage account or container, like:
- Storage Blob Data Reader (for read access)
- Storage Blob Data Contributor (for read/write access)
You can assign these roles through the Azure Portal > Storage Account > Access Control (IAM).
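Note that role assignments can take a few minutes to propagate. One quick way to confirm the role is effective is to try a direct read with session-level credentials before mounting. Below is a minimal sketch using Databricks' per-account Spark configuration keys; every <...> value is a placeholder for your own account, container, tenant, and secret names:

```python
# Configure OAuth for this Spark session only (no mount), then try to list the container.
# All <...> values are placeholders.
account = "<storage-account>"
spark.conf.set(f"fs.azure.account.auth.type.{account}.dfs.core.windows.net", "OAuth")
spark.conf.set(
    f"fs.azure.account.oauth.provider.type.{account}.dfs.core.windows.net",
    "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
)
spark.conf.set(
    f"fs.azure.account.oauth2.client.id.{account}.dfs.core.windows.net",
    "<application-client-id>",
)
spark.conf.set(
    f"fs.azure.account.oauth2.client.secret.{account}.dfs.core.windows.net",
    dbutils.secrets.get(scope="<scope-name>", key="<secret-key>"),
)
spark.conf.set(
    f"fs.azure.account.oauth2.client.endpoint.{account}.dfs.core.windows.net",
    "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
)

# A 403 here points at a missing or not-yet-propagated RBAC role.
display(dbutils.fs.ls(f"abfss://<container>@{account}.dfs.core.windows.net/"))
```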
Authentication Configuration
If you're using OAuth with a Service Principal, double-check:
- Client ID, tenant ID, and secret are correct
- The secret hasn’t expired
- The secret is stored correctly in Databricks Secret Scope
- You’re using abfss:// in your source path for ADLS Gen2
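If all of the above checks out, the mount call itself should look roughly like the following sketch. The scope, key, tenant, container, and account names are placeholders you would replace with your own values:

```python
# Minimal ADLS Gen2 OAuth mount sketch -- all <...> values are placeholders.
configs = {
    "fs.azure.account.auth.type": "OAuth",
    "fs.azure.account.oauth.provider.type":
        "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    "fs.azure.account.oauth2.client.id": "<application-client-id>",
    # Pull the secret from a Databricks Secret Scope rather than hard-coding it.
    "fs.azure.account.oauth2.client.secret":
        dbutils.secrets.get(scope="<scope-name>", key="<secret-key>"),
    "fs.azure.account.oauth2.client.endpoint":
        "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
}

dbutils.fs.mount(
    source="abfss://<container>@<storage-account>.dfs.core.windows.net/",
    mount_point="/mnt/<mount-name>",
    extra_configs=configs,
)
```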
Network Restrictions
If your storage account has firewall rules or private endpoint configurations:
- Ensure the Databricks workspace’s outbound IP addresses are allowed
- If using VNet injection, verify private DNS and route settings
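To tell a network block apart from an authentication failure, you can probe the storage endpoint directly from a notebook cell. This is just a hypothetical diagnostic sketch; the account name is a placeholder:

```python
import requests

# Any HTTP response (even 400/403) proves the cluster can reach the endpoint;
# a timeout or connection error suggests a firewall, private endpoint, or DNS issue.
url = "https://<storage-account>.dfs.core.windows.net/"
try:
    r = requests.get(url, timeout=10)
    print("Reachable, status:", r.status_code)
except requests.exceptions.RequestException as e:
    print("Network-level failure:", e)
```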
The following knowledge base articles may also offer useful insights:
- https://kb.databricks.com/cloud/adls-gen1-mount-problem
- https://kb.databricks.com/security/cross-cloud-delta-sharing-query-results-in-403-response
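As the first article above describes, a mount created with a credential that has since expired will keep returning 403 until it is recreated. A minimal cleanup sketch, assuming a hypothetical mount point /mnt/datalake:

```python
# List current mounts to spot the stale entry and its source URI.
for m in dbutils.fs.mounts():
    print(m.mountPoint, "->", m.source)

# Drop the stale mount, then recreate it with the refreshed secret
# using the mount sketch above. "/mnt/datalake" is a hypothetical name.
dbutils.fs.unmount("/mnt/datalake")
```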
I hope this information helps. Please do let us know if you have any further queries.
Kindly consider upvoting the comment if the information provided is helpful. This can assist other community members in resolving similar issues.
Thank you.