# bundle run fails for non unity-catalog workspace #1000
@Eulenator what type of resources are you deploying and trying to run?
@andrewnester I have a spark-python project with the necessary entry points for my data pipelines. I build a wheel file and use a deployment.yaml (under /resources) like this to push it to Databricks with asset bundles:
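The original snippet was not captured in the thread; a minimal sketch of such a resource file could look like this, where the job name, cluster ID, package name, entry point, and parameters are all hypothetical:

```yaml
# resources/deployment.yaml -- illustrative sketch only; all names are hypothetical
resources:
  jobs:
    my_pipeline_job:
      name: my_pipeline_job
      tasks:
        - task_key: main
          existing_cluster_id: 0000-000000-abcdefgh  # hypothetical non-UC cluster
          python_wheel_task:
            package_name: my_pipeline   # hypothetical package name
            entry_point: main           # hypothetical entry point
            parameters: ["--env", "dev"]  # static default params
          libraries:
            - whl: ../dist/*.whl
```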
My databricks.yaml for the bundle then looks like this:
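This snippet was likewise not preserved; a matching top-level bundle config might look roughly like this, again with hypothetical names:

```yaml
# databricks.yaml -- illustrative sketch only
bundle:
  name: my_pipeline_bundle  # hypothetical bundle name

include:
  - resources/*.yaml

targets:
  dev:
    workspace:
      host: https://my-workspace.cloud.databricks.com  # hypothetical host
```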
I already experimented with root_path or artifact_path to change the path to a DBFS one, so the cluster can access the files without Unity Catalog, but no luck so far.
@Eulenator thanks! Since you're using Python wheel tasks, you can enable the following:
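The referenced snippet was not preserved in the scrape, but based on the setting named in the linked PR (`python_wheel_wrapper`), it should be the experimental flag in the bundle config:

```yaml
# in databricks.yaml
experimental:
  python_wheel_wrapper: true
```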
It will allow installing wheel tasks on non-UC clusters.
Thanks @andrewnester, this fixes one problem. The other problem is that we want to trigger the pipeline via the Jobs API with dynamic `--python-params`, like this:
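The original command was not captured; a hypothetical invocation of this kind, using a made-up job key and parameters, would be:

```sh
databricks bundle run my_pipeline_job --python-params "--start-date,2024-01-01"
```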
But it looks like the wrapping notebook that installs the libraries overwrites the `sys.argv` arguments with the ones defined in the deployment.yaml:
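Conceptually, the generated wrapper behaved something like this before the fix (a simplified sketch, not the actual generated code; the package name and params are hypothetical):

```python
# Simplified sketch of the pre-fix wrapper notebook behaviour (not the real generated code).
import sys

# Static parameters baked in from the job definition (deployment.yaml) -- hypothetical values:
STATIC_PARAMS = ["--env", "dev"]

# The wrapper replaces sys.argv before calling the wheel's entry point,
# so any --python-params passed at run time are lost.
sys.argv = ["my_pipeline"] + STATIC_PARAMS

from my_pipeline import main  # hypothetical entry point
main()
```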
Is there a chance to run the job with the CLI client and still configure dynamic params?
Sorry for the delay in the reply. At the moment it's not possible to do so, but when this PR lands (#1037) you could use dynamic `--python-params` even with the wheel wrapper enabled.
…heel_wrapper` is true (#1037)

## Changes

It makes the behaviour consistent with or without `python_wheel_wrapper` when a job is run with the `--python-params` flag. In `python_wheel_wrapper` mode it converts the dynamic `python_params` into a specially named dynamic `notebook_param`, which the wrapper reads with `dbutils` and passes to `sys.argv`.

Fixes #1000

## Tests

Added an integration test. Integration tests pass.
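After the fix, the wrapper effectively performs the reverse translation. Roughly, and again only as a sketch: the widget name below is made up (the real one is internal to the generated wrapper), and this would run inside a Databricks notebook where `dbutils` is available as a global:

```python
# Simplified sketch of the post-fix wrapper behaviour (not the real generated code).
import json
import sys

# The CLI forwards --python-params as a specially named notebook parameter;
# "__python_params" is a hypothetical widget name for illustration.
raw = dbutils.widgets.get("__python_params")
dynamic_params = json.loads(raw) if raw else []

# Dynamic params take precedence over the static ones from the job definition.
sys.argv = ["my_pipeline"] + (dynamic_params or ["--env", "dev"])

from my_pipeline import main  # hypothetical entry point
main()
```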
The fix has been merged and will be included in the upcoming release next week. In the meantime you can try the snapshot version, which already contains the fix: https://github.com/databricks/cli/releases/tag/snapshot
I have deployed my pipelines with `databricks bundle deploy`; this worked without problems.
Now when I want to trigger a pipeline with `databricks bundle run`, it fails with the following error message:
Library installation failed for library due to user error. Error messages: Library from /Workspace is not allowed on non Unity Catalog cluster. Please switch to DBR 13.1+ Shared cluster or 13.2+ Assigned cluster to use /Workspace libraries.
My problem is that we do not plan to adopt Unity Catalog soon.
So is there an option I have missed to use Databricks Asset Bundles with non-Unity-Catalog workspaces?