
Custom Jars - Databricks Docker Cluster #120

Open
aman-solanki-kr opened this issue May 4, 2023 · 3 comments

aman-solanki-kr commented May 4, 2023
BASE IMAGE - databricksruntime/python:10.4-LTS

I successfully installed the Python dependencies, and the workflow tasks that depend on Python run fine, but I'm struggling to install the Maven and jar dependencies.

The jar files are baked into the Docker image (under /databricks/jars) and are visible on the Spark classpath in the cluster's Spark environment when it starts, but when I trigger the workflow I get a "TypeError: 'JavaPackage' object is not callable" error because the script cannot use the classes in the jar files.
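For context, the setup described above presumably looks roughly like this in the Dockerfile (a sketch only; the jar name and local path are illustrative, not from the original report):

```dockerfile
FROM databricksruntime/python:10.4-LTS

# Copy custom jars into the image. These appear on the Spark classpath
# shown in the cluster's Spark environment, yet py4j still raises
# "TypeError: 'JavaPackage' object is not callable" when the workflow
# tries to use classes from them.
COPY jars/my-custom-lib.jar /databricks/jars/
```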

aman-solanki-kr (Author) commented:

@evanye

evanye (Collaborator) commented May 10, 2023

@aman-solanki-kr Try filing a support ticket with your support rep. Unfortunately I don't know the answer to this.

Nicbyte commented Mar 11, 2024

Just add your jars to /databricks/python3/lib/python3.10/site-packages/pyspark/jars. This is the location for pyspark jars.
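A hedged sketch of how this suggestion could look in a Dockerfile built on databricksruntime/python:10.4-LTS (the jar name and local path are illustrative):

```dockerfile
FROM databricksruntime/python:10.4-LTS

# pyspark loads jars from its own site-packages jars directory, so copy
# the custom jars there in addition to (or instead of) /databricks/jars.
COPY jars/my-custom-lib.jar /databricks/python3/lib/python3.10/site-packages/pyspark/jars/
```

Note the Python minor version in the path (python3.10 here) must match the Databricks Runtime of the base image.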
