
Smoke test with '--package' to fetch the plugin jar #10238

Merged

merged 3 commits into NVIDIA:branch-24.02 on Jan 23, 2024

Conversation

NvTimLiu
Collaborator

To fix #10160

Run SPARK_SHELL_SMOKE_TEST with '--packages' to fetch the plugin jar.

The plugin jar can be downloaded from the artifact repository specified via '--repositories'.

This test can fetch the plugin jar from an internal Maven repo, Maven Central, or the Sonatype staging repo.

Signed-off-by: Tim Liu <timl@nvidia.com>
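For context, a minimal sketch of the kind of spark-shell invocation this mode exercises (the version, repository URL, and master setting below are illustrative assumptions, not the exact values used by the test script):

```bash
# Sketch only: resolve the plugin via --packages instead of a local --jars path.
# The plugin version and staging repository URL are assumptions for illustration.
PLUGIN_VER=24.02.0
STAGING_REPO=https://oss.sonatype.org/content/repositories/staging

$SPARK_HOME/bin/spark-shell \
  --master local[2] \
  --packages com.nvidia:rapids-4-spark_2.12:${PLUGIN_VER} \
  --repositories "${STAGING_REPO}" \
  --conf spark.plugins=com.nvidia.spark.SQLPlugin \
  <<< 'println(spark.range(100).count())'
```

With '--packages', Spark resolves and downloads the jar (and its transitive dependencies) from the listed repositories before starting the shell, which is what lets the smoke test run without a locally built jar.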

@NvTimLiu added the bug (Something isn't working) and build (Related to CI / CD or cleanly building) labels on Jan 22, 2024
@NvTimLiu self-assigned this on Jan 22, 2024
NvTimLiu and others added 2 commits January 23, 2024 10:02
This suggestion is reasonable

Co-authored-by: Jason Lowe <jlowe@nvidia.com>
Signed-off-by: Tim Liu <timl@nvidia.com>
@NvTimLiu
Collaborator Author

build

@NvTimLiu
Collaborator Author

NvTimLiu commented Jan 23, 2024

CI failed with java.net.BindException: Failed to bind to /0.0.0.0:4040: Service 'SparkUI' for an unknown reason; retriggering the build.

@NvTimLiu
Collaborator Author

build

@NvTimLiu merged commit 5b4c575 into NVIDIA:branch-24.02 on Jan 23, 2024
40 checks passed
@@ -304,6 +304,11 @@ if [[ $TEST_MODE == "DEFAULT" ]]; then
PYSP_TEST_spark_shuffle_manager=com.nvidia.spark.rapids.${SHUFFLE_SPARK_SHIM}.RapidsShuffleManager \
./run_pyspark_from_build.sh

SPARK_SHELL_SMOKE_TEST=1 \
Collaborator Author

@NvTimLiu NvTimLiu Jan 26, 2024


There is a limitation in Spark: it currently does not support classifiers with --packages (https://issues.apache.org/jira/browse/SPARK-20075).

So this only works with the default cuda11 jar; it cannot work with the cuda12 and arm64 classifiers.

#10160 (comment)
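Put differently (an illustration with an assumed version and repository layout): '--packages' coordinates are just groupId:artifactId:version, so there is no way to request a classified jar that way, and a cuda12 or arm64 build would have to be fetched and passed explicitly.

```bash
# --packages coordinates have no classifier field (SPARK-20075), so this can
# only resolve the default (cuda11) jar:
$SPARK_HOME/bin/spark-shell \
  --packages com.nvidia:rapids-4-spark_2.12:24.02.0

# A classified build (cuda12, arm64) would have to be downloaded directly and
# passed via --jars; the URL below assumes the usual Maven Central layout:
wget https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.12/24.02.0/rapids-4-spark_2.12-24.02.0-cuda12.jar
$SPARK_HOME/bin/spark-shell --jars rapids-4-spark_2.12-24.02.0-cuda12.jar
```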

@gerashegalov
Collaborator

Regarding the "Failed to bind to /0.0.0.0:4040: Service 'SparkUI'" failure: all tests should eventually disable the UI, see #10227.
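For reference, sidestepping the port-4040 flake amounts to running with the UI off; spark.ui.enabled is a standard Spark configuration key (how #10227 actually wires this into the test scripts is not shown here):

```bash
# Run without the Spark UI so nothing tries to bind port 4040.
$SPARK_HOME/bin/spark-shell \
  --conf spark.ui.enabled=false \
  <<< 'println(spark.range(1).count())'
```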

Successfully merging this pull request may close these issues.

Add tests that use Spark's --packages to fetch the plugin