Run '--packages' only with default cuda11 jar (#10279)
* Run '--packages' only with default cuda11 jar

Since '--packages' only works with the default cuda11 jar, it does not support the classifier parameter;
refer to https://issues.apache.org/jira/browse/SPARK-20075

We cannot specify a classifier jar when running plugin tests with '--packages'; see the error log below:
    Exception in thread "main" java.lang.IllegalArgumentException:
        requirement failed: Provided Maven Coordinates must be in the form 'groupId:artifactId:version'.
        The coordinate provided is: com.nvidia:rapids-4-spark_2.12:23.12.0:jar:cuda12
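
For illustration, a minimal sketch of the coordinate forms involved (the '--jars' alternative and the jar file name are assumptions for a classifier build, not part of this change):

    # Works: a plain groupId:artifactId:version coordinate resolves the default (cuda11) artifact.
    spark-shell --packages com.nvidia:rapids-4-spark_2.12:23.12.0

    # Fails (SPARK-20075): '--packages' rejects a classifier-qualified coordinate.
    # spark-shell --packages com.nvidia:rapids-4-spark_2.12:23.12.0:jar:cuda12

    # For a classifier build such as cuda12, pass the jar directly instead (illustrative file name):
    spark-shell --jars rapids-4-spark_2.12-23.12.0-cuda12.jar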

Signed-off-by: Tim Liu <timl@nvidia.com>

* Add a description of why the packages test needs to run for the cuda11 case

---------

Signed-off-by: Tim Liu <timl@nvidia.com>
NvTimLiu authored Jan 30, 2024
1 parent ad6fde9 commit e375368
Showing 1 changed file with 11 additions and 4 deletions.
15 changes: 11 additions & 4 deletions jenkins/spark-tests.sh
@@ -304,10 +304,17 @@ if [[ $TEST_MODE == "DEFAULT" ]]; then
PYSP_TEST_spark_shuffle_manager=com.nvidia.spark.rapids.${SHUFFLE_SPARK_SHIM}.RapidsShuffleManager \
./run_pyspark_from_build.sh

-  SPARK_SHELL_SMOKE_TEST=1 \
-    PYSP_TEST_spark_jars_packages=com.nvidia:rapids-4-spark_${SCALA_BINARY_VER}:${PROJECT_VER} \
-    PYSP_TEST_spark_jars_repositories=${PROJECT_REPO} \
-    ./run_pyspark_from_build.sh
+  # As '--packages' only works with the default cuda11 jar, it does not support classifiers;
+  # refer to https://issues.apache.org/jira/browse/SPARK-20075
+  # "$CLASSIFIER" == "" is usually the case when run by developers,
+  # while "$CLASSIFIER" == "cuda11" is the case when run on CI.
+  # We expect to run the packages test in both cases.
+  if [[ "$CLASSIFIER" == "" || "$CLASSIFIER" == "cuda11" ]]; then
+    SPARK_SHELL_SMOKE_TEST=1 \
+      PYSP_TEST_spark_jars_packages=com.nvidia:rapids-4-spark_${SCALA_BINARY_VER}:${PROJECT_VER} \
+      PYSP_TEST_spark_jars_repositories=${PROJECT_REPO} \
+      ./run_pyspark_from_build.sh
+  fi

# ParquetCachedBatchSerializer cache_test
PYSP_TEST_spark_sql_cache_serializer=com.nvidia.spark.ParquetCachedBatchSerializer \
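
A standalone sketch of the guard added above, for reference (not part of the commit; CLASSIFIER is assumed to come from the environment, as in the script):

    # Run the '--packages' smoke test only when the default cuda11 jar is in use.
    CLASSIFIER="${CLASSIFIER:-}"
    if [[ "$CLASSIFIER" == "" || "$CLASSIFIER" == "cuda11" ]]; then
        echo "running '--packages' smoke test against the default cuda11 jar"
    else
        echo "skipping '--packages' smoke test: classifier '$CLASSIFIER' is not supported by '--packages'"
    fi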
