Update spark322shim dependency to released lib (#6031)
* Update spark322shim dependency to released one

Signed-off-by: Peixin Li <pxli@nyu.edu>

* address comment

* also update minimumFeatureVersionMix doc
pxLi authored Jul 21, 2022
1 parent e006493 commit b422d50
Showing 7 changed files with 9 additions and 6 deletions.
4 changes: 2 additions & 2 deletions CONTRIBUTING.md
@@ -86,8 +86,8 @@ There is a build script `build/buildall` that automates the local build process.
`./build/buildall --help` for up-to-date usage information.

By default, it builds everything that is needed to create a distribution jar for all released (noSnapshots) Spark versions except for Databricks. Other profiles that you can pass using `--profile=<distribution profile>` include
-- `snapshots`
-- `minimumFeatureVersionMix` that currently includes 321cdh, 312, 320 is recommended for catching incompatibilities already in the local development cycle
+- `snapshots` that includes all released (noSnapshots) and snapshots Spark versions except for Databricks
+- `minimumFeatureVersionMix` that currently includes 321cdh, 312, 320, 330 is recommended for catching incompatibilities already in the local development cycle

For initial quick iterations we can use `--profile=<buildver>` to build a single-shim version. e.g., `--profile=311` for Spark 3.1.1.

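To make the profiles documented above concrete, here is a minimal sketch of typical invocations; all flag and profile names come from the text of this diff, while any output details are omitted:

```bash
# Default: all released (noSnapshots) Spark versions except Databricks.
./build/buildall

# Released plus snapshot Spark versions, except Databricks.
./build/buildall --profile=snapshots

# The small version mix recommended for catching shim incompatibilities
# early in the local development cycle.
./build/buildall --profile=minimumFeatureVersionMix

# A single shim for quick iteration, e.g. Spark 3.1.1.
./build/buildall --profile=311
```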
2 changes: 2 additions & 0 deletions build/buildall
@@ -159,6 +159,7 @@ case $DIST_PROFILE in
320
321
+322
330
331
)
;;
@@ -171,6 +172,7 @@
313
320
321
+322
330
)
;;
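The hunks above extend the script's version arrays. A minimal sketch of the dispatch idiom buildall uses, where the profile name selects a bash array of shim build versions; the exact membership and ordering below are approximated from the visible hunks, not copied from the file:

```bash
#!/usr/bin/env bash
# Illustrative reconstruction only; see build/buildall for the real lists.
DIST_PROFILE="${1:-noSnapshots}"

case "$DIST_PROFILE" in
  snapshots)
    SPARK_SHIM_VERSIONS=(311 312 313 314 320 321 321cdh 322 330 331)
    ;;
  noSnapshots)
    SPARK_SHIM_VERSIONS=(311 312 313 320 321 321cdh 322 330)
    ;;
esac

# Each selected version is then built as a separate shim.
for v in "${SPARK_SHIM_VERSIONS[@]}"; do
  echo "would run: mvn install -Dbuildver=$v"
done
```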
2 changes: 1 addition & 1 deletion dist/pom.xml
@@ -47,11 +47,11 @@
320,
321,
321cdh,
+322,
330
</noSnapshot.buildvers>
<snapshot.buildvers>
314,
-322,
331
</snapshot.buildvers>
<databricks.buildvers>
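The `*.buildvers` lists above control which shims are bundled into the dist jar. A hedged sketch of packaging against an explicit shim set, assuming the dist module's `included_buildvers` property (a property of the dist build, not shown in this diff) accepts the same comma-separated values:

```bash
# Build the dist jar with the released shim set, now including 322.
mvn -B package -pl dist -am -DskipTests \
  -Dincluded_buildvers=320,321,321cdh,322,330
```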
1 change: 1 addition & 0 deletions docs/additional-functionality/rapids-shuffle.md
@@ -286,6 +286,7 @@ In this section, we are using a docker container built using the sample dockerfile
| 3.2.0 | com.nvidia.spark.rapids.spark320.RapidsShuffleManager |
| 3.2.1 | com.nvidia.spark.rapids.spark321.RapidsShuffleManager |
| 3.2.1 CDH | com.nvidia.spark.rapids.spark321cdh.RapidsShuffleManager |
+| 3.2.2 | com.nvidia.spark.rapids.spark322.RapidsShuffleManager |
| 3.3.0 | com.nvidia.spark.rapids.spark330.RapidsShuffleManager |
| Databricks 9.1 | com.nvidia.spark.rapids.spark312db.RapidsShuffleManager |
| Databricks 10.4 | com.nvidia.spark.rapids.spark321db.RapidsShuffleManager |
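As a hedged example of using the new 3.2.2 table entry: `spark.shuffle.manager` is the standard Spark setting, `$SPARK_RAPIDS_PLUGIN_JAR` is a placeholder for the plugin jar path, and the other settings this guide requires are omitted for brevity:

```bash
$SPARK_HOME/bin/spark-shell \
  --conf spark.plugins=com.nvidia.spark.SQLPlugin \
  --conf spark.shuffle.manager=com.nvidia.spark.rapids.spark322.RapidsShuffleManager \
  --driver-class-path "$SPARK_RAPIDS_PLUGIN_JAR" \
  --conf spark.executor.extraClassPath="$SPARK_RAPIDS_PLUGIN_JAR"
```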
2 changes: 1 addition & 1 deletion jenkins/spark-premerge-build.sh
@@ -50,7 +50,7 @@ mvn_verify() {
# enable UTF-8 for regular expression tests
env -u SPARK_HOME LC_ALL="en_US.UTF-8" mvn $MVN_URM_MIRROR -Dbuildver=320 test -Drat.skip=true -Dmaven.javadoc.skip=true -Dskip -Dmaven.scalastyle.skip=true -Dcuda.version=$CUDA_CLASSIFIER -Dpytest.TEST_TAGS='' -pl '!tools' -DwildcardSuites=com.nvidia.spark.rapids.ConditionalsSuite,com.nvidia.spark.rapids.RegularExpressionSuite,com.nvidia.spark.rapids.RegularExpressionTranspilerSuite
env -u SPARK_HOME mvn -U -B $MVN_URM_MIRROR -Dbuildver=321 clean install -Drat.skip=true -DskipTests -Dmaven.javadoc.skip=true -Dskip -Dmaven.scalastyle.skip=true -Dcuda.version=$CUDA_CLASSIFIER -pl aggregator -am
-[[ $BUILD_MAINTENANCE_VERSION_SNAPSHOTS == "true" ]] && env -u SPARK_HOME mvn -U -B $MVN_URM_MIRROR -Dbuildver=322 clean install -Drat.skip=true -DskipTests -Dmaven.javadoc.skip=true -Dskip -Dmaven.scalastyle.skip=true -Dcuda.version=$CUDA_CLASSIFIER -pl aggregator -am
+env -u SPARK_HOME mvn -U -B $MVN_URM_MIRROR -Dbuildver=322 clean install -Drat.skip=true -DskipTests -Dmaven.javadoc.skip=true -Dskip -Dmaven.scalastyle.skip=true -Dcuda.version=$CUDA_CLASSIFIER -pl aggregator -am
env -u SPARK_HOME mvn -U -B $MVN_URM_MIRROR -Dbuildver=330 clean install -Drat.skip=true -DskipTests -Dmaven.javadoc.skip=true -Dskip -Dmaven.scalastyle.skip=true -Dcuda.version=$CUDA_CLASSIFIER -pl aggregator -am
[[ $BUILD_MAINTENANCE_VERSION_SNAPSHOTS == "true" ]] && env -u SPARK_HOME mvn -U -B $MVN_URM_MIRROR -Dbuildver=331 clean install -Drat.skip=true -DskipTests -Dmaven.javadoc.skip=true -Dskip -Dmaven.scalastyle.skip=true -Dcuda.version=$CUDA_CLASSIFIER -pl aggregator -am
[[ $BUILD_FEATURE_VERSION_SNAPSHOTS == "true" ]] && env -u SPARK_HOME mvn -U -B $MVN_URM_MIRROR -Dbuildver=340 clean install -Drat.skip=true -DskipTests -Dmaven.javadoc.skip=true -Dskip -Dmaven.scalastyle.skip=true -Dcuda.version=$CUDA_CLASSIFIER -pl aggregator -am
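The change above moves the 322 build out of the snapshot guard: since 3.2.2 is now released, it is always verified, while 331 stays behind the flag. A minimal sketch of the guard idiom, with echo standing in for the real mvn command:

```bash
BUILD_MAINTENANCE_VERSION_SNAPSHOTS=false

# Runs unconditionally now that 3.2.2 is a released version.
echo "build -Dbuildver=322"

# Runs only when maintenance-version snapshots are explicitly enabled.
[[ $BUILD_MAINTENANCE_VERSION_SNAPSHOTS == "true" ]] \
  && echo "build -Dbuildver=331" \
  || echo "skip 331 (snapshots disabled)"
```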
2 changes: 1 addition & 1 deletion pom.xml
@@ -1008,7 +1008,7 @@
<spark321.version>3.2.1</spark321.version>
<spark321cdh.version>3.2.1.3.2.7171000.0-3</spark321cdh.version>
<spark321db.version>3.2.1-databricks</spark321db.version>
-<spark322.version>3.2.2-SNAPSHOT</spark322.version>
+<spark322.version>3.2.2</spark322.version>
<spark330.version>3.3.0</spark330.version>
<spark331.version>3.3.1-SNAPSHOT</spark331.version>
<spark340.version>3.4.0-SNAPSHOT</spark340.version>
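With the SNAPSHOT suffix dropped, the 3.2.2 dependency should now resolve from a release repository rather than a snapshot one. A quick hedged check using the standard maven-dependency-plugin goal:

```bash
# Resolves the released Spark SQL artifact the spark322 shim now builds against.
mvn -B dependency:get -Dartifact=org.apache.spark:spark-sql_2.12:3.2.2
```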
2 changes: 1 addition & 1 deletion .../shims/spark322/SparkShimServiceProvider.scala
@@ -20,7 +20,7 @@ import com.nvidia.spark.rapids.SparkShimVersion

object SparkShimServiceProvider {
val VERSION = SparkShimVersion(3, 2, 2)
val VERSIONNAMES = Seq(s"$VERSION", s"$VERSION-SNAPSHOT")
val VERSIONNAMES = Seq(s"$VERSION")
}

class SparkShimServiceProvider extends com.nvidia.spark.rapids.SparkShimServiceProvider {
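The effect of the `VERSIONNAMES` change is that the spark322 shim now claims only the exact released version string. A minimal bash sketch of that matching rule, for illustration only (not project code):

```bash
# Returns success only for the version strings the shim now accepts.
matches_spark322() {
  case "$1" in
    3.2.2) return 0 ;;  # released 3.2.2 matches
    *)     return 1 ;;  # 3.2.2-SNAPSHOT and everything else is rejected
  esac
}

matches_spark322 "3.2.2"          && echo "3.2.2: shim selected"
matches_spark322 "3.2.2-SNAPSHOT" || echo "3.2.2-SNAPSHOT: shim rejected"
```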
