Change shims dependency to spark-3.0.1 #689

Merged 1 commit on Sep 9, 2020
4 changes: 2 additions & 2 deletions api_validation/README.md
@@ -2,7 +2,7 @@

API validation script checks the compatibility of community Spark Execs and GPU Execs in the Rapids Plugin for Spark.
For example: HashAggregateExec with GpuHashAggregateExec.
-Script can be used to audit different versions of Spark(3.0.0, 3.0.1-SNAPSHOT and 3.1.0-SNAPSHOT)
+Script can be used to audit different versions of Spark(3.0.0, 3.0.1 and 3.1.0-SNAPSHOT)
The script prints Execs where validation fails.
Validation fails when:
1) The number of parameters differ between community Spark Execs and Gpu Execs.
@@ -17,7 +17,7 @@ It requires cudf, rapids-4-spark and spark jars.

```
cd api_validation
-// To run validation script on all version of Spark(3.0.0, 3.0.1-SNAPSHOT and 3.1.0-SNAPSHOT)
+// To run validation script on all version of Spark(3.0.0, 3.0.1 and 3.1.0-SNAPSHOT)
sh auditAllVersions.sh
// To run script on particular version we can use profile(spark300, spark301 and spark310)
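For orientation (not part of the diff): a minimal sketch of how the audit described above would typically be invoked, assuming the single-version runs are driven by the Maven profiles the README names; the `scala:run` goal is an assumption and is not taken from this PR.

```
cd api_validation
# Audit all supported Spark versions (3.0.0, 3.0.1 and 3.1.0-SNAPSHOT) in one pass
sh auditAllVersions.sh
# Hypothetical single-version run: activate one of the profiles the README lists
# (spark300, spark301, spark310); the Maven goal here is an assumption.
mvn scala:run -P spark301
```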
2 changes: 1 addition & 1 deletion integration_tests/pom.xml
@@ -41,7 +41,7 @@
<profile>
<id>spark301tests</id>
<properties>
-<spark.test.version>3.0.1-SNAPSHOT</spark.test.version>
+<spark.test.version>3.0.1</spark.test.version>
</properties>
</profile>
<profile>
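As a hedged aside, a test profile like `spark301tests` is normally selected from the command line when running this module; the module path and goal below are assumptions, only the profile id comes from the diff.

```
# Run the integration-test module against the released Spark 3.0.1 artifacts by
# activating the profile defined above (module path and goal are assumptions).
mvn -pl integration_tests -P spark301tests test
```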
4 changes: 2 additions & 2 deletions jenkins/Jenkinsfile.301.integration
@@ -51,9 +51,9 @@ pipeline {
}

stages {
-stage('IT on 3.0.1-SNAPSHOT') {
+stage('IT on 3.0.1') {
agent { label 'docker-gpu' }
-environment {SPARK_VER='3.0.1-SNAPSHOT'}
+environment {SPARK_VER='3.0.1'}
steps {
script {
def CUDA_NAME=sh(returnStdout: true,
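For local reproduction (an editorial sketch, not from the PR), the stage above effectively pins the Spark version under test through an environment variable:

```
# Mirror what the 'IT on 3.0.1' stage pins before launching the tests;
# how CI consumes SPARK_VER downstream is not shown in this hunk.
export SPARK_VER='3.0.1'
echo "Running integration tests against Spark ${SPARK_VER}"
```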
2 changes: 1 addition & 1 deletion pom.xml
@@ -171,7 +171,7 @@
<!--
If you update a dependendy version so it is no longer a SNAPSHOT
please update the snapshot-shims profile as well so it is accurate -->
-<spark301.version>3.0.1-SNAPSHOT</spark301.version>
+<spark301.version>3.0.1</spark301.version>
<spark302.version>3.0.2-SNAPSHOT</spark302.version>
<spark310.version>3.1.0-SNAPSHOT</spark310.version>
</properties>

Collaborator review comment on the spark301.version line: You missed this comment here right next to the version change. If we want the shim for 3.0.1 to be released, you need to change the snapshot-shims profile too. @tgravescs do you want to handle this with your docs update, or should I handle it separately?
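One hedged way to sanity-check the change locally is to print the resolved property with the standard maven-help-plugin (shown as a sketch; the output should now be the released version rather than a SNAPSHOT):

```
# Should print 3.0.1 after this change; spark302.version and spark310.version
# remain SNAPSHOT versions in this PR.
mvn help:evaluate -Dexpression=spark301.version -q -DforceStdout
```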
2 changes: 1 addition & 1 deletion tests/pom.xml
@@ -43,7 +43,7 @@
<profile>
<id>spark301tests</id>
<properties>
-<spark.test.version>3.0.1-SNAPSHOT</spark.test.version>
+<spark.test.version>3.0.1</spark.test.version>
</properties>
</profile>
<profile>