Add Spark 4.0.0 Build Profile and Other Supporting Changes [databricks] #10994

Merged on Jul 16, 2024 (54 commits). The diff below shows changes from 6 commits.

Commits:
0c3a1ba  POM changes for Spark 4.0.0 (razajafri, Jun 5, 2024)
1631ab4  validate buildver and scala versions (razajafri, Jun 7, 2024)
3271bfd  more pom changes (razajafri, Jun 7, 2024)
5d2b867  fixed the scala-2.12 comment (razajafri, Jun 7, 2024)
58806bd  more fixes for scala-2.13 pom (razajafri, Jun 7, 2024)
540c732  addressed comments (razajafri, Jun 7, 2024)
5faecf8  add in shim check to account for 400 (razajafri, Jun 14, 2024)
5de0fff  add 400 for premerge tests against jdk 17 (razajafri, Jun 25, 2024)
bb784f0  temporarily remove 400 from snapshotScala213 (razajafri, Jun 25, 2024)
140eb7b  Merge remote-tracking branch 'origin/branch-24.08' into HEAD (razajafri, Jun 25, 2024)
41e1982  fixed 2.13 pom (razajafri, Jun 25, 2024)
85a2f6d  Remove 400 from jdk17 as it will compile with Scala 2.12 (razajafri, Jun 25, 2024)
f7f5a98  github workflow changes (razajafri, Jun 25, 2024)
74dd568  added quotes to pom-directory (razajafri, Jun 25, 2024)
bd1bc70  update version defs to include scala 213 jdk 17 (razajafri, Jun 25, 2024)
46eb751  Merge remote-tracking branch 'origin/branch-24.08' into HEAD (razajafri, Jun 27, 2024)
2b15ab2  Cross-compile all shims from JDK17 to JDK8 (gerashegalov, Jun 27, 2024)
f7f8edf  dummy (gerashegalov, Jun 27, 2024)
ac0ecba  undo api pom change (gerashegalov, Jun 27, 2024)
c644ce8  Add preview1 to the allowed shim versions (gerashegalov, Jun 27, 2024)
9d182b3  Scala 2.13 to require JDK17 (gerashegalov, Jun 28, 2024)
96e0843  Merge pull request #3 from gerashegalov/spark400crosscompile (razajafri, Jun 28, 2024)
b51c08b  Removed unused import left over from https://github.com/razajafri/spa… (razajafri, Jun 28, 2024)
1b9beb5  Setup JAVA_HOME before caching (razajafri, Jun 30, 2024)
a173f35  Only upgrade the Scala plugin for Scala 2.13 (razajafri, Jul 1, 2024)
6138cc8  Regenerate Scala 2.13 poms (razajafri, Jul 1, 2024)
1faabd4  Remove 330 from JDK17 builds for Scala 2.12 (razajafri, Jul 1, 2024)
0cf0036  Revert "Remove 330 from JDK17 builds for Scala 2.12" (razajafri, Jul 1, 2024)
a7b42c6  Downgrade scala.plugin.version for cloudera (razajafri, Jul 1, 2024)
8d0f8ca  Updated comment to include the issue (razajafri, Jul 1, 2024)
45b0d57  Upgrading the scala.maven.plugin version to 4.9.1 which is the same a… (razajafri, Jul 3, 2024)
eb09f98  Downgrade scala-maven-plugin for Cloudera (razajafri, Jul 3, 2024)
4407dbf  revert mvn verify changes (razajafri, Jul 3, 2024)
0e4e45a  Avoid cache for JDK 17 (razajafri, Jul 3, 2024)
bd10267  Handle the change for UnaryPositive now extending RuntimeReplaceable (razajafri, Jul 3, 2024)
2982a59  Removing 330 from jdk17.buildvers as we only support Scala2.13 and fi… (razajafri, Jul 3, 2024)
319aefc  Update Scala 2.13 poms (razajafri, Jul 3, 2024)
835fbb4  Merge remote-tracking branch 'origin/branch-24.08' into HEAD (razajafri, Jul 5, 2024)
dfbb149  fixed scala2.13 verify to actually use the scala2.13/pom.xml (razajafri, Jul 8, 2024)
8ffc4f1  Added missing csv files (razajafri, Jul 8, 2024)
f599413  Merge remote-tracking branch 'origin/branch-24.08' into HEAD (razajafri, Jul 9, 2024)
0544c6f  Skip Opcode tests (razajafri, Jul 12, 2024)
a43a68d  Merge remote-tracking branch 'origin/branch-24.08' into HEAD (razajafri, Jul 12, 2024)
2cf7351  upmerged and fixed the new compile error introduced (razajafri, Jul 12, 2024)
cabcda0  addressed review comments (razajafri, Jul 12, 2024)
c19fccf  Merge remote-tracking branch 'origin/branch-24.08' into HEAD (razajafri, Jul 12, 2024)
feeabfc  Removed jdk17 cloudera check and moved it inside the 321,330 and 332 … (razajafri, Jul 12, 2024)
54a3ee4  fixed upmerge conflicts (razajafri, Jul 12, 2024)
48fe626  reverted renaming of id (razajafri, Jul 12, 2024)
166e4c6  Merge remote-tracking branch 'origin/branch-24.08' into SP-9259-POM-c… (razajafri, Jul 15, 2024)
6cbce78  Fixed HiveGenericUDFShim (razajafri, Jul 15, 2024)
8d0351b  addressed review comments (razajafri, Jul 15, 2024)
5b0e36d  reverted the debugging code (razajafri, Jul 15, 2024)
66419b8  generated Scala 2.13 poms (razajafri, Jul 15, 2024)
19 changes: 19 additions & 0 deletions aggregator/pom.xml
@@ -779,5 +779,24 @@
</dependency>
</dependencies>
</profile>
<!-- #if scala-2.13 --><!--
<profile>
<id>release400</id>
<activation>
<property>
<name>buildver</name>
<value>400</value>
</property>
</activation>
<dependencies>
<dependency>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-delta-stub_${scala.binary.version}</artifactId>
<version>${project.version}</version>
<classifier>${spark.version.classifier}</classifier>
</dependency>
</dependencies>
</profile>
--><!-- #endif scala-2.13 -->
</profiles>
</project>
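Note: the <!-- #if scala-2.13 --><!-- ... --><!-- #endif scala-2.13 --> markers above keep the Spark 4.0.0 block commented out in the Scala 2.12 pom, while the generated scala2.13 poms later in this diff carry the same block uncommented. A rough illustration of the idea only (this is not the project's actual pom generator): a marker-flipping substitution along these lines produces the Scala 2.13 copy.

#!/usr/bin/env bash
# Illustration only: activate "#if scala-2.13" blocks and comment out "#if scala-2.12"
# blocks using the marker comments seen throughout this PR.
sed -e 's|<!-- #if scala-2.13 --><!--|<!-- #if scala-2.13 -->|' \
    -e 's|--><!-- #endif scala-2.13 -->|<!-- #endif scala-2.13 -->|' \
    -e 's|<!-- #if scala-2.12 -->|<!-- #if scala-2.12 --><!--|' \
    -e 's|<!-- #endif scala-2.12 -->|--><!-- #endif scala-2.12 -->|' \
    pom.xml > scala2.13/pom.xml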
3 changes: 1 addition & 2 deletions build/buildall
@@ -161,7 +161,6 @@ if [[ "$DIST_PROFILE" == *Scala213 ]]; then
SCALA213=1
fi


# include options to mvn command
export MVN="mvn -Dmaven.wagon.http.retryHandler.count=3 ${MVN_OPT}"

@@ -196,7 +195,7 @@ case $DIST_PROFILE in
SPARK_SHIM_VERSIONS=($(versionsFromDistProfile "minimumFeatureVersionMix"))
;;

3*)
[34]*)
<<< $DIST_PROFILE IFS="," read -ra SPARK_SHIM_VERSIONS
INCLUDED_BUILDVERS_OPT="-Dincluded_buildvers=$DIST_PROFILE"
unset DIST_PROFILE
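The buildall change above widens the case pattern from 3*) to [34]*) so that an explicit comma-separated profile containing a 4xx buildver (such as 400) still takes the shim-list branch. A minimal sketch of what that branch does, using hypothetical values rather than the full script's argument handling:

#!/usr/bin/env bash
# Sketch only: a comma-separated profile starting with a 3xx or 4xx buildver is split
# into individual shim versions and forwarded to Maven via -Dincluded_buildvers.
DIST_PROFILE="330,351,400"   # hypothetical value passed to buildall

case "$DIST_PROFILE" in
  [34]*)
    IFS="," read -ra SPARK_SHIM_VERSIONS <<< "$DIST_PROFILE"
    INCLUDED_BUILDVERS_OPT="-Dincluded_buildvers=$DIST_PROFILE"
    ;;
  *)
    echo "unsupported DIST_PROFILE: $DIST_PROFILE" >&2
    exit 1
    ;;
esac

echo "shims:   ${SPARK_SHIM_VERSIONS[*]}"    # shims:   330 351 400
echo "mvn opt: $INCLUDED_BUILDVERS_OPT"      # mvn opt: -Dincluded_buildvers=330,351,400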
70 changes: 66 additions & 4 deletions pom.xml
@@ -611,6 +611,50 @@
<module>delta-lake/delta-stub</module>
</modules>
</profile>
<!-- #if scala-2.13 --><!--
<profile>
<id>release400</id>
<activation>
<property>
<name>buildver</name>
<value>400</value>
</property>
</activation>
<properties>
<buildver>400</buildver>
<spark.version>${spark400.version}</spark.version>
<spark.test.version>${spark400.version}</spark.test.version>
<parquet.hadoop.version>1.13.1</parquet.hadoop.version>
<iceberg.version>${spark330.iceberg.version}</iceberg.version>
<slf4j.version>2.0.7</slf4j.version>
</properties>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-enforcer-plugin</artifactId>
<executions>
<execution>
<id>enforce-java-version</id>
<goals><goal>enforce</goal></goals>
<configuration>
<rules>
<requireJavaVersion>
<message>Support for Spark 4.0.0 is only available with Java 17+</message>
<version>[17,)</version>
</requireJavaVersion>
</rules>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
<modules>
<module>delta-lake/delta-stub</module>
</modules>
</profile>
--><!-- #endif scala-2.13 -->
<profile>
<id>source-javadoc</id>
<build>
@@ -642,7 +686,7 @@
<id>scala-2.13</id>
<properties>
<scala.binary.version>2.13</scala.binary.version>
<scala.version>2.13.8</scala.version>
<scala.version>2.13.13</scala.version>
</properties>
</profile>
<profile>
@@ -782,6 +826,7 @@
<spark341db.version>3.4.1-databricks</spark341db.version>
<spark350.version>3.5.0</spark350.version>
<spark351.version>3.5.1</spark351.version>
<spark400.version>4.0.0-SNAPSHOT</spark400.version>
<mockito.version>3.12.4</mockito.version>
<scala.plugin.version>4.3.0</scala.plugin.version>
<maven.install.plugin.version>3.1.1</maven.install.plugin.version>
@@ -835,6 +880,9 @@
351
</noSnapshot.buildvers>
<snapshot.buildvers>
<!-- #if scala-2.13 --><!--
400
--><!-- #endif scala-2.13 -->
</snapshot.buildvers>
<databricks.buildvers>
330db,
@@ -890,6 +938,7 @@
351
</noSnapshotScala213.buildvers>
<snapshotScala213.buildvers>
400
</snapshotScala213.buildvers>
<allScala213.buildvers>
${noSnapshotScala213.buildvers}
@@ -1228,13 +1277,19 @@ This will force full Scala code rebuild in downstream modules.
<groupId>net.sourceforge.pmd</groupId>
<artifactId>pmd-dist</artifactId>
<version>6.55.0</version>
<exclusions>
<exclusion>
<groupId>org.apache.commons</groupId>
<artifactId>commons-lang3</artifactId>
</exclusion>
</exclusions>
</dependency>
</dependencies>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>3.3.0</version>
<version>3.6.0</version>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
@@ -1481,7 +1536,7 @@
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-enforcer-plugin</artifactId>
<version>3.3.0</version>
<version>3.5.0</version>
<executions>
<execution>
<id>enforce-maven</id>
@@ -1498,11 +1553,18 @@ This will force full Scala code rebuild in downstream modules.
<message>Only Java 8, 11, and 17 are supported!</message>
<version>[1.8,1.9),[11,12),[17,18)</version>
</requireJavaVersion>
<!-- #if scala-2.12 -->
<requireProperty>
<property>buildver</property>
<regex>^(?!400).*$</regex>
<regexMessage>Spark 4.0.0 is only supported for Scala 2.13</regexMessage>
</requireProperty>
<!-- #endif scala-2.12 -->
<!-- #if scala-2.13 --><!--
<requireProperty>
<regexMessage>Unexpected buildver value ${buildver} for a Scala 2.13 build, only Apache Spark versions 3.3.0 (330) and higher are supported, no vendor builds such as 330db</regexMessage>
<property>buildver</property>
<regex>[3-9][3-9][0-9]</regex>
<regex>(?:[3-9][3-9]|[4-9][0-9])[0-9]</regex>
</requireProperty>
--><!-- #endif scala-2.13 -->
</rules>
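The two requireProperty rules added to the enforcer configuration gate buildver=400 by Scala version: for Scala 2.12 the negative lookahead ^(?!400).*$ rejects 400 outright, while the Scala 2.13 regex is widened from [3-9][3-9][0-9] to also accept 4xx versions. A rough spot-check of both patterns (assumes GNU grep with PCRE support; Maven's rule matches the whole property value, which the explicit anchors mimic here):

#!/usr/bin/env bash
# Spot-check of the two enforcer regexes added in this PR (approximation of Java regex
# full-string matching via anchored PCRE patterns).
scala212_regex='^(?!400).*$'                      # reject buildver 400 on Scala 2.12
scala213_regex='^(?:[3-9][3-9]|[4-9][0-9])[0-9]$' # allow 330+ and 4xx on Scala 2.13

for buildver in 320 330 351 400 330db; do
  s212=rejected; grep -qP "$scala212_regex" <<< "$buildver" && s212=allowed
  s213=rejected; grep -qP "$scala213_regex" <<< "$buildver" && s213=allowed
  printf '%-6s scala-2.12: %-9s scala-2.13: %s\n' "$buildver" "$s212" "$s213"
done
# Expected: 400 is rejected for Scala 2.12 but allowed for Scala 2.13;
# 320 and vendor builds such as 330db are rejected for Scala 2.13.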
19 changes: 19 additions & 0 deletions scala2.13/aggregator/pom.xml
@@ -779,5 +779,24 @@
</dependency>
</dependencies>
</profile>
<!-- #if scala-2.13 -->
<profile>
<id>release400</id>
<activation>
<property>
<name>buildver</name>
<value>400</value>
</property>
</activation>
<dependencies>
<dependency>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-delta-stub_${scala.binary.version}</artifactId>
<version>${project.version}</version>
<classifier>${spark.version.classifier}</classifier>
</dependency>
</dependencies>
</profile>
<!-- #endif scala-2.13 -->
</profiles>
</project>
72 changes: 67 additions & 5 deletions scala2.13/pom.xml
@@ -611,6 +611,50 @@
<module>delta-lake/delta-stub</module>
</modules>
</profile>
<!-- #if scala-2.13 -->
<profile>
<id>release400</id>
<activation>
<property>
<name>buildver</name>
<value>400</value>
</property>
</activation>
<properties>
<buildver>400</buildver>
<spark.version>${spark400.version}</spark.version>
<spark.test.version>${spark400.version}</spark.test.version>
<parquet.hadoop.version>1.13.1</parquet.hadoop.version>
<iceberg.version>${spark330.iceberg.version}</iceberg.version>
<slf4j.version>2.0.7</slf4j.version>
</properties>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-enforcer-plugin</artifactId>
<executions>
<execution>
<id>enforce-java-version</id>
<goals><goal>enforce</goal></goals>
<configuration>
<rules>
<requireJavaVersion>
<message>Support for Spark 4.0.0 is only available with Java 17+</message>
<version>[17,)</version>
</requireJavaVersion>
</rules>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
<modules>
<module>delta-lake/delta-stub</module>
</modules>
</profile>
<!-- #endif scala-2.13 -->
<profile>
<id>source-javadoc</id>
<build>
@@ -642,7 +686,7 @@
<id>scala-2.13</id>
<properties>
<scala.binary.version>2.13</scala.binary.version>
<scala.version>2.13.8</scala.version>
<scala.version>2.13.13</scala.version>
</properties>
</profile>
<profile>
@@ -724,7 +768,7 @@
<scala.binary.version>2.13</scala.binary.version>
<alluxio.client.version>2.8.0</alluxio.client.version>
<scala.recompileMode>incremental</scala.recompileMode>
<scala.version>2.13.8</scala.version>
<scala.version>2.13.13</scala.version>
<!--
-processing
to suppress unactionable "No processor claimed any of these annotations"
@@ -782,6 +826,7 @@
<spark341db.version>3.4.1-databricks</spark341db.version>
<spark350.version>3.5.0</spark350.version>
<spark351.version>3.5.1</spark351.version>
<spark400.version>4.0.0-SNAPSHOT</spark400.version>
<mockito.version>3.12.4</mockito.version>
<scala.plugin.version>4.3.0</scala.plugin.version>
<maven.install.plugin.version>3.1.1</maven.install.plugin.version>
@@ -835,6 +880,9 @@
351
</noSnapshot.buildvers>
<snapshot.buildvers>
<!-- #if scala-2.13 -->
400
<!-- #endif scala-2.13 -->
</snapshot.buildvers>
<databricks.buildvers>
330db,
@@ -890,6 +938,7 @@
351
</noSnapshotScala213.buildvers>
<snapshotScala213.buildvers>
400
</snapshotScala213.buildvers>
<allScala213.buildvers>
${noSnapshotScala213.buildvers}
@@ -1228,13 +1277,19 @@ This will force full Scala code rebuild in downstream modules.
<groupId>net.sourceforge.pmd</groupId>
<artifactId>pmd-dist</artifactId>
<version>6.55.0</version>
<exclusions>
<exclusion>
<groupId>org.apache.commons</groupId>
<artifactId>commons-lang3</artifactId>
</exclusion>
</exclusions>
</dependency>
</dependencies>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>3.3.0</version>
<version>3.6.0</version>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
@@ -1481,7 +1536,7 @@
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-enforcer-plugin</artifactId>
<version>3.3.0</version>
<version>3.5.0</version>
<executions>
<execution>
<id>enforce-maven</id>
@@ -1498,11 +1553,18 @@ This will force full Scala code rebuild in downstream modules.
<message>Only Java 8, 11, and 17 are supported!</message>
<version>[1.8,1.9),[11,12),[17,18)</version>
</requireJavaVersion>
<!-- #if scala-2.12 --><!--
<requireProperty>
<property>buildver</property>
<regex>^(?!400).*$</regex>
<regexMessage>Spark 4.0.0 is only supported for Scala 2.13</regexMessage>
</requireProperty>
--><!-- #endif scala-2.12 -->
<!-- #if scala-2.13 -->
<requireProperty>
<regexMessage>Unexpected buildver value ${buildver} for a Scala 2.13 build, only Apache Spark versions 3.3.0 (330) and higher are supported, no vendor builds such as 330db</regexMessage>
<property>buildver</property>
<regex>[3-9][3-9][0-9]</regex>
<regex>(?:[3-9][3-9]|[4-9][0-9])[0-9]</regex>
</requireProperty>
<!-- #endif scala-2.13 -->
</rules>
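With the release400 profile and the spark400.version property in place, the Spark 4.0.0 shim is selected like any other buildver from the generated Scala 2.13 poms. A hypothetical invocation, not taken from this PR (flags and repository setup are assumptions):

# Hypothetical build against the Spark 4.0.0 snapshot using the generated Scala 2.13
# poms. Requires JDK 17+ (enforced by the release400 profile) and a resolvable
# 4.0.0-SNAPSHOT Spark, e.g. from the Apache snapshots repository or a local
# Spark build installed into the local Maven repository.
cd scala2.13
mvn clean install -Dbuildver=400 -DskipTests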
13 changes: 13 additions & 0 deletions scala2.13/sql-plugin/pom.xml
@@ -103,6 +103,19 @@
<scope>test</scope>
</dependency>
</dependencies>
<!-- #if scala-2.13 -->
<profiles>
<profile>
<id>release400</id>
<activation>
<property>
<name>buildver</name>
<value>400</value>
</property>
</activation>
</profile>
</profiles>
<!-- #endif scala-2.13 -->
<build>
<resources>
<resource>
13 changes: 13 additions & 0 deletions sql-plugin/pom.xml
@@ -103,6 +103,19 @@
<scope>test</scope>
</dependency>
</dependencies>
<!-- #if scala-2.13 --><!--
<profiles>
<profile>
<id>release400</id>
<activation>
<property>
<name>buildver</name>
<value>400</value>
</property>
</activation>
</profile>
</profiles>
--><!-- #endif scala-2.13 -->
<build>
<resources>
<resource>