Update a few docs for removal of 311cdh (#5410)
Signed-off-by: Thomas Graves <tgraves@nvidia.com>
tgravescs authored May 3, 2022
1 parent e27f681 commit 7330393
Showing 3 changed files with 5 additions and 5 deletions.
CONTRIBUTING.md (3 additions & 3 deletions)
@@ -77,7 +77,7 @@
mvn -Dbuildver=312 install -Drat.skip=true -DskipTests
mvn -Dbuildver=313 install -Drat.skip=true -DskipTests
mvn -Dbuildver=320 install -Drat.skip=true -DskipTests
mvn -Dbuildver=321 install -Drat.skip=true -DskipTests
-mvn -Dbuildver=311cdh install -Drat.skip=true -DskipTests
+mvn -Dbuildver=321cdh install -Drat.skip=true -DskipTests
mvn -pl dist -PnoSnapshots package -DskipTests
```
#### Building with buildall script
@@ -87,7 +87,7 @@
There is a build script `build/buildall` that automates the local build process.

By default, it builds everything that is needed to create a distribution jar for all released (noSnapshots) Spark versions except for Databricks. Other profiles that you can pass using `--profile=<distribution profile>` include
- `snapshots`
-- `minimumFeatureVersionMix` that currently includes 311cdh, 312, 320 is recommended for catching incompatibilities already in the local development cycle
+- `minimumFeatureVersionMix` that currently includes 321cdh, 312, 320 is recommended for catching incompatibilities already in the local development cycle

For initial quick iterations we can use `--profile=<buildver>` to build a single-shim version. e.g., `--profile=311` for Spark 3.1.1.
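
As a quick illustration, here is a hypothetical shell session exercising the profiles described above (only the script path and the `--profile` values this section mentions are assumed):

```bash
# Default: build the distribution jar for all released (noSnapshots)
# Spark versions except Databricks
./build/buildall

# Fast single-shim iteration, e.g. against Spark 3.1.1
./build/buildall --profile=311

# Minimum feature version mix (321cdh, 312, 320) to catch
# cross-shim incompatibilities early in the local development cycle
./build/buildall --profile=minimumFeatureVersionMix
```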

@@ -118,8 +118,8 @@
The following acronyms may appear in directory names:

|Acronym|Definition  |Example|Example Explanation                           |
|-------|------------|-------|----------------------------------------------|
-|cdh    |Cloudera CDH|311cdh |Cloudera CDH Spark based on Apache Spark 3.1.1|
+|cdh    |Cloudera CDH|321cdh |Cloudera CDH Spark based on Apache Spark 3.2.1|
|db     |Databricks  |312db  |Databricks Spark based on Spark 3.1.2         |

The version-specific directory names have one of the following forms / use cases:
- `src/main/312/scala` contains Scala source code for a single Spark version, 3.1.2 in this case
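
Extrapolating from the same naming scheme, a CDH-specific source directory would presumably take a form like the following (hypothetical path, built from the `321cdh` buildver in the acronym table above):

```bash
src/main/321cdh/scala   # Scala sources that apply only to the Cloudera CDH 3.2.1 shim
```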
docs/additional-functionality/rapids-shuffle.md (1 addition & 1 deletion)
@@ -281,11 +281,11 @@
In this section, we are using a docker container built using the sample dockerfile.
| Spark Shim      | spark.shuffle.manager value                              |
| --------------- | -------------------------------------------------------- |
| 3.1.1           | com.nvidia.spark.rapids.spark311.RapidsShuffleManager    |
-| 3.1.1 CDH       | com.nvidia.spark.rapids.spark311cdh.RapidsShuffleManager |
| 3.1.2           | com.nvidia.spark.rapids.spark312.RapidsShuffleManager    |
| 3.1.3           | com.nvidia.spark.rapids.spark313.RapidsShuffleManager    |
| 3.2.0           | com.nvidia.spark.rapids.spark320.RapidsShuffleManager    |
| 3.2.1           | com.nvidia.spark.rapids.spark321.RapidsShuffleManager    |
+| 3.2.1 CDH       | com.nvidia.spark.rapids.spark321cdh.RapidsShuffleManager |
| Databricks 9.1  | com.nvidia.spark.rapids.spark312db.RapidsShuffleManager  |
| Databricks 10.4 | com.nvidia.spark.rapids.spark321db.RapidsShuffleManager  |
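
For context, here is a minimal sketch of how one of these values is applied, assuming the 3.2.1 CDH shim from the table above (`spark.shuffle.manager` is the standard Spark configuration key; the class name comes straight from the table):

```
# spark-defaults.conf (sketch)
spark.shuffle.manager com.nvidia.spark.rapids.spark321cdh.RapidsShuffleManager
```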

(third changed file: 1 addition & 1 deletion)
@@ -776,7 +776,7 @@
private object OrcTools extends Arm {
.withZeroCopy(zeroCopy)
.withMaxDiskRangeChunkLimit(maxDiskRangeChunkLimit)
.build())
-reader.open() // 311cdh needs to initialize the internal FSDataInputStream file variable.
+reader.open()
reader
}
}
