Branch 0.5 doc remove numpartitions (#1951)
* Remove --conf spark.sql.shuffle.partitions=10 from docs

Signed-off-by: Sameer Raheja <sraheja@nvidia.com>

* Remove spark.sql.shuffle.partitions=40 from get-started/yarn-gpu.md

Signed-off-by: Sameer Raheja <sraheja@nvidia.com>

* Remove spark.sql.shuffle.partitions from AWS EMR getting started guide

Signed-off-by: Sameer Raheja <sraheja@nvidia.com>
sameerz authored Mar 19, 2021
1 parent 706c4ef commit a50ddfb
Showing 3 changed files with 0 additions and 8 deletions.
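
For context, spark.sql.shuffle.partitions is optional: when it is left unset, Spark uses its built-in default of 200 shuffle partitions (the same value the EMR example had hard-coded). The sketch below is illustrative only and not part of this commit; it shows how a user who still wants a fixed shuffle partition count can pass the setting explicitly when launching spark-shell, reusing the jar variables defined earlier in the getting-started guide. The value 10 simply mirrors what the docs previously showed.

```
# Illustrative only (not part of this commit): spark.sql.shuffle.partitions
# defaults to 200 when unset; pass it explicitly only if you want a fixed count.
$SPARK_HOME/bin/spark-shell \
  --conf spark.plugins=com.nvidia.spark.SQLPlugin \
  --conf spark.sql.shuffle.partitions=10 \
  --jars ${SPARK_CUDF_JAR},${SPARK_RAPIDS_PLUGIN_JAR}
```

Leaving the flag out, as these docs now do, simply defers to Spark's default instead of pinning a value in the examples.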
1 change: 0 additions & 1 deletion docs/get-started/getting-started-aws-emr.md
@@ -136,7 +136,6 @@ default settings:
 "spark.rapids.memory.pinnedPool.size":"2G",
 "spark.executor.memoryOverhead":"2G",
 "spark.locality.wait":"0s",
-"spark.sql.shuffle.partitions":"200",
 "spark.sql.files.maxPartitionBytes":"256m",
 "spark.sql.adaptive.enabled":"false"
 }
6 changes: 0 additions & 6 deletions docs/get-started/getting-started-on-prem.md
@@ -98,7 +98,6 @@ $SPARK_HOME/bin/spark-shell \
 --conf spark.rapids.memory.pinnedPool.size=2G \
 --conf spark.locality.wait=0s \
 --conf spark.sql.files.maxPartitionBytes=512m \
---conf spark.sql.shuffle.partitions=10 \
 --conf spark.plugins=com.nvidia.spark.SQLPlugin \
 --jars ${SPARK_CUDF_JAR},${SPARK_RAPIDS_PLUGIN_JAR}
 ```
@@ -173,7 +172,6 @@ $SPARK_HOME/bin/spark-shell \
 --conf spark.rapids.memory.pinnedPool.size=2G \
 --conf spark.locality.wait=0s \
 --conf spark.sql.files.maxPartitionBytes=512m \
---conf spark.sql.shuffle.partitions=10 \
 --conf spark.plugins=com.nvidia.spark.SQLPlugin
 ```

@@ -226,7 +224,6 @@ $SPARK_HOME/bin/spark-shell \
 --conf spark.rapids.memory.pinnedPool.size=2G \
 --conf spark.locality.wait=0s \
 --conf spark.sql.files.maxPartitionBytes=512m \
---conf spark.sql.shuffle.partitions=10 \
 --conf spark.plugins=com.nvidia.spark.SQLPlugin \
 --conf spark.executor.resource.gpu.discoveryScript=./getGpusResources.sh \
 --files ${SPARK_RAPIDS_DIR}/getGpusResources.sh \
@@ -253,7 +250,6 @@ $SPARK_HOME/bin/spark-shell \
 --conf spark.rapids.memory.pinnedPool.size=2G \
 --conf spark.locality.wait=0s \
 --conf spark.sql.files.maxPartitionBytes=512m \
---conf spark.sql.shuffle.partitions=10 \
 --conf spark.plugins=com.nvidia.spark.SQLPlugin \
 --conf spark.executor.resource.gpu.amount=1 \
 --conf spark.executor.resource.gpu.discoveryScript=./getGpusResources.sh \
@@ -291,7 +287,6 @@ $SPARK_HOME/bin/spark-shell \
 --conf spark.rapids.memory.pinnedPool.size=2G \
 --conf spark.locality.wait=0s \
 --conf spark.sql.files.maxPartitionBytes=512m \
---conf spark.sql.shuffle.partitions=10 \
 --conf spark.plugins=com.nvidia.spark.SQLPlugin \
 --conf spark.resources.discoveryPlugin=com.nvidia.spark.ExclusiveModeGpuDiscoveryPlugin \
 --conf spark.executor.resource.gpu.amount=1 \
@@ -339,7 +334,6 @@ $SPARK_HOME/bin/spark-shell \
 --conf spark.rapids.memory.pinnedPool.size=2G \
 --conf spark.locality.wait=0s \
 --conf spark.sql.files.maxPartitionBytes=512m \
---conf spark.sql.shuffle.partitions=10 \
 --conf spark.plugins=com.nvidia.spark.SQLPlugin \
 --conf spark.executor.resource.gpu.amount=1 \
 --conf spark.executor.resource.gpu.discoveryScript=/opt/sparkRapidsPlugin/getGpusResources.sh \
1 change: 0 additions & 1 deletion docs/get-started/yarn-gpu.md
@@ -29,7 +29,6 @@ spark.plugins=com.nvidia.spark.SQLPlugin
 spark.executor.extraJavaOptions='-Dai.rapids.cudf.prefer-pinned=true'
 spark.locality.wait=0s
 spark.executor.resource.gpu.discoveryScript=/usr/lib/spark/scripts/gpu/getGpusResources.sh # this match the location of discovery script
-spark.sql.shuffle.partitions=40
 spark.sql.files.maxPartitionBytes=512m
 ```

