update profile names in unit tests docs (#6140)
Signed-off-by: Ahmed Hussein (amahussein) <a@ahussein.me>
amahussein authored Jul 29, 2022
1 parent 6d93b02 commit 0fafd69
Showing 3 changed files with 12 additions and 16 deletions.
docs/dev/testing.md: 4 changes (2 additions, 2 deletions)
@@ -5,5 +5,5 @@ nav_order: 2
parent: Developer Overview
---
An overview of testing can be found within the repository at:
-* [Unit tests](https://github.com/NVIDIA/spark-rapids/tree/branch-0.4/tests)
-* [Integration testing](https://github.com/NVIDIA/spark-rapids/tree/branch-0.4/integration_tests)
+* [Unit tests](https://github.com/NVIDIA/spark-rapids/tree/branch-22.08/tests#readme)
+* [Integration testing](https://github.com/NVIDIA/spark-rapids/tree/branch-22.08/integration_tests#readme)
integration_tests/README.md: 2 changes (1 addition, 1 deletion)
@@ -14,7 +14,7 @@ only a small number of Python dependencies that you need to install for the test
dependencies also only need to be on the driver. You can install them on all nodes
in the cluster but it is not required.
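
For context, a driver-side install might look like the following sketch (the package names `pytest` and `sre_yield` are illustrative assumptions; the authoritative dependency list is in the integration tests README itself):

```bash
# Illustrative only: install Python test dependencies on the driver node.
# The exact package set is defined by the integration tests project,
# not by this sketch.
python -m pip install pytest sre_yield
```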

-### Prerequisities
+### Prerequisites

The build requires `OpenJDK 8`, `maven`, and `python`.
Skip to the next section if you have already installed them.
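
A quick sanity check that these prerequisites are already installed might look like this minimal sketch (any recent Maven and Python should print a version):

```bash
# Confirm the build prerequisites are available on the PATH.
java -version     # expect an OpenJDK 8 (1.8.x) runtime
mvn --version
python --version
```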
tests/README.md: 22 changes (9 additions, 13 deletions)
@@ -1,9 +1,3 @@
----
-layout: page
-title: Testing
-nav_order: 1
-parent: Developer Overview
----
# RAPIDS Accelerator for Apache Spark Testing

We have a stand-alone example that you can run in the [integration tests](../integration_tests).
@@ -13,11 +7,11 @@ and the code is in the `com.nvidia.spark.rapids.tests.mortgage` package.

## Unit Tests

-Unit tests exist in the [tests]() directory. This is unconventional and is done so we can run the
+Unit tests exist in the [tests]() directory. This is unconventional and is done, so we can run the
tests on the final shaded version of the plugin. It also helps with how we collect code coverage.

The `tests` module depends on the `aggregator` module which shades dependencies. When running the
-tests via `mvn test`, make sure to do an install via `mvn install` for the aggregator jar to the
+tests via `mvn test`, make sure to run install command via `mvn install` for the aggregator jar to the
local maven repository.
The steps to run the unit tests:
```bash
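# (The remaining lines of this block are collapsed in the diff view.
# A hypothetical sketch of the steps, assuming commands are run from
# the repository root; see tests/README.md for the authoritative steps.)
mvn install -DskipTests   # install the aggregator jar into the local Maven repository
cd tests
mvn test                  # run the unit tests against the shaded plugin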
@@ -37,17 +31,19 @@ For more information about using scalatest with Maven please refer to the
You can run the unit tests against different versions of Spark using the different profiles. The
default version runs against Spark 3.1.1, to run against a specific version use one of the following
profiles:
-- `-Pspark311tests` (Spark 3.1.1)
-- `-Pspark312tests` (Spark 3.1.2)
-- `-Pspark313tests` (Spark 3.1.3)
+- `-Prelease311` (Spark 3.1.1)
+- `-Prelease321` (Spark 3.2.1)
+- `-Prelease322` (Spark 3.2.2)
+- `-Prelease330` (Spark 3.3.0)
+- `-Prelease340` (Spark 3.4.0)

Please refer to the [tests project POM](pom.xml) to see the list of test profiles supported.
Apache Spark specific configurations can be passed in by setting the `SPARK_CONF` environment
variable.

Examples:
-- To run tests against Apache Spark 3.1.1,
-`mvn -P spark311tests test`
+- To run tests against Apache Spark 3.2.1,
+`mvn -Prelease321 test`
- To pass Apache Spark configs `--conf spark.dynamicAllocation.enabled=false --conf spark.task.cpus=1` do something like.
`SPARK_CONF="spark.dynamicAllocation.enabled=false,spark.task.cpus=1" mvn ...`
- To run test ParquetWriterSuite in package com.nvidia.spark.rapids, issue `mvn test -DwildcardSuites="com.nvidia.spark.rapids.ParquetWriterSuite"`
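
Combining the pieces documented above, a full invocation might look like this sketch (the choice of the `-Prelease330` profile and the `ParquetWriterSuite` filter is just an example; the `SPARK_CONF` value is the one shown earlier):

```bash
# Run a single suite against Spark 3.3.0 with extra Spark configs.
SPARK_CONF="spark.dynamicAllocation.enabled=false,spark.task.cpus=1" \
  mvn -Prelease330 test \
  -DwildcardSuites="com.nvidia.spark.rapids.ParquetWriterSuite"
```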
