Fix typo [skip ci] (NVIDIA#4549)
Signed-off-by: Tim Liu <timl@nvidia.com>
NvTimLiu authored Jan 18, 2022
1 parent afdb2a8 · commit 6f1426e
Showing 1 changed file with 2 additions and 2 deletions.
integration_tests/README.md (4 changes: 2 additions & 2 deletions)
@@ -205,7 +205,7 @@ to the `local-path`, e.g. `LOCAL_JAR_PATH=local-path bash [run_pyspark_from_buil
 the shell-script [run_pyspark_from_build.sh](run_pyspark_from_build.sh) can find the test jars and resources in the alternate path.
 
 When running the shell-script [run_pyspark_from_build.sh](run_pyspark_from_build.sh) under YARN or Kubernetes, the `$SCRIPTPATH` in the python options
-`--rootdir $SCRIPTPATH ...` and `--std_input_path $SCRIPTPATH ...` will not work, as the `$SCRIPTPATH` is a local path, you need to overwrite it to the clould paths.
+`--rootdir $SCRIPTPATH ...` and `--std_input_path $SCRIPTPATH ...` will not work: `$SCRIPTPATH` is a local path, so you need to overwrite it with the cloud paths.
 Basically, you need first to upload the test resources onto the cloud path `resource-path`, then transfer the test resources onto the working directory
 `root-dir` of each executor (e.g. via `spark-submit --files root-dir ...`). After that you must set both `LOCAL_ROOTDIR=root-dir` and `INPUT_PATH=resource-path`
 to run the shell-script, e.g. `LOCAL_ROOTDIR=root-dir INPUT_PATH=resource-path bash [run_pyspark_from_build.sh](run_pyspark_from_build.sh)`.
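
For concreteness, here is a minimal sketch of the flow described in the hunk above, assuming an HDFS-backed cluster; the `hdfs:///tmp/rapids-tests/...` paths are placeholders, and exactly how the resources reach each executor's working directory depends on how your cluster submits the job:

```shell
# Sketch only: all paths below are placeholders.

# 1. Upload the test resources to a cloud path (the `resource-path` above).
hadoop fs -mkdir -p hdfs:///tmp/rapids-tests
hadoop fs -put -f integration_tests/src/test/resources hdfs:///tmp/rapids-tests/resources

# 2. Ship the resources into each executor's working directory (the
#    `root-dir` above), e.g. by passing them along with the application:
#      spark-submit --files hdfs:///tmp/rapids-tests/resources ...

# 3. Override the local $SCRIPTPATH defaults when invoking the script.
LOCAL_ROOTDIR=resources \
INPUT_PATH=hdfs:///tmp/rapids-tests/resources \
bash run_pyspark_from_build.sh
```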
@@ -448,7 +448,7 @@ Refer to the "Floating Point" section of [compatibility.md](../docs/compatibilit
 ### 8. Special values in timestamp columns
 Ensure date/timestamp columns include dates before the [epoch](https://en.wikipedia.org/wiki/Epoch_(computing)).
 
-Apache Spark supports dates/timetamps between `0001-01-01 00:00:00.000000` and `9999-12-31 23:59:59.999999`, but at
+Apache Spark supports dates/timestamps between `0001-01-01 00:00:00.000000` and `9999-12-31 23:59:59.999999`, but at
 values close to the minimum value, the format used in Apache Spark causes rounding errors. To avoid such problems,
 it is recommended that the minimum value used in a test not actually equal `0001-01-01`. For instance, `0001-01-03` is
 acceptable.
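
As a quick PySpark illustration of that guidance (this assumes an active `spark` session with a UTC session timezone, and is not part of the test framework itself):

```python
from datetime import datetime

# Back the minimum off from 0001-01-01 so Spark's internal format does not
# introduce rounding errors near the lower bound.
TEST_MIN = datetime(1, 1, 3)                           # not datetime(1, 1, 1)
TEST_MAX = datetime(9999, 12, 31, 23, 59, 59, 999999)  # Spark's documented maximum

# Include a pre-epoch value alongside the bounds, per the note above.
rows = [(TEST_MIN,), (datetime(1969, 12, 31),), (TEST_MAX,)]
df = spark.createDataFrame(rows, "ts timestamp")  # `spark` session is assumed
df.show(truncate=False)
```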