From 6f1426ee3aa4b97538ef4d3a1a4f3531581e6d96 Mon Sep 17 00:00:00 2001
From: Tim Liu
Date: Tue, 18 Jan 2022 22:00:09 +0800
Subject: [PATCH] Fix typo [skip ci] (#4549)

Signed-off-by: Tim Liu
---
 integration_tests/README.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/integration_tests/README.md b/integration_tests/README.md
index 6e3afeb84f6..921501e25b0 100644
--- a/integration_tests/README.md
+++ b/integration_tests/README.md
@@ -205,7 +205,7 @@ to the `local-path`, e.g. `LOCAL_JAR_PATH=local-path bash [run_pyspark_from_buil
 the shell-script [run_pyspark_from_build.sh](run_pyspark_from_build.sh) can find the test jars and resources in the alternate path.
 When running the shell-script [run_pyspark_from_build.sh](run_pyspark_from_build.sh) under YARN or
 Kubernetes, the `$SCRIPTPATH` in the python options
-`--rootdir $SCRIPTPATH ...` and `--std_input_path $SCRIPTPATH ...` will not work, as the `$SCRIPTPATH` is a local path, you need to overwrite it to the clould paths.
+`--rootdir $SCRIPTPATH ...` and `--std_input_path $SCRIPTPATH ...` will not work, as the `$SCRIPTPATH` is a local path, you need to overwrite it to the cloud paths.
 Basically, you need first to upload the test resources onto the cloud path `resource-path`, then transfer the test resources onto the working directory `root-dir` of each executor(e.g. via `spark-submit --files root-dir ...`).
 After that you must set both `LOCAL_ROOTDIR=root-dir` and `INPUT_PATH=resource-path` to run the shell-script, e.g.
 `LOCAL_ROOTDIR=root-dir INPUT_PATH=resource-path bash [run_pyspark_from_build.sh](run_pyspark_from_build.sh)`.
@@ -448,7 +448,7 @@ Refer to the "Floating Point" section of [compatibility.md](../docs/compatibilit
 ### 8. Special values in timestamp columns
 
 Ensure date/timestamp columns include dates before the [epoch](https://en.wikipedia.org/wiki/Epoch_(computing)).
-Apache Spark supports dates/timetamps between `0001-01-01 00:00:00.000000` and `9999-12-31 23:59:59.999999`, but at
+Apache Spark supports dates/timestamps between `0001-01-01 00:00:00.000000` and `9999-12-31 23:59:59.999999`, but at
 values close to the minimum value, the format used in Apache Spark causes rounding errors.
 To avoid such problems, it is recommended that the minimum value used in a test not actually
 equal `0001-01-01`. For instance, `0001-01-03` is acceptable.
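
Reviewer note: the paragraph corrected in the first hunk describes overriding `$SCRIPTPATH` with cloud paths when running under YARN or Kubernetes. A minimal sketch of that workflow is below; the `hdfs://namenode/...` location and the local resources directory are hypothetical placeholders, not values taken from this patch or the README.

```bash
# Upload the test resources to a cloud path the executors can reach.
# (hdfs://namenode/tmp/rapids-test-resources is a hypothetical placeholder.)
hadoop fs -put ./src/test/resources hdfs://namenode/tmp/rapids-test-resources

# Run the tests with the local $SCRIPTPATH defaults overridden by the
# cloud paths, as the corrected paragraph describes: LOCAL_ROOTDIR names
# the executors' working directory, INPUT_PATH the uploaded resources.
LOCAL_ROOTDIR=root-dir \
INPUT_PATH=hdfs://namenode/tmp/rapids-test-resources \
bash run_pyspark_from_build.sh
```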