Describe the bug
Looks like the tests still run the @pandas_udf code even though we say to skip the file.
==================================== ERRORS ====================================
______________ ERROR collecting src/main/python/udf_cudf_test.py _______________
ImportError while importing test module '/home/tgraves/workspace/spark-rapids-another/integration_tests/src/main/python/udf_cudf_test.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
src/main/python/udf_cudf_test.py:76: in <module>
    @pandas_udf('int')
/home/tgraves/runspace/spark310/spark-3.1.0-SNAPSHOT-bin-hadoop3.2/python/lib/pyspark.zip/pyspark/sql/pandas/functions.py:324: in pandas_udf
    require_minimum_pyarrow_version()
/home/tgraves/runspace/spark310/spark-3.1.0-SNAPSHOT-bin-hadoop3.2/python/lib/pyspark.zip/pyspark/sql/pandas/utils.py:57: in require_minimum_pyarrow_version
    "your version was %s." % (minimum_pyarrow_version, pyarrow.__version__))
E   ImportError: PyArrow >= 1.0.0 must be installed; however, your version was 0.15.1.
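This happens because a module-level decorator runs the moment pytest imports the file to collect it, before any skip mark is evaluated. A minimal sketch of the mechanism, with no Spark needed (`recording_decorator` and `my_udf` are illustrative names, not the actual test code):

```python
calls = []

def recording_decorator(func):
    # This body executes at import time -- the same moment @pandas_udf
    # calls require_minimum_pyarrow_version() in the traceback above.
    calls.append(func.__name__)
    return func

@recording_decorator
def my_udf(v):
    return v + 1

# The decorator already ran during import, before any test was selected
# or skipped, so a skip applied after collection comes too late.
assert calls == ["my_udf"]
```

One common guard for this pattern is `pytest.importorskip("pyarrow", minversion="1.0.0")` at the top of the module, which skips the whole file during collection before the decorator line is ever reached.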
Reproduce using Spark 3.1.0 with just:
$SPARK_HOME/bin/spark-submit --master local --jars cudf-0.16-SNAPSHOT-cuda10-1.jar,rapids-4-spark_2.12-0.3.0-SNAPSHOT.jar --conf spark.driver.extraJavaOptions=-Duser.timezone=GMT --conf spark.sql.session.timeZone=UTC --conf spark.executor.extraJavaOptions=-Duser.timezone=GMT runtests.py