Update the error string for test_cast_neg_to_decimal_err on 330[databricks] #5784

Merged (2 commits) on Jun 8, 2022
5 changes: 4 additions & 1 deletion integration_tests/src/main/python/arithmetic_ops_test.py
@@ -15,6 +15,7 @@
 import pytest

 from asserts import assert_gpu_and_cpu_are_equal_collect, assert_gpu_and_cpu_error, assert_gpu_fallback_collect, assert_gpu_and_cpu_are_equal_sql
+from conftest import is_databricks_runtime
 from data_gen import *
 from marks import ignore_order, incompat, approximate_float, allow_non_gpu
 from pyspark.sql.types import *
@@ -301,7 +302,9 @@ def test_mod_pmod_by_zero(data_gen, overflow_exp):
 def test_cast_neg_to_decimal_err():
     # -12 cannot be represented as decimal(7,7)
     data_gen = _decimal_gen_7_7
-    exception_content = "Decimal(compact,-120000000,20,0}) cannot be represented as Decimal(7, 7)"
+    dec_value = "Decimal(compact,-120000000,20,0})" if is_before_spark_330() \
+        or is_databricks_runtime() else "Decimal(compact, -120000000, 20, 0)"
Review comment (Collaborator): Will this check fail again if Databricks has a version of Spark 330?

Reply (Author): I believe yes.
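The reviewer's concern can be made concrete with a tiny sketch (a hypothetical illustration, not code from the PR): because the selection is `is_before_spark_330() or is_databricks_runtime()`, a future Databricks runtime built on Spark 3.3.0+ would still take the old-format branch.

```python
# Hypothetical illustration of the reviewer's concern: the 'or' makes any
# Databricks runtime pick the pre-3.3.0 error format, even if that runtime
# were someday built on Spark 3.3.0+.
def uses_old_format(before_330, on_databricks):
    return before_330 or on_databricks

print(uses_old_format(False, True))   # Databricks on a hypothetical Spark 3.3.0 build
print(uses_old_format(False, False))  # plain Apache Spark 3.3.0
```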

+    exception_content = dec_value + " cannot be represented as Decimal(7, 7)"
     exception_str = "java.lang.ArithmeticException: " + exception_content if is_before_spark_330() \
         and not is_databricks104_or_later() else "org.apache.spark.SparkArithmeticException: " \
         + exception_content
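Taken together, the two conditionals in the diff can be sketched as a pure function (a hedged illustration, not repo code; the three booleans stand in for the plugin helpers `is_before_spark_330()`, `is_databricks_runtime()`, and `is_databricks104_or_later()`):

```python
# Sketch of the expected-error selection this PR introduces.
def expected_cast_error(before_330, on_databricks, databricks_104_plus):
    # Spark < 3.3.0 (and current Databricks runtimes) format the value as
    # "Decimal(compact,-120000000,20,0})"; Spark 3.3.0+ adds spaces and
    # drops the stray brace: "Decimal(compact, -120000000, 20, 0)".
    if before_330 or on_databricks:
        dec_value = "Decimal(compact,-120000000,20,0})"
    else:
        dec_value = "Decimal(compact, -120000000, 20, 0)"
    content = dec_value + " cannot be represented as Decimal(7, 7)"
    # Spark 3.3.0+ (and Databricks 10.4+) raise SparkArithmeticException
    # instead of a plain java.lang.ArithmeticException.
    if before_330 and not databricks_104_plus:
        return "java.lang.ArithmeticException: " + content
    return "org.apache.spark.SparkArithmeticException: " + content

print(expected_cast_error(True, False, False))   # Apache Spark 3.2.x
print(expected_cast_error(False, False, False))  # Apache Spark 3.3.0
```

This mirrors how the test builds the string it passes to `assert_gpu_and_cpu_error`: pick the version-specific Decimal rendering first, then pick the version-specific exception class that wraps it.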