[BUG] limit function is producing inconsistent result when type is Byte, Long, Boolean and Timestamp #1008
Comments
The ones I see failing are: Both are INT96 types, so is this a dup of #1007, or were you seeing more fail?
@tgravescs this was run against Spark 3.1.0. I will update the description; I see a lot more failing.
OK, it looks like reproducing this requires more than 1 executor, or the default --master local[*].
This is because you are writing different data on each iteration (CPU and GPU), so the results differ simply because different data was written each time.
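Beyond the differing input data, LIMIT itself is order-dependent when the data spans multiple partitions: whichever partitions happen to supply rows first determine the result. A minimal sketch of this effect, in plain Python with hypothetical partition contents (no Spark required), simulating how a distributed LIMIT takes rows in arrival order:

```python
import random

def distributed_limit(partitions, n, seed):
    """Simulate LIMIT n over unordered partitions: consume rows from
    partitions in whatever order they happen to 'arrive'."""
    order = list(range(len(partitions)))
    random.Random(seed).shuffle(order)  # arrival order varies per run
    rows = []
    for i in order:
        for row in partitions[i]:
            rows.append(row)
            if len(rows) == n:
                return rows
    return rows

# Hypothetical data split across two "executors"
partitions = [[1, 2, 3], [10, 20, 30]]

run_a = distributed_limit(partitions, 3, seed=0)
run_b = distributed_limit(partitions, 3, seed=1)
# Same input data and same LIMIT, yet the selected rows can differ
# between runs, so CPU vs GPU comparisons must sort or avoid LIMIT.
```

This is why tests comparing CPU and GPU output should either sort the results or avoid relying on LIMIT ordering across more than one executor.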
Describe the bug
Using the limit inside a function that reads a parquet file results in inconsistent results.
Steps/Code to reproduce bug
Expected behavior
The above test should pass against both Spark 3.0.0 and Spark 3.1.0.