Fix CustomShuffleReader replacement when decimal types enabled #1685
Conversation
Signed-off-by: sperlingxx <lovedreamf@gmail.com>
build
Copyrights need to be updated but otherwise lgtm.
@@ -31,6 +31,7 @@ import org.apache.spark.sql.execution.joins.SortMergeJoinExec
+import org.apache.spark.sql.functions.{col, when}
2021 copyrights
Fixed.
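The hunk above adds col and when to the test imports. For illustration only, a hypothetical fragment (not the PR's actual test) showing the kind of conditional expression over decimal columns those helpers enable:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, when}

object WhenColSketch extends App {
  val spark = SparkSession.builder().master("local[*]").getOrCreate()
  import spark.implicits._

  val df = Seq((1, BigDecimal("10.00")), (12, BigDecimal("20.00")))
    .toDF("quantity", "price")

  // Apply a 10% discount only when quantity exceeds 10.
  df.withColumn(
    "discounted",
    when(col("quantity") > 10, col("price") * 0.9).otherwise(col("price")))
    .show()
}
```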
@sperlingxx Can this impact other types too? We support shuffling nested types like arrays, maps, and structs, but GpuCustomShuffleReader does not.
Thanks @sperlingxx. I applied this change to my branch and ran quite a few queries with decimal + AQE enabled and it worked well.
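For context, a minimal sketch of the configuration such a test run would use. The spark.rapids.* names are the plugin's flags as commonly documented; treat the decimal flag name in particular as an assumption:

```scala
// Assumed flags for a decimal + AQE run; `spark` is an existing SparkSession.
spark.conf.set("spark.sql.adaptive.enabled", "true")           // turn on AQE
spark.conf.set("spark.rapids.sql.enabled", "true")             // RAPIDS plugin on
spark.conf.set("spark.rapids.sql.decimalType.enabled", "true") // decimal support (assumed flag name)
```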
Is there a way to make this fail in a way that lets us fall back to the CPU?
Hi @revans2, I added ArrayType and StructType as supported types. MapType is not supported in …
I believe when we configure …
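To make the supported-type discussion concrete, here is a minimal sketch of the kind of recursive check involved. isSupportedType is an illustrative stand-in, not the plugin's actual API:

```scala
import org.apache.spark.sql.types._

// Illustrative only: recurse into nested types, accept decimals, reject maps.
def isSupportedType(dt: DataType): Boolean = dt match {
  case BooleanType | ByteType | ShortType | IntegerType | LongType |
       FloatType | DoubleType | StringType | DateType | TimestampType => true
  case _: DecimalType => true                       // the fix: decimals were previously rejected
  case ArrayType(elem, _) => isSupportedType(elem)  // element type must also be supported
  case StructType(fields) => fields.forall(f => isSupportedType(f.dataType))
  case _ => false                                   // MapType and anything else => CPU fallback
}
```

A plan whose output fails such a check would be tagged to stay on the CPU rather than being replaced with the GPU reader.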
build
docs/supported_ops.md (Outdated)
@@ -9,7 +9,7 @@ support all data types. The RAPIDS Accelerator for Apache Spark has further
restrictions on what types are supported for processing. This tries
to document what operations are supported and what data types each operation supports.
Because Apache Spark is under active development too and this document was generated
-against version 3.0.0 of Spark. Most of this should still
+against version 3.0.1 of Spark. Most of this should still
The default is still to use 3.0.0 for building the docs. What happened to change it to 3.0.1 when the build itself did not change?
My bad. I ran mvn verify with -Dspark.version=3.0.1 to update the doc.
Signed-off-by: sperlingxx <lovedreamf@gmail.com>
build
…IA#1685) Signed-off-by: sperlingxx <lovedreamf@gmail.com>
This pull request fixes a bug in AQE with decimal types, reported in issue #1670.
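A hypothetical repro along the lines of issue #1670: with AQE on, the shuffle for a join over decimal data is wrapped in a CustomShuffleReaderExec, which the plugin must either replace with its GPU counterpart or fall back on. Config names are assumed, as above:

```scala
import org.apache.spark.sql.SparkSession

object AqeDecimalRepro extends App {
  val spark = SparkSession.builder()
    .appName("aqe-decimal-repro")
    .master("local[*]")
    .config("spark.sql.adaptive.enabled", "true")
    .config("spark.rapids.sql.decimalType.enabled", "true") // assumed flag name
    .getOrCreate()
  import spark.implicits._

  val orders = Seq((1, BigDecimal("1.23")), (2, BigDecimal("4.56")))
    .toDF("id", "amount")
  val labels = Seq((1, "a"), (2, "b")).toDF("id", "label")

  // The shuffle introduced by this join is what AQE wraps in a
  // CustomShuffleReaderExec; before the fix, decimal output types made
  // the GPU replacement fail instead of falling back cleanly.
  orders.join(labels, "id").collect()
}
```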