Add config for cast float to integral types #1413
Conversation
Signed-off-by: Andy Grove <andygrove@nvidia.com>
build
Signed-off-by: Andy Grove <andygrove@nvidia.com>
build
I am rather confused by this. The issue is only relevant for Spark 3.1+ and also appears to show up only for ANSI cast, not the regular cast. If that is true, can we make the config check happen only on Spark 3.1+ AnsiCast? I already moved AnsiCast to the shim layer for a similar issue, so we should be able to subclass the meta to do this. Also, it would be great to document what we do support instead of just what Spark supports, so an end user can know whether it is OK to turn it on.
Signed-off-by: Andy Grove <andygrove@nvidia.com>
Signed-off-by: Andy Grove <andygrove@nvidia.com>
Thanks @revans2. I have partially addressed this but have some questions. I moved the checks to the 310 shim, but I did this by overriding CastExprMeta in place. Would it be better to extend the CastChecks instead? Also, in this case it would be nice for the default setting to differ depending on the Spark version being used, but I don't think we have a mechanism for doing this. Do you think we should consider adding one?
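The version-dependent default idea raised above could be sketched as follows. This is a minimal illustration under assumed names (the function and the version-string format are hypothetical, not the project's actual RapidsConf API):

```python
# Hypothetical sketch: a config default computed from the Spark version.
# Illustrates the idea of defaulting a castFloatToIntegralTypes-style flag
# to the safe value only on Spark 3.1+, where the cast behavior changed.

def default_cast_float_to_integral(spark_version: str) -> bool:
    """Return an assumed default for the flag.

    On Spark 3.1+ the float->integral cast semantics changed, so the
    plugin would default to False (user must opt in); on earlier
    versions the cast is already compatible, so default to True.
    """
    major, minor = (int(part) for part in spark_version.split(".")[:2])
    return (major, minor) < (3, 1)
```

A mechanism like this would keep the config name stable across versions while letting each shim supply its own default.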
@@ -286,7 +299,7 @@ The following formats/patterns are supported on the GPU. Timezone of UTC is assu
 | `"tomorrow"` | Yes |
 | `"yesterday"` | Yes |

-## String to Timestamp
+### String to Timestamp
This is unrelated but I thought it was worth fixing while I was here.
Signed-off-by: Andy Grove <andygrove@nvidia.com>
I like the way this has been done. No need to change it. I have not thought about configs that depend on the version of Spark being used at all. If we think we are going to run into more compatibility issues over time that we cannot work around, then it might be interesting to explore, but for now I don't see a lot of value in it, assuming we have explained clearly that a given config is only used if the version is ....
shims/spark310/src/main/scala/com/nvidia/spark/rapids/shims/spark310/Spark310Shims.scala
…later Signed-off-by: Andy Grove <andygrove@nvidia.com>
Signed-off-by: Andy Grove <andygrove@nvidia.com>
build
* add config for cast float to int
* enable castFloatToIntegralTypes in nightly qa tests
* Move checks to 310 shim
* Improve docs
* docs update
* further clarification in the docs that this only applies to 3.1.0 and later
* Move Cast override to spark 300 shim for consistency

Signed-off-by: Andy Grove <andygrove@nvidia.com>
…IDIA#1413) Signed-off-by: spark-rapids automation <70000568+nvauto@users.noreply.github.com>
Signed-off-by: Andy Grove <andygrove@nvidia.com>
As of Spark 3.1.0, we are no longer 100% compatible when casting floating point types to integral types due to changes in Spark. See docs in PR for more info.
Closes #1271
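To make the compatibility note concrete, here is a rough sketch of the two float-to-int cast behaviors being contrasted: saturating truncation versus an ANSI-style range check. This is an illustration of the general semantics only, not Spark's or the plugin's actual implementation:

```python
import math

INT_MIN, INT_MAX = -(2**31), 2**31 - 1

def saturating_cast(value: float) -> int:
    """Legacy-style cast: NaN becomes 0, out-of-range values clamp to
    the int bounds, everything else truncates toward zero."""
    if math.isnan(value):
        return 0
    if value <= INT_MIN:
        return INT_MIN
    if value >= INT_MAX:
        return INT_MAX
    return int(value)  # int() truncates toward zero

def ansi_style_cast(value: float) -> int:
    """ANSI-style cast: NaN or out-of-range input raises instead of
    silently clamping, which is where subtle incompatibilities between
    CPU and GPU results can arise."""
    if math.isnan(value) or value < INT_MIN or value > INT_MAX:
        raise ArithmeticError(f"cast of {value} to int overflows")
    return int(value)
```

The two functions agree for in-range inputs; they diverge exactly on NaN and out-of-range values, which is why a config gating this cast is useful.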