
Move GpuWindowInPandasExec in shims layers #13

Status: Closed. Wanted to merge 1 commit from the add-udf-shims branch.

Conversation

NvTimLiu (Owner) commented on Sep 25, 2020:

NVIDIA#844: Add the Spark 3.1.0 shim layer for GpuWindowInPandasExec

NvTimLiu force-pushed the add-udf-shims branch 3 times, most recently from 28c6428 to ca31b84, on September 25, 2020 09:35
NvTimLiu changed the title from "Move GpuWindowInPandasExec.scala in shims layers" to "Move GpuWindowInPandasExec in shims layers" on Sep 25, 2020
NvTimLiu force-pushed the add-udf-shims branch 2 times, most recently from 843ad2d to ccef32d, on September 25, 2020 09:39
  import org.apache.spark.sql.internal.StaticSQLConf
  import org.apache.spark.sql.rapids.{GpuFileSourceScanExec, ShuffleManagerShimBase}
  import org.apache.spark.sql.rapids.execution.GpuBroadcastNestedLoopJoinExecBase
- import org.apache.spark.sql.rapids.shims.spark310.{GpuInMemoryTableScanExec, ShuffleManagerShim}
+ import org.apache.spark.sql.rapids.shims.spark310._


Review comment: any reason to change this line?

NvTimLiu (Owner, Author) replied:

Add the new file: spark310/src/main/scala/org/apache/spark/sql/rapids/shims/spark310/GpuWindowInPandasExec.scala
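
For context, a minimal, hypothetical sketch of what such a Spark 3.1.0-specific file in the org.apache.spark.sql.rapids.shims.spark310 package could look like. This is not the plugin's actual implementation: to stay self-contained it extends Spark's UnaryExecNode rather than the spark-rapids base classes, and the constructor fields are assumptions modeled on Spark's own WindowInPandasExec.

package org.apache.spark.sql.rapids.shims.spark310

import org.apache.spark.rdd.RDD
import org.apache.spark.sql.catalyst.InternalRow
import org.apache.spark.sql.catalyst.expressions.{Attribute, Expression, NamedExpression, SortOrder}
import org.apache.spark.sql.execution.{SparkPlan, UnaryExecNode}

// Hypothetical sketch of a version-specific shim operator. The real plugin
// class extends spark-rapids base classes not shown in this conversation;
// extending UnaryExecNode here keeps the example compilable on its own.
case class GpuWindowInPandasExec(
    windowExpression: Seq[NamedExpression], // window functions to evaluate
    partitionSpec: Seq[Expression],         // PARTITION BY expressions
    orderSpec: Seq[SortOrder],              // ORDER BY expressions
    child: SparkPlan)                       // child plan feeding the window
  extends UnaryExecNode {

  // Output is the child's columns plus one attribute per window expression.
  override def output: Seq[Attribute] =
    child.output ++ windowExpression.map(_.toAttribute)

  // Placeholder: the real operator would evaluate the window Pandas UDFs on
  // the GPU here; left unimplemented in this illustrative sketch.
  override protected def doExecute(): RDD[InternalRow] =
    throw new UnsupportedOperationException("illustrative sketch only")
}

Keeping this class in the spark310 package (and importing it via the wildcard shown in the diff above) is what lets version-specific API differences stay isolated from the common code base.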

NvTimLiu closed this on Sep 25, 2020
NvTimLiu deleted the add-udf-shims branch on September 25, 2020 22:09