Update GpuDataSourceScanExec and GpuBroadcastExchangeExec to fix audit issues #1760
Changes from 1 commit
@@ -29,7 +29,7 @@ import com.nvidia.spark.rapids._
 import com.nvidia.spark.rapids.GpuMetric._
 import com.nvidia.spark.rapids.RapidsPluginImplicits._

-import org.apache.spark.SparkException
+import org.apache.spark.{SparkContext, SparkException}
 import org.apache.spark.broadcast.Broadcast
 import org.apache.spark.launcher.SparkLauncher
 import org.apache.spark.rdd.RDD
@@ -265,7 +265,10 @@ abstract class GpuBroadcastExchangeExecBase(
 @transient
 private val timeout: Long = SQLConf.get.broadcastTimeout

-val _runId: UUID = UUID.randomUUID()
+// Cancelling a SQL statement from Spark ThriftServer needs to cancel
+// its related broadcast sub-jobs. So set the run id to job group id if exists.
+val _runId: UUID = Option(sparkContext.getLocalProperty(SparkContext.SPARK_JOB_GROUP_ID))
+  .map(UUID.fromString).getOrElse(UUID.randomUUID)

 @transient
 lazy val relationFuture: Future[Broadcast[Any]] = {

Review discussion:

> yes :)

> Yeah, generally speaking it would be better to separate issues into separate PRs.

> Note: it's fine for now, but something to keep in mind for the future.

> Thanks @tgravescs and @gerashegalov. Will post separate PRs for separate issues in the future.
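The run-id derivation in the hunk above can be sketched outside of Spark. This is a minimal illustration of the same pattern (parse an optional string property as a UUID, fall back to a random one), not the plugin's actual code; the `jobGroupId` helper is a hypothetical stand-in for `sparkContext.getLocalProperty(SparkContext.SPARK_JOB_GROUP_ID)`. Note that `UUID.fromString` throws `IllegalArgumentException` if the property is set but is not a valid UUID string.

```java
import java.util.Optional;
import java.util.UUID;

public class RunId {
    // Hypothetical stand-in for the Spark local property lookup;
    // returns null when no job group has been set on the submitting thread.
    static String jobGroupId(boolean set) {
        return set ? "123e4567-e89b-12d3-a456-426614174000" : null;
    }

    // Mirrors the logic in the diff: parse the job group id as a UUID when
    // present, otherwise fall back to a fresh random UUID.
    static UUID runId(String jobGroup) {
        return Optional.ofNullable(jobGroup)
                .map(UUID::fromString)
                .orElseGet(UUID::randomUUID);
    }

    public static void main(String[] args) {
        // With a job group set, the run id echoes the group id.
        System.out.println(runId(jobGroupId(true)));
        // Without one, a random UUID is generated instead.
        System.out.println(runId(jobGroupId(false)));
    }
}
```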
> I would rename sqlContext to sqlConf here, because you aren't passing the entire sqlContext in.

> Done.