Fix column->row conversion GPU check (#5190)
Fixes #5181.

The change in #5122 modifies the column->row conversion code to run
differently on Pascal GPUs. It does so by checking the GPU architecture
to detect whether the current device is Pascal. In a distributed setup,
this check must run on the Spark executor; as it currently stands, the
check erroneously runs on the driver.

This fix postpones the GPU architecture check until the executor task
attempts to fetch result rows.
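The driver-vs-executor pitfall can be sketched in plain Scala: if a closure captures the result of a device probe when it is built (on the driver), every executor inherits the driver's answer; deferring the probe into the closure body makes it run where the task runs. A minimal sketch, with all names hypothetical (this is not the plugin's actual API):

```scala
// Hypothetical sketch: a check evaluated when a closure is BUILT (driver)
// vs. when it RUNS (executor). `currentDeviceIsPascal` stands in for a
// per-JVM GPU architecture probe, which may differ across cluster nodes.
object DeviceCheckSketch {
  var currentDeviceIsPascal: Boolean = true

  // Buggy shape: the probe runs where the closure is constructed.
  def buildEager(): Iterator[Int] => Iterator[Int] = {
    val accelerated = !currentDeviceIsPascal // evaluated now, on the "driver"
    batches => if (accelerated) batches.map(_ * 2) else batches
  }

  // Fixed shape: the probe runs inside the closure body, on the "executor".
  def buildDeferred(): Iterator[Int] => Iterator[Int] =
    batches => if (!currentDeviceIsPascal) batches.map(_ * 2) else batches
}
```

Building both closures while the flag says "Pascal" and then flipping it to emulate a non-Pascal executor shows the eager closure still taking the driver's path, while the deferred one picks the executor's.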

Signed-off-by: MithunR <mythrocks@gmail.com>
mythrocks authored Apr 12, 2022
1 parent cb8f79d commit 398a07f
Showing 1 changed file with 11 additions and 7 deletions.
```diff
@@ -369,17 +369,21 @@ object GpuColumnarToRowExecParent {
         // This number includes the 1-bit validity per column, but doesn't include padding.
         // We are being conservative by only allowing 100M columns until we feel the need to
         // increase this number
-        output.length <= 100000000 &&
+        output.length <= 100000000) {
+      (batches: Iterator[ColumnarBatch]) => {
+        // UnsafeProjection is not serializable so do it on the executor side
+        val toUnsafe = UnsafeProjection.create(output, output)
         // Work around {@link https://github.com/rapidsai/cudf/issues/10569}, where CUDF JNI
         // acceleration of column->row transposition produces incorrect results on certain
         // GPU architectures.
         // Check that the accelerated transpose works correctly on the current CUDA device.
-        isAcceleratedTransposeSupported) {
-      (batches: Iterator[ColumnarBatch]) => {
-        // UnsafeProjection is not serializable so do it on the executor side
-        val toUnsafe = UnsafeProjection.create(output, output)
-        new AcceleratedColumnarToRowIterator(output, batches, numInputBatches, numOutputRows,
-          opTime, streamTime).map(toUnsafe)
+        if (isAcceleratedTransposeSupported) {
+          new AcceleratedColumnarToRowIterator(output, batches, numInputBatches, numOutputRows,
+            opTime, streamTime).map(toUnsafe)
+        } else {
+          new ColumnarToRowIterator(batches,
+            numInputBatches, numOutputRows, opTime, streamTime).map(toUnsafe)
+        }
       }
     } else {
       (batches: Iterator[ColumnarBatch]) => {
```
