
[BUG] test_broadcast_hash_join_constant_keys failed in databricks runtimes #11266

Closed
pxLi opened this issue Jul 27, 2024 · 1 comment · Fixed by #11268
Labels
bug Something isn't working
pxLi commented Jul 27, 2024

Describe the bug
The test case added in #11244 fails in all Databricks runtimes (11, 12, 13):

[2024-07-27T09:30:47.296Z] FAILED ../../src/main/python/join_test.py::test_broadcast_hash_join_constant_keys[Left][DATAGEN_SEED=1722066549, TZ=UTC] - py4j.protocol.Py4JJavaError: An error occurred while calling o324822.count.
[2024-07-27T09:30:47.296Z] FAILED ../../src/main/python/join_test.py::test_broadcast_hash_join_constant_keys[LeftAnti][DATAGEN_SEED=1722066549, TZ=UTC] - py4j.protocol.Py4JJavaError: An error occurred while calling o325323.count.
[2024-07-27T09:30:47.283Z] E                   : org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 6633.0 failed 1 times, most recent failure: Lost task 0.0 in stage 6633.0 (TID 19790) (ip-10-59-241-231.us-west-2.compute.internal executor driver): java.lang.AssertionError: ColumnVectors can't be null or empty
[2024-07-27T09:30:47.283Z] E                   	at ai.rapids.cudf.Table.<init>(Table.java:57)
[2024-07-27T09:30:47.283Z] E                   	at com.nvidia.spark.rapids.GpuColumnVector.from(GpuColumnVector.java:524)
[2024-07-27T09:30:47.283Z] E                   	at org.apache.spark.sql.rapids.execution.ConditionalNestedLoopJoinIterator.$anonfun$computeNumJoinRows$2(GpuBroadcastNestedLoopJoinExecBase.scala:226)
[2024-07-27T09:30:47.283Z] E                   	at scala.runtime.java8.JFunction0$mcJ$sp.apply(JFunction0$mcJ$sp.java:23)
[2024-07-27T09:30:47.283Z] E                   	at com.nvidia.spark.rapids.RmmRapidsRetryIterator$.withRestoreOnRetry(RmmRapidsRetryIterator.scala:272)
[2024-07-27T09:30:47.283Z] E                   	at org.apache.spark.sql.rapids.execution.ConditionalNestedLoopJoinIterator.$anonfun$computeNumJoinRows$1(GpuBroadcastNestedLoopJoinExecBase.scala:226)
[2024-07-27T09:30:47.283Z] E                   	at scala.runtime.java8.JFunction0$mcJ$sp.apply(JFunction0$mcJ$sp.java:23)
[2024-07-27T09:30:47.284Z] E                   	at com.nvidia.spark.rapids.RmmRapidsRetryIterator$NoInputSpliterator.next(RmmRapidsRetryIterator.scala:395)
[2024-07-27T09:30:47.284Z] E                   	at com.nvidia.spark.rapids.RmmRapidsRetryIterator$RmmRapidsRetryIterator.next(RmmRapidsRetryIterator.scala:613)
[2024-07-27T09:30:47.284Z] E                   	at com.nvidia.spark.rapids.RmmRapidsRetryIterator$RmmRapidsRetryAutoCloseableIterator.next(RmmRapidsRetryIterator.scala:517)
[2024-07-27T09:30:47.284Z] E                   	at com.nvidia.spark.rapids.RmmRapidsRetryIterator$.drainSingleWithVerification(RmmRapidsRetryIterator.scala:291)
[2024-07-27T09:30:47.284Z] E                   	at com.nvidia.spark.rapids.RmmRapidsRetryIterator$.withRetryNoSplit(RmmRapidsRetryIterator.scala:185)
[2024-07-27T09:30:47.284Z] E                   	at org.apache.spark.sql.rapids.execution.ConditionalNestedLoopJoinIterator.computeNumJoinRows(GpuBroadcastNestedLoopJoinExecBase.scala:225)
[2024-07-27T09:30:47.284Z] E                   	at com.nvidia.spark.rapids.SplittableJoinIterator.$anonfun$setupNextGatherer$2(AbstractGpuJoinIterator.scala:226)
[2024-07-27T09:30:47.284Z] E                   	at com.nvidia.spark.rapids.Arm$.withResource(Arm.scala:30)
[2024-07-27T09:30:47.284Z] E                   	at com.nvidia.spark.rapids.SplittableJoinIterator.$anonfun$setupNextGatherer$1(AbstractGpuJoinIterator.scala:225)
[2024-07-27T09:30:47.284Z] E                   	at com.nvidia.spark.rapids.GpuMetric.ns(GpuExec.scala:184)
[2024-07-27T09:30:47.284Z] E                   	at com.nvidia.spark.rapids.SplittableJoinIterator.setupNextGatherer(AbstractGpuJoinIterator.scala:225)
[2024-07-27T09:30:47.284Z] E                   	at com.nvidia.spark.rapids.AbstractGpuJoinIterator.hasNext(AbstractGpuJoinIterator.scala:102)
[2024-07-27T09:30:47.284Z] E                   	at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:460)
[2024-07-27T09:30:47.284Z] E                   	at com.nvidia.spark.rapids.AbstractProjectSplitIterator.hasNext(basicPhysicalOperators.scala:233)
[2024-07-27T09:30:47.284Z] E                   	at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:460)
[2024-07-27T09:30:47.284Z] E                   	at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:460)
[2024-07-27T09:30:47.284Z] E                   	at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:460)
[2024-07-27T09:30:47.284Z] E                   	at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:491)
[2024-07-27T09:30:47.284Z] E                   	at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:460)
[2024-07-27T09:30:47.284Z] E                   	at com.nvidia.spark.rapids.GpuMergeAggregateIterator.$anonfun$next$2(GpuAggregateExec.scala:766)
[2024-07-27T09:30:47.284Z] E                   	at scala.Option.getOrElse(Option.scala:189)
[2024-07-27T09:30:47.284Z] E                   	at com.nvidia.spark.rapids.GpuMergeAggregateIterator.next(GpuAggregateExec.scala:762)
[2024-07-27T09:30:47.284Z] E                   	at com.nvidia.spark.rapids.GpuMergeAggregateIterator.next(GpuAggregateExec.scala:718)
[2024-07-27T09:30:47.284Z] E                   	at scala.collection.Iterator$$anon$10.next(Iterator.scala:461)
[2024-07-27T09:30:47.284Z] E                   	at com.nvidia.spark.rapids.DynamicGpuPartialSortAggregateIterator.$anonfun$next$10(GpuAggregateExec.scala:2124)
[2024-07-27T09:30:47.284Z] E                   	at scala.Option.map(Option.scala:230)
[2024-07-27T09:30:47.284Z] E                   	at com.nvidia.spark.rapids.DynamicGpuPartialSortAggregateIterator.next(GpuAggregateExec.scala:2124)
[2024-07-27T09:30:47.284Z] E                   	at com.nvidia.spark.rapids.DynamicGpuPartialSortAggregateIterator.next(GpuAggregateExec.scala:1975)
[2024-07-27T09:30:47.284Z] E                   	at org.apache.spark.sql.rapids.execution.GpuShuffleExchangeExecBase$$anon$1.partNextBatch(GpuShuffleExchangeExecBase.scala:333)
[2024-07-27T09:30:47.284Z] E                   	at org.apache.spark.sql.rapids.execution.GpuShuffleExchangeExecBase$$anon$1.hasNext(GpuShuffleExchangeExecBase.scala:355)
[2024-07-27T09:30:47.284Z] E                   	at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:151)
[2024-07-27T09:30:47.284Z] E                   	at org.apache.spark.shuffle.ShuffleWriteProcessor.write(ShuffleWriteProcessor.scala:59)
[2024-07-27T09:30:47.284Z] E                   	at org.apache.spark.scheduler.ShuffleMapTask.$anonfun$runTask$3(ShuffleMapTask.scala:81)
[2024-07-27T09:30:47.284Z] E                   	at com.databricks.spark.util.ExecutorFrameProfiler$.record(ExecutorFrameProfiler.scala:110)
[2024-07-27T09:30:47.284Z] E                   	at org.apache.spark.scheduler.ShuffleMapTask.$anonfun$runTask$1(ShuffleMapTask.scala:81)
[2024-07-27T09:30:47.284Z] E                   	at com.databricks.spark.util.ExecutorFrameProfiler$.record(ExecutorFrameProfiler.scala:110)
[2024-07-27T09:30:47.284Z] E                   	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:53)
[2024-07-27T09:30:47.284Z] E                   	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
[2024-07-27T09:30:47.284Z] E                   	at org.apache.spark.scheduler.Task.doRunTask(Task.scala:174)
[2024-07-27T09:30:47.284Z] E                   	at org.apache.spark.scheduler.Task.$anonfun$run$4(Task.scala:137)
[2024-07-27T09:30:47.284Z] E                   	at com.databricks.unity.EmptyHandle$.runWithAndClose(UCSHandle.scala:126)
[2024-07-27T09:30:47.285Z] E                   	at org.apache.spark.scheduler.Task.$anonfun$run$1(Task.scala:137)
[2024-07-27T09:30:47.285Z] E                   	at com.databricks.spark.util.ExecutorFrameProfiler$.record(ExecutorFrameProfiler.scala:110)
[2024-07-27T09:30:47.285Z] E                   	at org.apache.spark.scheduler.Task.run(Task.scala:96)
[2024-07-27T09:30:47.285Z] E                   	at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$13(Executor.scala:902)
[2024-07-27T09:30:47.285Z] E                   	at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1697)
[2024-07-27T09:30:47.285Z] E                   	at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$4(Executor.scala:905)
[2024-07-27T09:30:47.285Z] E                   	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
[2024-07-27T09:30:47.285Z] E                   	at com.databricks.spark.util.ExecutorFrameProfiler$.record(ExecutorFrameProfiler.scala:110)
[2024-07-27T09:30:47.285Z] E                   	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:760)
[2024-07-27T09:30:47.285Z] E                   	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
[2024-07-27T09:30:47.285Z] E                   	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
[2024-07-27T09:30:47.285Z] E                   	at java.lang.Thread.run(Thread.java:750)
[2024-07-27T09:30:47.285Z] E                   
[2024-07-27T09:30:47.285Z] E                   Driver stacktrace:
[2024-07-27T09:30:47.285Z] E                   	at org.apache.spark.scheduler.DAGScheduler.failJobAndIndependentStages(DAGScheduler.scala:3420)
[2024-07-27T09:30:47.285Z] E                   	at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2(DAGScheduler.scala:3342)
[2024-07-27T09:30:47.285Z] E                   	at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2$adapted(DAGScheduler.scala:3331)
[2024-07-27T09:30:47.285Z] E                   	at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
[2024-07-27T09:30:47.285Z] E                   	at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
[2024-07-27T09:30:47.285Z] E                   	at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
[2024-07-27T09:30:47.285Z] E                   	at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:3331)
[2024-07-27T09:30:47.285Z] E                   	at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1(DAGScheduler.scala:1439)
[2024-07-27T09:30:47.285Z] E                   	at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1$adapted(DAGScheduler.scala:1439)
[2024-07-27T09:30:47.285Z] E                   	at scala.Option.foreach(Option.scala:407)
[2024-07-27T09:30:47.285Z] E                   	at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:1439)
[2024-07-27T09:30:47.285Z] E                   	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:3631)
[2024-07-27T09:30:47.285Z] E                   	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:3569)
[2024-07-27T09:30:47.285Z] E                   	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:3557)
[2024-07-27T09:30:47.285Z] E                   	at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:51)
[2024-07-27T09:30:47.285Z] E                   Caused by: java.lang.AssertionError: ColumnVectors can't be null or empty
[2024-07-27T09:30:47.285Z] E                   	at ai.rapids.cudf.Table.<init>(Table.java:57)
[2024-07-27T09:30:47.285Z] E                   	at com.nvidia.spark.rapids.GpuColumnVector.from(GpuColumnVector.java:524)
[2024-07-27T09:30:47.285Z] E                   	at org.apache.spark.sql.rapids.execution.ConditionalNestedLoopJoinIterator.$anonfun$computeNumJoinRows$2(GpuBroadcastNestedLoopJoinExecBase.scala:226)
[2024-07-27T09:30:47.285Z] E                   	at scala.runtime.java8.JFunction0$mcJ$sp.apply(JFunction0$mcJ$sp.java:23)
[2024-07-27T09:30:47.285Z] E                   	at com.nvidia.spark.rapids.RmmRapidsRetryIterator$.withRestoreOnRetry(RmmRapidsRetryIterator.scala:272)
[2024-07-27T09:30:47.285Z] E                   	at org.apache.spark.sql.rapids.execution.ConditionalNestedLoopJoinIterator.$anonfun$computeNumJoinRows$1(GpuBroadcastNestedLoopJoinExecBase.scala:226)
[2024-07-27T09:30:47.285Z] E                   	at scala.runtime.java8.JFunction0$mcJ$sp.apply(JFunction0$mcJ$sp.java:23)
[2024-07-27T09:30:47.285Z] E                   	at com.nvidia.spark.rapids.RmmRapidsRetryIterator$NoInputSpliterator.next(RmmRapidsRetryIterator.scala:395)
[2024-07-27T09:30:47.285Z] E                   	at com.nvidia.spark.rapids.RmmRapidsRetryIterator$RmmRapidsRetryIterator.next(RmmRapidsRetryIterator.scala:613)
[2024-07-27T09:30:47.285Z] E                   	at com.nvidia.spark.rapids.RmmRapidsRetryIterator$RmmRapidsRetryAutoCloseableIterator.next(RmmRapidsRetryIterator.scala:517)
[2024-07-27T09:30:47.285Z] E                   	at com.nvidia.spark.rapids.RmmRapidsRetryIterator$.drainSingleWithVerification(RmmRapidsRetryIterator.scala:291)
[2024-07-27T09:30:47.285Z] E                   	at com.nvidia.spark.rapids.RmmRapidsRetryIterator$.withRetryNoSplit(RmmRapidsRetryIterator.scala:185)
[2024-07-27T09:30:47.285Z] E                   	at org.apache.spark.sql.rapids.execution.ConditionalNestedLoopJoinIterator.computeNumJoinRows(GpuBroadcastNestedLoopJoinExecBase.scala:225)
[2024-07-27T09:30:47.286Z] E                   	at com.nvidia.spark.rapids.SplittableJoinIterator.$anonfun$setupNextGatherer$2(AbstractGpuJoinIterator.scala:226)
[2024-07-27T09:30:47.286Z] E                   	at com.nvidia.spark.rapids.Arm$.withResource(Arm.scala:30)
[2024-07-27T09:30:47.286Z] E                   	at com.nvidia.spark.rapids.SplittableJoinIterator.$anonfun$setupNextGatherer$1(AbstractGpuJoinIterator.scala:225)
[2024-07-27T09:30:47.286Z] E                   	at com.nvidia.spark.rapids.GpuMetric.ns(GpuExec.scala:184)
[2024-07-27T09:30:47.286Z] E                   	at com.nvidia.spark.rapids.SplittableJoinIterator.setupNextGatherer(AbstractGpuJoinIterator.scala:225)
[2024-07-27T09:30:47.286Z] E                   	at com.nvidia.spark.rapids.AbstractGpuJoinIterator.hasNext(AbstractGpuJoinIterator.scala:102)
[2024-07-27T09:30:47.286Z] E                   	at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:460)
[2024-07-27T09:30:47.286Z] E                   	at com.nvidia.spark.rapids.AbstractProjectSplitIterator.hasNext(basicPhysicalOperators.scala:233)
[2024-07-27T09:30:47.286Z] E                   	at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:460)
[2024-07-27T09:30:47.286Z] E                   	at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:460)
[2024-07-27T09:30:47.286Z] E                   	at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:460)
[2024-07-27T09:30:47.286Z] E                   	at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:491)
[2024-07-27T09:30:47.286Z] E                   	at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:460)
[2024-07-27T09:30:47.286Z] E                   	at com.nvidia.spark.rapids.GpuMergeAggregateIterator.$anonfun$next$2(GpuAggregateExec.scala:766)
[2024-07-27T09:30:47.286Z] E                   	at scala.Option.getOrElse(Option.scala:189)
[2024-07-27T09:30:47.286Z] E                   	at com.nvidia.spark.rapids.GpuMergeAggregateIterator.next(GpuAggregateExec.scala:762)
[2024-07-27T09:30:47.286Z] E                   	at com.nvidia.spark.rapids.GpuMergeAggregateIterator.next(GpuAggregateExec.scala:718)
[2024-07-27T09:30:47.286Z] E                   	at scala.collection.Iterator$$anon$10.next(Iterator.scala:461)
[2024-07-27T09:30:47.286Z] E                   	at com.nvidia.spark.rapids.DynamicGpuPartialSortAggregateIterator.$anonfun$next$10(GpuAggregateExec.scala:2124)
[2024-07-27T09:30:47.286Z] E                   	at scala.Option.map(Option.scala:230)
[2024-07-27T09:30:47.286Z] E                   	at com.nvidia.spark.rapids.DynamicGpuPartialSortAggregateIterator.next(GpuAggregateExec.scala:2124)
[2024-07-27T09:30:47.286Z] E                   	at com.nvidia.spark.rapids.DynamicGpuPartialSortAggregateIterator.next(GpuAggregateExec.scala:1975)
[2024-07-27T09:30:47.286Z] E                   	at org.apache.spark.sql.rapids.execution.GpuShuffleExchangeExecBase$$anon$1.partNextBatch(GpuShuffleExchangeExecBase.scala:333)
[2024-07-27T09:30:47.286Z] E                   	at org.apache.spark.sql.rapids.execution.GpuShuffleExchangeExecBase$$anon$1.hasNext(GpuShuffleExchangeExecBase.scala:355)
[2024-07-27T09:30:47.286Z] E                   	at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:151)
[2024-07-27T09:30:47.286Z] E                   	at org.apache.spark.shuffle.ShuffleWriteProcessor.write(ShuffleWriteProcessor.scala:59)
[2024-07-27T09:30:47.286Z] E                   	at org.apache.spark.scheduler.ShuffleMapTask.$anonfun$runTask$3(ShuffleMapTask.scala:81)
[2024-07-27T09:30:47.286Z] E                   	at com.databricks.spark.util.ExecutorFrameProfiler$.record(ExecutorFrameProfiler.scala:110)
[2024-07-27T09:30:47.286Z] E                   	at org.apache.spark.scheduler.ShuffleMapTask.$anonfun$runTask$1(ShuffleMapTask.scala:81)
[2024-07-27T09:30:47.286Z] E                   	at com.databricks.spark.util.ExecutorFrameProfiler$.record(ExecutorFrameProfiler.scala:110)
[2024-07-27T09:30:47.286Z] E                   	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:53)
[2024-07-27T09:30:47.286Z] E                   	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
[2024-07-27T09:30:47.286Z] E                   	at org.apache.spark.scheduler.Task.doRunTask(Task.scala:174)
[2024-07-27T09:30:47.286Z] E                   	at org.apache.spark.scheduler.Task.$anonfun$run$4(Task.scala:137)
[2024-07-27T09:30:47.286Z] E                   	at com.databricks.unity.EmptyHandle$.runWithAndClose(UCSHandle.scala:126)
[2024-07-27T09:30:47.286Z] E                   	at org.apache.spark.scheduler.Task.$anonfun$run$1(Task.scala:137)
[2024-07-27T09:30:47.286Z] E                   	at com.databricks.spark.util.ExecutorFrameProfiler$.record(ExecutorFrameProfiler.scala:110)
[2024-07-27T09:30:47.286Z] E                   	at org.apache.spark.scheduler.Task.run(Task.scala:96)
[2024-07-27T09:30:47.286Z] E                   	at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$13(Executor.scala:902)
[2024-07-27T09:30:47.286Z] E                   	at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1697)
[2024-07-27T09:30:47.286Z] E                   	at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$4(Executor.scala:905)
[2024-07-27T09:30:47.286Z] E                   	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
[2024-07-27T09:30:47.286Z] E                   	at com.databricks.spark.util.ExecutorFrameProfiler$.record(ExecutorFrameProfiler.scala:110)
[2024-07-27T09:30:47.287Z] E                   	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:760)
[2024-07-27T09:30:47.287Z] E                   	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
[2024-07-27T09:30:47.287Z] E                   	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
[2024-07-27T09:30:47.287Z] E                   	at java.lang.Thread.run(Thread.java:750)

Steps/Code to reproduce bug
Run join_test.py::test_broadcast_hash_join_constant_keys in a Databricks runtime.
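A minimal sketch of one way to reproduce locally, assuming the spark-rapids integration-test layout and its run_pyspark_from_build.sh wrapper; the script name, environment variables, and pytest flag below are assumptions based on that layout, not confirmed by this issue:

```shell
# Hypothetical invocation; assumes the plugin jars are already built
# and that the integration_tests wrapper script accepts pytest args.
# Pin the seed and timezone from the failing run for determinism.
export DATAGEN_SEED=1722066549
export TZ=UTC
cd integration_tests
./run_pyspark_from_build.sh -k 'test_broadcast_hash_join_constant_keys'
```

The `-k` expression selects only the new test, so all of its parameterizations (Left, LeftAnti, etc.) run without the rest of join_test.py.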

Expected behavior
The test passes.


@pxLi added the labels bug (Something isn't working) and ? - Needs Triage (Need team to review and classify) on Jul 27, 2024
pxLi commented Jul 27, 2024

cc @jlowe to help
