
[BUG] Spark 3.3 IT cache_test:test_passing_gpuExpr_as_Expr failure #4960

Closed
tgravescs opened this issue Mar 16, 2022 · 2 comments

Labels
bug Something isn't working · P0 Must have for release

@tgravescs (Collaborator)

Describe the bug

01:59:04  FAILED ../../src/main/python/cache_test.py::test_passing_gpuExpr_as_Expr[{'spark.sql.inMemoryColumnarStorage.enableVectorizedReader': 'true'}][ALLOW_NON_GPU(CollectLimitExec)]
01:59:04  FAILED ../../src/main/python/cache_test.py::test_passing_gpuExpr_as_Expr[{'spark.sql.inMemoryColumnarStorage.enableVectorizedReader': 'false'}][ALLOW_NON_GPU(CollectLimitExec)]

12:59:04  E                   : org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 4.0 failed 1 times, most recent failure: Lost task 0.0 in stage 4.0 (TID 26) (10.233.67.146 executor 0): java.lang.IllegalArgumentException: For input string: "null"
12:59:04  E                   	at scala.collection.immutable.StringLike.parseBoolean(StringLike.scala:330)
12:59:04  E                   	at scala.collection.immutable.StringLike.toBoolean(StringLike.scala:289)
12:59:04  E                   	at scala.collection.immutable.StringLike.toBoolean$(StringLike.scala:289)
12:59:04  E                   	at scala.collection.immutable.StringOps.toBoolean(StringOps.scala:33)
12:59:04  E                   	at org.apache.spark.sql.execution.datasources.parquet.SparkToParquetSchemaConverter.<init>(ParquetSchemaConverter.scala:455)
12:59:04  E                   	at org.apache.spark.sql.execution.datasources.parquet.ParquetWriteSupport.init(ParquetWriteSupport.scala:114)
12:59:04  E                   	at com.nvidia.spark.rapids.shims.ParquetOutputFileFormat.getRecordWriter(ParquetCachedBatchSerializer.scala:1505)
12:59:04  E                   	at com.nvidia.spark.rapids.shims.ParquetCachedBatchSerializer$CachedBatchIteratorProducer$InternalRowToCachedBatchIterator.$anonfun$next$1(ParquetCachedBatchSerializer.scala:1247)
12:59:04  E                   	at org.apache.spark.sql.internal.SQLConf$.withExistingConf(SQLConf.scala:158)
12:59:04  E                   	at com.nvidia.spark.rapids.shims.ParquetCachedBatchSerializer$CachedBatchIteratorProducer$InternalRowToCachedBatchIterator.next(ParquetCachedBatchSerializer.scala:1247)
12:59:04  E                   	at com.nvidia.spark.rapids.shims.ParquetCachedBatchSerializer$CachedBatchIteratorProducer$InternalRowToCachedBatchIterator.next(ParquetCachedBatchSerializer.scala:1116)
12:59:04  E                   	at scala.collection.Iterator$$anon$10.next(Iterator.scala:461)
12:59:04  E                   	at org.apache.spark.storage.memory.MemoryStore.putIterator(MemoryStore.scala:224)
12:59:04  E                   	at org.apache.spark.storage.memory.MemoryStore.putIteratorAsValues(MemoryStore.scala:302)
12:59:04  E                   	at org.apache.spark.storage.BlockManager.$anonfun$doPutIterator$1(BlockManager.scala:1508)
12:59:04  E                   	at org.apache.spark.storage.BlockManager.org$apache$spark$storage$BlockManager$$doPut(BlockManager.scala:1435)
12:59:04  E                   	at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:1499)
12:59:04  E                   	at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:1322)
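The root cause in the trace above is Scala's `StringLike.toBoolean`, which accepts only "true"/"false" (case-insensitive) and throws `IllegalArgumentException` for anything else, including the literal string "null" that `SparkToParquetSchemaConverter` evidently received from a config lookup. A minimal standalone sketch of that failure mode (illustrative only, not the plugin's actual code path):

```scala
// Sketch of the failure mode seen in the stack trace: Scala's
// StringLike.toBoolean accepts only "true"/"false" (case-insensitive)
// and throws IllegalArgumentException for any other input, such as the
// string "null" produced when a config value is missing.
object ToBooleanRepro {
  def main(args: Array[String]): Unit = {
    println("true".toBoolean)   // a valid input parses fine

    try {
      "null".toBoolean          // same exception as in the trace above
    } catch {
      case e: IllegalArgumentException =>
        println(e.getMessage)   // For input string: "null"
    }
  }
}
```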
tgravescs added the labels bug (Something isn't working), ? - Needs Triage (Need team to review and classify), and P0 (Must have for release) on Mar 16, 2022
@firestarman (Collaborator)

@razajafri (Collaborator) commented Mar 22, 2022

Unable to reproduce.

It's passing on Blossom and locally on the latest snapshot of Spark 3.3.0.


4 participants