
Fixed Decimal 128 bug in ParquetCachedBatchSerializer #4899

Merged — razajafri merged 3 commits into NVIDIA:branch-22.04 from SR-4826-cache-test on Mar 8, 2022

Conversation

razajafri (Collaborator)

This PR fixes the way we were reading the parquet schema on the CPU. The problem was that SparkToParquetSchemaConverter doesn't honor the 8-byte alignment; instead it shrinks the schema to the minimum number of bytes needed to store the value, which caused PCBS to not read the entire value and resulted in test failures.

fixes #4826

Signed-off-by: Raza Jafri <rjafri@nvidia.com>
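For context, here is a minimal sketch of the byte-width mismatch described above. This is illustrative only, not code from this PR; `minBytesForPrecision` is a stand-in name for the sizing rule the CPU-side schema converter applies, while the GPU-written cache stores Decimal128 values in a full 16 bytes.

```scala
// Illustrative sketch (assumption: mirrors the minimum-byte-width rule used by
// the CPU-side Parquet schema converter; not the actual spark-rapids fix).
// A decimal of a given precision is stored as FIXED_LEN_BYTE_ARRAY using the
// smallest byte count whose signed two's-complement range covers 10^precision.
object DecimalWidthSketch {
  def minBytesForPrecision(precision: Int): Int = {
    var numBytes = 1
    // Grow until 2^(8*numBytes - 1) can represent 10^precision unscaled values.
    while (math.pow(2.0, 8 * numBytes - 1) < math.pow(10.0, precision)) {
      numBytes += 1
    }
    numBytes
  }

  def main(args: Array[String]): Unit = {
    // precision 20 fits in 9 bytes, precision 38 needs the full 16 bytes.
    // The GPU-written cache uses 16 bytes for Decimal128, so a narrower
    // CPU-side schema caused PCBS to not read the entire value.
    Seq(10, 20, 38).foreach { p =>
      println(s"precision $p -> ${minBytesForPrecision(p)} bytes")
    }
  }
}
```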

@razajafri (Collaborator, Author)

build

1 similar comment

@razajafri (Collaborator, Author)

build

revans2 previously approved these changes Mar 7, 2022
@sameerz sameerz added the bug Something isn't working label Mar 8, 2022
@razajafri (Collaborator, Author)

build

@jlowe jlowe added this to the Feb 28 - Mar 18 milestone Mar 8, 2022
@razajafri razajafri merged commit a0aeaba into NVIDIA:branch-22.04 Mar 8, 2022
@razajafri razajafri deleted the SR-4826-cache-test branch March 8, 2022 15:57
@firestarman (Collaborator) commented Mar 9, 2022

Labels
bug Something isn't working
Development

Successfully merging this pull request may close these issues:

[BUG] cache_test failures when testing with 128-bit decimal
6 participants