
Set spark.executor.cores for integration tests. #9177

Closed

Commits on Sep 1, 2023

  1. Set spark.executor.cores for integration tests.

    Fixes NVIDIA#9135 (by way of a workaround).
    
    This change sets `spark.executor.cores` to `10` if it is unset. This allows the
    integration tests to work around the failure seen in `parquet_test.py:test_small_file_memory`,
    where the `COALESCING` Parquet reader's thread pool inadvertently spins up 128 threads, each with an
    8MB buffer, thus consuming the entire heap.
    
    Note that this is something of a workaround. A more robust solution would be to scale the Parquet
    reader's buffers based on the amount of available memory and the number of threads.
    
    Signed-off-by: MithunR <mythrocks@gmail.com>
    mythrocks committed Sep 1, 2023
    Commit: 97427b7
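To illustrate the idea in the commit message, here is a minimal sketch of how a test harness might apply such a default, assuming the setting is applied when the test session is built. The function name `build_test_session` and the `user_conf` parameter are hypothetical and not part of this PR; the actual change lives in the project's integration-test launch configuration rather than in code like this.

```python
from pyspark.sql import SparkSession

def build_test_session(user_conf: dict) -> SparkSession:
    """Build a Spark session for integration tests, defaulting
    spark.executor.cores to 10 unless the caller already set it."""
    builder = SparkSession.builder.appName("integration-tests")

    # Apply caller-supplied settings first.
    for key, value in user_conf.items():
        builder = builder.config(key, value)

    # Only supply the default when the caller has not chosen a value,
    # so the COALESCING Parquet reader's thread pool does not size
    # itself to every core on the host.
    if "spark.executor.cores" not in user_conf:
        builder = builder.config("spark.executor.cores", "10")

    return builder.getOrCreate()
```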

Commits on Sep 5, 2023

  1. Commit: 09c2b33