Extend the Parquet writer's dictionary encoding benchmark. (#16591)
This PR extends the data cardinality and run-length ranges for the existing Parquet writer encoding benchmark.

Authors:
  - Muhammad Haseeb (https://github.com/mhaseeb123)

Approvers:
  - Vukasin Milovanovic (https://github.com/vuule)
  - Karthikeyan (https://github.com/karthikeyann)

URL: #16591
mhaseeb123 authored Sep 10, 2024
1 parent 92f0197 commit f21979e
Showing 1 changed file with 2 additions and 2 deletions.
4 changes: 2 additions & 2 deletions cpp/benchmarks/io/parquet/parquet_writer.cpp
@@ -202,8 +202,8 @@ NVBENCH_BENCH_TYPES(BM_parq_write_encode, NVBENCH_TYPE_AXES(d_type_list))
   .set_name("parquet_write_encode")
   .set_type_axes_names({"data_type"})
   .set_min_samples(4)
-  .add_int64_axis("cardinality", {0, 1000})
-  .add_int64_axis("run_length", {1, 32});
+  .add_int64_axis("cardinality", {0, 1000, 10'000, 100'000})
+  .add_int64_axis("run_length", {1, 8, 32});
 
 NVBENCH_BENCH_TYPES(BM_parq_write_io_compression, NVBENCH_TYPE_AXES(io_list, compression_list))
   .set_name("parquet_write_io_compression")
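For context, the values passed to add_int64_axis become per-axis sweep points: nvbench runs the benchmark once for every combination of data_type, cardinality, and run_length, and the body reads each value by axis name. The following is a minimal standalone sketch of that pattern, not the cuDF benchmark itself; the function name BM_encode_sketch and its body are hypothetical placeholders.

// Minimal nvbench sketch showing how add_int64_axis values are consumed.
// Assumes only the nvbench API; data generation/encoding is left abstract.
#include <nvbench/nvbench.cuh>

static void BM_encode_sketch(nvbench::state& state)
{
  // Each (cardinality, run_length) combination yields one benchmark
  // configuration; the values are looked up by the axis names below.
  auto const cardinality = state.get_int64("cardinality");
  auto const run_length  = state.get_int64("run_length");

  state.exec([&](nvbench::launch&) {
    // Hypothetical body: generate a table with the requested cardinality
    // and average run length, then time the encode step.
    (void)cardinality;
    (void)run_length;
  });
}

NVBENCH_BENCH(BM_encode_sketch)
  .set_name("encode_sketch")
  .add_int64_axis("cardinality", {0, 1000, 10'000, 100'000})
  .add_int64_axis("run_length", {1, 8, 32});

With the extended axes, the encode benchmark now covers 4 x 3 = 12 cardinality/run-length combinations per data type instead of the previous 2 x 2 = 4.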
