Commit 4078281
Clamp the gzip fuzzer output even more aggressively.
Even at O(N) scaling, the fuzzer infrastructure is kinda slow.

Bug: 940393
Change-Id: I0261e80e0cb24fbcbced52cd7a1d1de7ad8af652
Reviewed-on: https://chromium-review.googlesource.com/c/chromium/src/+/1520883
Commit-Queue: David Benjamin <davidben@chromium.org>
Commit-Queue: Matt Menke <mmenke@chromium.org>
Auto-Submit: David Benjamin <davidben@chromium.org>
Reviewed-by: Matt Menke <mmenke@chromium.org>
Cr-Commit-Position: refs/heads/master@{#640465}
davidben authored and Commit Bot committed Mar 13, 2019
1 parent 74379f8 commit 4078281
Showing 1 changed file with 2 additions and 3 deletions.
net/filter/gzip_source_stream_fuzzer.cc (2 additions, 3 deletions)
@@ -24,10 +24,9 @@ extern "C" int LLVMFuzzerTestOneInput(const uint8_t* data, size_t size) {

   // Gzip has a maximum compression ratio of 1032x. While, strictly speaking,
   // linear, this means the fuzzer will often get stuck. Stop reading at a more
-  // modest compression ratio of 10x, or 2 MiB, whichever is larger. See
+  // modest compression ratio of 2x, or 512 KiB, whichever is larger. See
   // https://crbug.com/921075.
-  size_t max_output =
-      std::max(10u * size, static_cast<size_t>(2 * 1024 * 1024));
+  size_t max_output = std::max(2u * size, static_cast<size_t>(512 * 1024));

   const net::SourceStream::SourceType kGzipTypes[] = {
       net::SourceStream::TYPE_GZIP, net::SourceStream::TYPE_DEFLATE};
