
Deflake test #8153

Conversation

illicitonion
Contributor

The way backing in is implemented, we halve the backoff period per
successful request, rather than based on elapsed time. The
`expect_only_good` method makes an unpredictable number of requests,
filling a time window instead.

So this test would pass when the number of requests happened to align
with the time window (which appears to be common on Travis CI's CPUs), and
fail when they got out of sync.

We currently have no tests for the actual backing in behaviour, but I
think that's mostly ok; we have tests that show that we start using
servers again, and we can add more if we need to.

Fixes #7836
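For context, here is a minimal sketch of the request-count-driven backoff scheme described above. This is a hypothetical illustration, not the actual serverset code: the `Backoff` type, its field names, and the min/max clamping are assumptions made for the example.

```rust
use std::time::Duration;

// Hypothetical sketch (not the real serverset implementation): the backoff
// period doubles on each failed request and halves on each successful one,
// clamped to [min, max].
struct Backoff {
    current: Duration,
    min: Duration,
    max: Duration,
}

impl Backoff {
    fn new(min: Duration, max: Duration) -> Backoff {
        Backoff { current: min, min, max }
    }

    // On failure, double the backoff period (up to max).
    fn record_failure(&mut self) {
        self.current = std::cmp::min(self.current * 2, self.max);
    }

    // On success, halve the backoff period (down to min). Note this is
    // driven by the *count* of successful requests, not elapsed time, which
    // is why a test that fills a fixed time window with an unpredictable
    // number of requests can get out of sync with it and flake.
    fn record_success(&mut self) {
        self.current = std::cmp::max(self.current / 2, self.min);
    }
}

fn main() {
    let mut b = Backoff::new(Duration::from_millis(10), Duration::from_millis(1000));
    b.record_failure();
    b.record_failure();
    assert_eq!(b.current, Duration::from_millis(40));
    b.record_success();
    assert_eq!(b.current, Duration::from_millis(20));
    println!("ok");
}
```

Because recovery depends on how many successful requests happen to fit in the window, asserting on wall-clock timing makes the test sensitive to CPU speed, hence the flake.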

@illicitonion illicitonion merged commit 721595e into pantsbuild:master Aug 12, 2019
@illicitonion illicitonion deleted the dwagnerhall/deflaking/backoff_when_unhealthy branch August 12, 2019 08:21

Successfully merging this pull request may close these issues.

serverset::tests::backoff_when_unhealthy is flaky