Do not retry on failure to fetch
While we fixed the infinite retrying earlier, we still have
problems if we retry in the middle of a transfer: we might
end up resuming downloads that are already done and reading
more than we should (removing the IsOpen() check so that
it always retries makes test-ubuntu-bug-1098738-apt-get-source-md5sum
fail with wrong file sizes).

I think the retrying was added to fix up pipelining mix-ups,
but we have better solutions for that now, so let's get rid of it
until we have implemented this properly.
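
To illustrate the failure mode described above, here is a minimal, hypothetical sketch (standalone C++, not apt code; all names are invented): if a finished transfer is blindly retried, the "resumed" body is appended to an item that is already complete, so the result is larger than the expected size and the size/checksum verification fails.

#include <cstddef>
#include <iostream>
#include <string>

// Hypothetical stand-in for a queue item; not apt's data structures.
struct Download
{
   std::size_t ExpectedSize = 0; // size announced for the item
   std::string Data;             // bytes written so far
};

// Pretend the server sends the body from the given offset on every
// (re)connection; a blind retry that forgets the transfer already
// finished passes a stale offset of 0.
static void FetchFrom(Download &D, const std::string &Body, std::size_t Offset)
{
   if (Offset < Body.size())
      D.Data += Body.substr(Offset);
}

int main()
{
   const std::string Body = "0123456789";
   Download D;
   D.ExpectedSize = Body.size();

   FetchFrom(D, Body, 0); // first attempt completes the download
   FetchFrom(D, Body, 0); // blind retry "resumes" an already-done item

   // Prints "20 bytes, expected 10": more was read than we should have.
   std::cout << D.Data.size() << " bytes, expected " << D.ExpectedSize << '\n';
   return 0;
}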
julian-klode committed Aug 10, 2020
1 parent 4b43920 commit fa37549
Showing 1 changed file with 13 additions and 20 deletions.
33 changes: 13 additions & 20 deletions methods/basehttp.cc
@@ -770,31 +770,24 @@ int BaseHttpMethod::Loop()
       }
       else
       {
-         if (Server->IsOpen() == false && FailCounter < 1)
+         if (not Server->IsOpen())
          {
-            FailCounter++;
-            Server->Close();
-            _error->Discard();
-
             // Reset the pipeline
             QueueBack = Queue;
             Server->PipelineAnswersReceived = 0;
-            continue;
          }
-         else
-         {
-            Server->Close();
-            FailCounter = 0;
-            switch (Result)
-            {
-            case ResultState::TRANSIENT_ERROR:
-               Fail(true);
-               break;
-            case ResultState::FATAL_ERROR:
-            case ResultState::SUCCESSFUL:
-               Fail(false);
-               break;
-            }
-         }
+
+         Server->Close();
+         FailCounter = 0;
+         switch (Result)
+         {
+         case ResultState::TRANSIENT_ERROR:
+            Fail(true);
+            break;
+         case ResultState::FATAL_ERROR:
+         case ResultState::SUCCESSFUL:
+            Fail(false);
+            break;
+         }
          break;
