feat(gatsby-source-filesystem): allow adjusting createRemoteFileNode retry/timeout settings via env vars #24535
Conversation
For developers with an awful internet connection (like me), the default 30000 ms timeouts are too short when using createRemoteFileNode. Out of 300 files, it would fail on 1-4 images due to timeouts and break the build process. Thus, it would be useful to override the default remote file download settings (retry limits and timeouts) with .env variables. For my specific case, just doubling all the values solved every issue.
removed ";"s
What values are you using now? I wonder if we should just bump the defaults, so this can be more failure-proof out of the box for flaky connections.
That said, I do see value in users being able to define their own, so I'm overall OK with this, but we should document it somewhere. We do have a note for this already: it is currently a bit weirdly placed in the "Options" section, but it's really just an "option" for createRemoteFileNode.
This can also fix #24856
This should be it, then. Do I need to expand the documentation or change the defaults? GATSBY_CONCURRENT_DOWNLOAD=2 (could probably bump it to 10 now), GATSBY_STALL_RETRY_LIMIT=6
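For reference, the values quoted in this comment could be placed in a `.env` file like the one below (the variable names come from this thread; the comments describing what each one does are my reading of the discussion, not official docs):

```shell
# GATSBY_CONCURRENT_DOWNLOAD caps how many remote files are fetched in
# parallel; GATSBY_STALL_RETRY_LIMIT raises how many times a stalled
# download is retried before failing the build.
GATSBY_CONCURRENT_DOWNLOAD=2
GATSBY_STALL_RETRY_LIMIT=6
```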
A related PR resolved the timeout issue I was hitting, which seemed to come from the timeout starting before the requests were actually made/sent. I still had trouble getting all images to consistently download (only 24 images), triggering a retry failure. In my case, the retries weren't likely to be that helpful: I have a ~10 Mbps (~1 MB/sec) download connection over wifi (G or N; it's an old device from 2008). The default number of concurrent requests was too much; I only tried the example value. It would probably be a useful feature if failures lowered the allowed concurrent requests (e.g. halving them).
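The halving idea suggested here could be sketched like this (purely illustrative; `nextConcurrency` is a made-up helper, not part of gatsby-source-filesystem):

```javascript
// Hypothetical helper: when a batch of downloads fails, halve the
// allowed concurrency (never dropping below 1); on success, keep it.
function nextConcurrency(current, hadFailure) {
  if (!hadFailure) return current
  return Math.max(1, Math.floor(current / 2))
}

// e.g. starting from 200 concurrent downloads, repeated failures would
// step the limit down: 200 -> 100 -> 50 -> ... -> 1
```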
Setting GATSBY_CONCURRENT_DOWNLOAD didn't solve my issue, which is why I had to add these env variables. Even when set to 1, it would miss some files. I really don't know why, but every time some files time out for me at 30s, yet all work fine with the timeout set to 60.
@webbugt yes, the related PR I linked to helps address that issue. It delays the timeout timer from starting until the request is made, not when it's added to the queue (something like that). It'd be great if either one of these PRs got merged, though.
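The behavior described here can be illustrated with a small sketch (not the linked PR's actual code; `queue.add` stands in for whatever task queue the plugin uses, and `fetchFn` is a hypothetical download function):

```javascript
// Start the timeout clock inside the queued task, so time spent
// waiting behind other downloads does not count toward the timeout.
function queueDownload(queue, url, timeoutMs, fetchFn) {
  return queue.add(() =>
    Promise.race([
      fetchFn(url),
      new Promise((_, reject) => {
        // This timer only starts once the task is dequeued and run.
        const t = setTimeout(
          () => reject(new Error(`stalled after ${timeoutMs}ms`)),
          timeoutMs
        )
        // Let the process exit even if the download already won the race.
        if (t.unref) t.unref()
      }),
    ])
  )
}
```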
Let's get this in. The solution discussed here (auto-adjusting concurrency settings in response to failures) sounds good, but this can act as a kind of stopgap solution.
Holy buckets, @webbugt — we just merged your PR to Gatsby! 💪💜 Gatsby is built by awesome people like you. Let us say “thanks” in two ways:
If there’s anything we can do to help, please don’t hesitate to reach out to us: tweet at @gatsbyjs and we’ll come a-runnin’. Thanks again!
feat(gatsby-source-filesystem): allow adjusting createRemoteFileNode retry/timeout settings via env vars (gatsbyjs#24535)

* (gatsby-source-filesystem) Remote File Node env variables: For developers with awful internet connection (like me), when using createRemoteFileNode, default timeouts of 30000ms are too short. Out of 300 files, it'd fail 1-4 images due to timeout and mess up the build process. Thus, it would be useful to overwrite default remote file download settings (retry limits and timeouts) with .env variables. For my specific case, just doubling all the values solved all issues.
* Update create-remote-file-node.js: removed ";"s
* updated readme
* update

Co-authored-by: Michal Piechowiak <misiek.piechowiak@gmail.com>
Description
For developers with an awful internet connection (like me), the default 30000 ms timeouts are too short when using createRemoteFileNode. Out of 300 files, it would fail on 1-4 images due to timeouts and break the build process.
Thus, it would be useful to override the default remote file download settings (retry limits and timeouts) with .env variables.
For my specific case, just doubling all the values solved every issue.
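The pattern described here, reading overrides from env vars and falling back to the built-in defaults, can be roughly sketched as follows. The exact names and defaults in the merged code may differ: GATSBY_STALL_RETRY_LIMIT is quoted in the thread and 30000 ms is the default timeout mentioned above, but GATSBY_STALL_TIMEOUT and the retry default of 3 are my assumptions for illustration.

```javascript
// Read a numeric setting from an env var, falling back to a default
// when the variable is unset or not a valid number.
function envNumber(name, fallback) {
  const raw = process.env[name]
  const parsed = Number(raw)
  return raw !== undefined && Number.isFinite(parsed) ? parsed : fallback
}

// 30000 ms is the timeout mentioned in the thread; the retry-limit
// default and the GATSBY_STALL_TIMEOUT name are assumptions.
const STALL_TIMEOUT = envNumber("GATSBY_STALL_TIMEOUT", 30000)
const STALL_RETRY_LIMIT = envNumber("GATSBY_STALL_RETRY_LIMIT", 3)
```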