[server / inferrer] prefetch configuration files #5529
Conversation
fdb007b to 1c50864 (Compare)
/werft run
👍 started the job as gitpod-build-se-short-branch-name.2
Didn't notice any difference in terms of UX. LGTM! ✔️
Approving to unblock merging this but holding a) because of one minor typo in the comments and b) in case @AlexTugarev wants to take a closer look at the code. 🏓
/hold
LGTM label has been added. Git tree hash: d68c7a7ce5a760e435aede1e5136b5eb7255ba8e
Co-authored-by: George Tsiolis <tsiolis.g@gmail.com>
New changes are detected. LGTM label has been removed.
/approve
That makes much sense!
[APPROVALNOTIFIER] This PR is APPROVED This pull-request has been approved by: AlexTugarev, gtsiolis Associated issue: #5527 The full list of commands accepted by this bot can be found here. The pull request process is described here
Needs approval from an approver in each of these files:
Approvers can indicate their approval by writing
```typescript
const content = await fileProvider.getFileContent(commitContext, user, path);
if (content) {
    cache[path] = content;
    return cache[path];
}
```
👍🏻
/hold cancel
Prefetch configuration files in parallel and in advance, based on a static cache.
Context: the inference logic is based on a conditional tree that asks about configuration files sequentially, one by one. Requesting all of those files in parallel should speed up the process.
fixes #5527
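The idea described above can be sketched as follows. This is a minimal TypeScript illustration, not the actual Gitpod code: the `FileProvider` shape, the `CONFIG_PATHS` list, and the `prefetch`/`getCached` helpers are all assumptions made for the example. The point is that all candidate configuration files are requested up front and in parallel, and the conditional tree then awaits already-in-flight promises from a static cache instead of issuing one sequential request per question.

```typescript
// Illustrative sketch only: FileProvider, CONFIG_PATHS, and the helper names
// are assumptions for this example, not the real Gitpod API.
interface FileProvider {
    getFileContent(path: string): Promise<string | undefined>;
}

// Hypothetical list of configuration files the inferrer may ask about.
const CONFIG_PATHS = [".gitpod.yml", "package.json", "pom.xml", "go.mod"];

// Static cache: path -> the (pending or resolved) content promise.
const cache = new Map<string, Promise<string | undefined>>();

// Kick off all fetches at once. Each request starts immediately; nothing
// awaits here, so the fetches run in parallel.
function prefetch(provider: FileProvider, paths: string[]): void {
    for (const path of paths) {
        if (!cache.has(path)) {
            cache.set(path, provider.getFileContent(path));
        }
    }
}

// Later lookups from the conditional tree await the cached promise instead
// of issuing a fresh sequential request.
async function getCached(provider: FileProvider, path: string): Promise<string | undefined> {
    if (!cache.has(path)) {
        cache.set(path, provider.getFileContent(path));
    }
    return cache.get(path)!;
}
```

Caching the promise (rather than the resolved content) also deduplicates concurrent lookups: two callers asking for the same path while the fetch is still in flight share one request.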