callbackFiller removed in 5.0.0 #417
Comments
In our situation, the database/cache queries are inside classes, with some objects requiring data that is shared with other objects. I agree that in compute-only cases this matters less. So I understand you're not interested in re-implementing this and we should look at implementing this outside of the caching library?
What does Rust multithreading have to do with this issue? Unhelpful responses IMO
But Rust is not NodeJS, so it still doesn't make sense to use that as an example.
That is simply not true. NodeJS does have a single-threaded event loop, but its asynchronous nature allows it to process other things while waiting for an event/callback. That means it is 100% possible to process multiple database queries at (almost) the same time while the previous one(s) are still unresolved.

Here is a trace from my current NodeJS application, where I do a Promise.all on a multi-cache wrap using the current version. Notice how I'm able to start a bunch of queries while the first one is not even completed and hasn't gotten its result back yet.

If we don't coalesce the cache calls, we end up calling the cache database multiple times for the same exact query, because we are indeed calling that cache multiple times before the first call completes. It's especially relevant when you use a multi-layer cache approach with a memory cache: you would expect to do a single query to the database and then serve the object from memory, but what actually happens is a bunch of queries to the database.
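The failure mode described above can be sketched in a few lines. This is not the cache-manager code; `wrapNoCoalesce` and `slowDbQuery` are hypothetical names standing in for a wrap-style helper and a real database round trip. Three concurrent lookups for the same key all find the cache empty, so all three hit the database:

```typescript
let dbQueries = 0;
const cache = new Map<string, number>();

// Stand-in for a real database round trip.
const slowDbQuery = async (): Promise<number> => {
  dbQueries++;
  await new Promise((resolve) => setTimeout(resolve, 20));
  return 7;
};

// A wrap() with no coalescing: check the cache, else query and backfill.
async function wrapNoCoalesce(key: string): Promise<number> {
  const hit = cache.get(key);
  if (hit !== undefined) return hit;
  const value = await slowDbQuery(); // every concurrent caller reaches this line
  cache.set(key, value);
  return value;
}

// Firing three lookups for the same key before any resolves means the
// cache is still empty for all of them: three database queries, not one.
async function demo(): Promise<number> {
  await Promise.all([
    wrapNoCoalesce('k'),
    wrapNoCoalesce('k'),
    wrapNoCoalesce('k'),
  ]);
  return dbQueries;
}
```

Only after the first burst resolves does the memory layer start answering, which is exactly the duplicated-query behavior seen in the trace.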
Yes, the original reason for this was to handle a scenario like:
So the whole point is to avoid trying to make parallel calls to the cache or database for the same key.
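A minimal sketch of what the removed callbackFiller provided: concurrent callers for the same key share a single in-flight promise instead of each hitting the backing store. The names here (`coalesce`, `inFlight`) are illustrative, not the library's API:

```typescript
// Map of key → the promise currently fetching that key.
const inFlight = new Map<string, Promise<unknown>>();

function coalesce<T>(key: string, fn: () => Promise<T>): Promise<T> {
  const pending = inFlight.get(key);
  if (pending !== undefined) {
    // A lookup for this key is already running; piggyback on its result.
    return pending as Promise<T>;
  }
  const promise = fn().finally(() => {
    // Clear the slot so the next miss triggers a fresh lookup.
    inFlight.delete(key);
  });
  inFlight.set(key, promise);
  return promise;
}
```

With this in front of the database call, a burst of simultaneous wraps for one key performs exactly one lookup, and all callers receive its result (or its rejection).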
@Maelstrom96 My bad, you are right.
Hi @jaredwray, I see this issue has been closed but we still experience this behavior change in v5. As @Maelstrom96 described originally, without the "locking" or "coalescing" functionality of the callbackFiller, concurrent lookups for the same key each hit the backing store. Does the team intend to bring this functionality back, or is this issue closed as "won't fix"? Thanks!
@douglascayers - thanks for the response. At this time we do not plan to re-implement this, but I am more than happy to look at a pull request around this.
Thanks for the quick follow-up! I'm working on porting the logic into our app, and would be happy to submit a pull request for your consideration for this project.
That would be great. I would also love to see some unit tests that replicate this so we know we have a workable solution.
✅ Step 1: create https://www.npmjs.com/package/promise-coalesce
✅ Step 2: create a pull request to update the library
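For the multi-layer scenario raised earlier in the thread, the coalescing logic slots in between the memory layer and the slow store. The sketch below uses hypothetical names (`cachedGet`, `slowStore`); it is not the cache-manager or promise-coalesce API, just an illustration of how a coalesced two-layer read behaves:

```typescript
const memory = new Map<string, string>();          // fast layer
const pending = new Map<string, Promise<string>>(); // in-flight lookups
let slowLookups = 0;

// Stand-in for Redis or a database round trip.
const slowStore = async (key: string): Promise<string> => {
  slowLookups++;
  await new Promise((resolve) => setTimeout(resolve, 20));
  return `value-for-${key}`;
};

async function cachedGet(key: string): Promise<string> {
  const hit = memory.get(key);
  if (hit !== undefined) return hit; // fast path: served from memory

  const running = pending.get(key);
  if (running !== undefined) return running; // coalesce concurrent misses

  const promise = slowStore(key)
    .then((value) => {
      memory.set(key, value); // backfill the fast layer
      return value;
    })
    .finally(() => pending.delete(key));
  pending.set(key, promise);
  return promise;
}
```

A burst of simultaneous reads for one key costs a single slow lookup, and every later read is answered from memory, which is the behavior the thread expects from a multi-layer cache.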
I would like to know if there was a particular reason why this feature was removed? I might be misunderstanding the current behavior of .wrap() in v5.0.0, but by looking at the code, it seems like there isn't any mechanism left to coalesce simultaneous cache wraps for the same key, unlike before with callbackFiller.

If I'm not mistaken and there isn't indeed a mechanism currently that does that, would you be open to a PR that re-introduces a callbackFiller-like feature? We were about to update before noticing this, which would really affect our application given how we've expected node-cache-manager to work, locking us out of v5.0.0 and its await/async and TypeScript support.