
callbackFiller removed in 5.0.0 #417

Closed
Maelstrom96 opened this issue Mar 27, 2023 · 10 comments · Fixed by #599

Comments

@Maelstrom96

I would like to know if there was a particular reason why this feature was removed? I might be misunderstanding the current behavior of .wrap() in v5.0.0, but looking at the code, it seems there is no mechanism left to coalesce simultaneous cache wrap calls for the same key, unlike before with callbackFiller.

If I'm not mistaken and there is indeed no mechanism currently that does that, would you be open to a PR that re-introduces a callbackFiller-like feature? We were about to upgrade before noticing this; the change would really affect our application given how we've come to expect node-cache-manager to work, locking us out of v5.0.0 and its async/await and TypeScript support.

@Maelstrom96
Author

Maelstrom96 commented Mar 27, 2023

Promise.all has real-world advantages when it comes to IO-wait or DB query operations, where you are waiting and not doing anything. It's easy to demonstrate too when tracing a request that uses Promise.all vs. a normal for-each loop. With an awaited for-each, execution waits for a database query to resolve before proceeding to the next one, making the total execution time much higher.
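The timing difference described above can be sketched with setTimeout delays standing in for database queries (runSequential and runParallel are hypothetical names for illustration):

```javascript
// Stand-in for an IO-bound operation such as a database query.
const delay = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

// Sequential awaits: each "query" waits for the previous one to resolve.
async function runSequential(durations) {
  const start = Date.now();
  for (const ms of durations) await delay(ms);
  return Date.now() - start; // roughly the sum of all durations
}

// Promise.all: all "queries" are in flight at once.
async function runParallel(durations) {
  const start = Date.now();
  await Promise.all(durations.map(delay));
  return Date.now() - start; // roughly the longest single duration
}
```

With three 40 ms "queries", the sequential version takes around 120 ms while the parallel version takes around 40 ms, which is exactly the shape of the traces being compared.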

In our situation, the database/cache queries live inside classes, with each class's objects sometimes requiring data that is shared with other objects. The previous callbackFiller handled that use case perfectly for us and coalesced those identical cache.wrap() calls.

I agree that in compute-only cases, Promise.all actually slows things down with no real benefits, since you won't get more CPU cycles to do those compute tasks, and Promise.all will just add more overhead.

So I take it you're not interested in re-implementing this, and we should look at implementing it outside the caching library?

@zzau13 zzau13 closed this as completed Apr 23, 2023
@jonapgar-groupby

What does Rust multithreading have to do with this issue?

Unhelpful responses IMO

@Maelstrom96
Author

> It's to explain that only one call is made at a time per thread.

But Rust is not NodeJS; it still doesn't make sense to use that as an example.

> Therefore, if a parallel call occurs, it is because it has been called like this; calling the same function twice does not make sense in the same call. The only way to match 2 equal queries to the database is if they are called from different threads. This can't be since nodejs is a single threaded runtime.

That is simply not true. NodeJS does have a single-threaded event loop, but its asynchronous nature allows it to process other work while waiting for an event/callback. That means it is entirely possible to have multiple database queries in flight at (almost) the same time, while the previous one(s) have not yet resolved.

Here is a trace from my current NodeJS application, where I do a Promise.all on a multi-cache wrap using the current version. See how I'm able to start a bunch of queries while the first one is not even completed and hasn't gotten its result back yet:

[trace screenshot: several overlapping queries started before the first one resolves]

If we don't coalesce the cache calls, we end up querying the cache database multiple times for the exact same key, because we are indeed calling that cache multiple times before the first call has completed. It's especially relevant when you use a multi-layer cache approach with a memory tier. You would expect a single query to the database, with the object then served from memory, but what actually happens is a burst of identical queries to the database.
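A minimal sketch of that failure mode (naiveWrap and fetchFromDb are hypothetical names, not node-cache-manager's API): every concurrent miss reaches the backend, because each caller checks the cache before any of them has written to it.

```javascript
const store = new Map(); // stands in for the memory cache tier
let backendCalls = 0;    // counts hits on the "database"

// Stand-in for the expensive backing query.
async function fetchFromDb(key) {
  backendCalls++;
  await new Promise((resolve) => setTimeout(resolve, 20));
  return `value-for-${key}`;
}

// Naive wrap with no coalescing: check the cache, else fetch and store.
async function naiveWrap(key, fn) {
  if (store.has(key)) return store.get(key);
  const value = await fn(); // concurrent callers all reach this line
  store.set(key, value);
  return value;
}
```

Five concurrent `naiveWrap('user:123', ...)` calls produce five backend queries, since none of the callers has populated the cache by the time the others check it.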

@BryanDonovan
Collaborator

Yes, the original reason for this was to handle a scenario like:

  1. 10 simultaneous requests are received for user 123, which isn't cached yet.
  2. Instead of trying to fetch the same key from the cache 10 times in parallel, and have each one miss and then have to fetch from the database, we instead put the requests in a "callback filler" queue, so only one request actually calls the cache and database.
  3. When the single request comes back, it sends the data back through its own callback and through each of the other 9 callbacks.

So the whole point is to avoid trying to make parallel calls to the cache or database for the same key.
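The steps above can be sketched with promises instead of callbacks (coalesce is a hypothetical helper, not the v4 implementation): the first caller for a key starts the real work, and later callers for the same key simply await the same pending promise.

```javascript
const inFlight = new Map(); // key -> pending promise shared by all callers

async function coalesce(key, fn) {
  // Later arrivals join the promise the first caller created.
  if (inFlight.has(key)) return inFlight.get(key);

  const pending = (async () => {
    try {
      return await fn(); // only this one call reaches the cache/database
    } finally {
      inFlight.delete(key); // allow a fresh call once this one settles
    }
  })();

  inFlight.set(key, pending);
  return pending;
}
```

Ten simultaneous `coalesce('user:123', fetchUser)` calls invoke fetchUser once; all ten resolve with the same value.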

@zzau13 zzau13 reopened this Aug 28, 2023
@zzau13
Contributor

zzau13 commented Aug 28, 2023

@Maelstrom96 My bad, you are right.

@douglascayers
Contributor

douglascayers commented Sep 29, 2023

Hi @jaredwray, I see this issue has been closed but we still experience this behavior change in v5.2.3. We recently upgraded from v4.1.0.

As @Maelstrom96 described originally, without the "locking" or "coalescing" functionality of the CallbackFiller from the v4.x line, when multiple calls to wrap for the same key occur close together, every request has its generator function invoked, which pummels databases and downstream APIs.

Does the team intend to bring this functionality back or is this issue closed as "won't fix"?

Thanks!

@jaredwray
Owner

@douglascayers - thanks for the response and at this time we do not plan to re-implement this. I am more than happy to look at a pull request around this.

@douglascayers
Contributor

> @douglascayers - thanks for the response and at this time we do not plan to re-implement this. I am more than happy to look at a pull request around this.

Thanks for the quick follow-up! I'm working on porting the logic into our app, and would be happy to submit a pull request for your consideration for this project.

@jaredwray
Owner

That would be great. I would also love to see some unit tests that replicate this so we know we have a workable solution.

@douglascayers
Contributor

douglascayers commented Sep 30, 2023

✅ Step 1: create https://www.npmjs.com/package/promise-coalesce

✅ Step 2: create pull request to update the wrap function to use the coalescing logic
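A sketch of how the two steps combine (coalescedWrap and queryDb are hypothetical names for illustration; the actual fix routes wrap through the promise-coalesce package): a cache miss is funneled through a shared in-flight promise, so concurrent misses produce a single backing call, and later calls are served from the cache.

```javascript
const cache = new Map();   // stands in for the cache store
const pending = new Map(); // key -> in-flight promise

let dbCalls = 0;
// Stand-in for the real database/API call.
async function queryDb(key) {
  dbCalls++;
  await new Promise((resolve) => setTimeout(resolve, 20));
  return `row:${key}`;
}

async function coalescedWrap(key, fn) {
  if (cache.has(key)) return cache.get(key);     // cache hit: nothing to coalesce
  if (pending.has(key)) return pending.get(key); // join the in-flight miss

  const promise = (async () => {
    try {
      const value = await fn(); // single backing call for all concurrent misses
      cache.set(key, value);
      return value;
    } finally {
      pending.delete(key);
    }
  })();
  pending.set(key, promise);
  return promise;
}
```

With this shape, five concurrent `coalescedWrap('user:123', ...)` calls hit the database once, and any call after that is served straight from the cache.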
