2 million data cache? #898
What's your question?
How can I cache a MySQL table with 2 million rows?

References (optional)
No response

Do you have anything more you want to share? (optional)
No response

Comments
Hello curious contributor!
Short answer: Don't. Long answer: if caching 2 million rows is the solution to some overload/timeout issue, then you are probably asking yourself the wrong question.
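To make that advice concrete, here is a minimal sketch of caching the small result of an expensive query rather than the 2 million source rows. Everything in it is illustrative: the schema, the query, and the TTL are assumptions, and sqlite3 stands in for MySQL so the example is self-contained.

```python
import sqlite3
import time

# Illustrative TTL cache: key -> (expiry timestamp, cached rows).
_cache = {}
TTL_SECONDS = 300  # assumption: how long a cached result stays fresh

def cached_query(conn, sql, params=()):
    key = (sql, params)
    hit = _cache.get(key)
    if hit is not None and hit[0] > time.monotonic():
        return hit[1]  # still fresh: serve from the cache
    rows = conn.execute(sql, params).fetchall()
    _cache[key] = (time.monotonic() + TTL_SECONDS, rows)
    return rows

# sqlite3 stands in for MySQL here so the sketch runs as-is.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INTEGER, amount REAL)")
conn.executemany("INSERT INTO events VALUES (?, ?)",
                 [(i % 100, i * 0.5) for i in range(10_000)])

# The aggregate is ~100 rows no matter how large the table grows;
# that small result is what gets cached, not the raw rows.
top = cached_query(
    conn,
    "SELECT user_id, SUM(amount) AS total FROM events "
    "GROUP BY user_id ORDER BY total DESC LIMIT 10",
)
print(top)
```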
Thanks. So, is partitioned storage possible? I'm trying to do something special: for example, there is a 200 MB document, but I want to split it into 3 MB documents and combine them in the query. Is that possible?
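The split-and-recombine idea can be sketched like this. It is a hypothetical illustration: the key scheme and helper names are made up, and a plain dict stands in for whatever storage backend would actually be used; only the 3 MB chunk size comes from the question above.

```python
CHUNK_SIZE = 3 * 1024 * 1024  # 3 MB per piece, per the question

def split_document(doc_id, data, store):
    """Store `data` as fixed-size chunks under derived keys."""
    count = 0
    for offset in range(0, len(data), CHUNK_SIZE):
        store[f"{doc_id}:chunk:{count}"] = data[offset:offset + CHUNK_SIZE]
        count += 1
    store[f"{doc_id}:count"] = count  # remember how many pieces exist
    return count

def join_document(doc_id, store):
    """Reassemble the original document from its chunks."""
    count = store[f"{doc_id}:count"]
    return b"".join(store[f"{doc_id}:chunk:{i}"] for i in range(count))

store = {}                               # stand-in for a real backend
payload = b"x" * (7 * 1024 * 1024)       # 7 MB demo document
split_document("report", payload, store)   # stored as 3 pieces
assert join_document("report", store) == payload
```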
You don't need a cache; you need either a CDN or document storage (Solr, AWS, or NoSQL storage).
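If Solr were the option chosen from that list, usage might look roughly like the following sketch with the pysolr client. The URL, core name, and field names are invented for illustration, and it assumes a Solr core is already running locally.

```python
import pysolr  # assumption: Solr picked from the options above

# Hypothetical sketch: index large documents in Solr and query them
# there, instead of stuffing them into a cache.
solr = pysolr.Solr("http://localhost:8983/solr/documents", timeout=10)

solr.add([
    {"id": "report-2023", "title": "Annual report", "body": "...contents..."},
])
solr.commit()  # make the new document visible to searches

for doc in solr.search("title:report"):
    print(doc["id"], doc["title"])
```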
Hello @eskoctr, I'm closing this issue for now because of (inactivity / outdated code / …). You can always reopen it though! :) Regards,