Ferocious memory usage #23
Thanks for using numbat. Yes, we've been investigating this elsewhere: #13. We hope to have a fix soon; we'll keep you updated. Best, Evan. (Sorry, the close was a mistake with my cursor.)
Great, thanks for looking into it. Just for reference: even for a low number of cells, I am seeing high memory usage during the "Evaluating CNVs..." step on iteration 1. I was able to force this to run with ~300 GB of memory.
Thanks, you're using the development version, correct?
I am using main, should I try the devel branch? |
I think they're the same in terms of memory usage; currently trying to track down why this happens.
Hi @cartographerJ, it seems that we've found what causes this ferocious behavior. It turned out to be a bad interaction between OpenMP and mclapply (no clue why). What solved the problem for us is to set the number of OpenMP threads to 1.
Yes, please try again after reinstalling the package. I'm guessing the BLAS/LAPACK operations (especially with OpenBLAS) and data.table operations are causing the memory to surge. We've added some function calls that should explicitly set the OMP threads to 1. Let us know if this helps.
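For anyone hitting this before the fix lands, the same idea can be applied from the shell before launching R. This is a minimal sketch, not the package's own fix: `OMP_NUM_THREADS` and `OPENBLAS_NUM_THREADS` are the standard OpenMP/OpenBLAS environment variables, and the `Rscript` entry point shown is hypothetical.

```shell
# Pin all thread pools to a single thread before starting R, so each
# mclapply fork does not spawn its own OpenMP/OpenBLAS thread team
# (forked workers inheriting multi-threaded pools can multiply memory use).
export OMP_NUM_THREADS=1        # OpenMP worker threads
export OPENBLAS_NUM_THREADS=1   # OpenBLAS worker threads

# Rscript run_numbat.R          # hypothetical script invoking the pipeline
```

Setting these in the environment (rather than inside R) ensures the limits are in place before any library initializes its thread pool.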
Will test tomorrow and report back.
I ran the devel version and it seems to have fixed the issue!
Terrific! Thanks for testing this. Credit to @evanbiederstedt for the fix |
During iteration 1, memory usage is pretty standard, but during iteration 2, in the step "Evaluating CNV per cell...", memory usage goes through the roof. Is there any sort of workaround for this? Is this behavior expected?