Checklist
The issue exists on a clean installation of Fooocus
The issue exists in the current version of Fooocus
The issue has not been reported before recently
The issue has been reported before but has not been fixed yet
What happened?
Somehow I ended up with this in my Fooocus config while updating from a previous version:
"default_loras": [],
In the old version of Fooocus this did not prevent the LoRAs portion of the UI from showing, but in 2.2.0 it does.
At first I did not realize the issue was config-related, but after some discussion with @mashb1t it turned out to be 100% config-related. Easy fix, but the behavior was somewhat unexpected.
Steps to reproduce the problem
Add this to config.txt on Fooocus 2.2.0 and restart Fooocus:
"default_loras": [],
What should have happened?
The LoRAs UI should be displayed, but with all LoRAs shown as empty / not selected.
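For comparison, a populated `default_loras` entry in config.txt normally looks something like this. The offset LoRA and weight are Fooocus's stock defaults; the `[name, weight]` pair format is my reading of 2.x config files, so treat it as illustrative:

```json
{
  "default_loras": [
    ["sd_xl_offset_example-lora_1.0.safetensors", 0.1],
    ["None", 1.0]
  ]
}
```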
What browsers do you use to access Fooocus?
Google Chrome
Where are you running Fooocus?
Locally
What operating system are you using?
Windows 10
Console logs
[System ARGV] ['G:\\Git\\StabilityMatrix\\Packages\\Fooocus\\launch.py']
Python 3.10.11 (tags/v3.10.11:7d4cc5a, Apr 5 2023, 00:38:17) [MSC v.1929 64 bit (AMD64)]
Fooocus version: 2.2.0
Total VRAM 12282 MB, total RAM 130983 MB
xformers version: 0.0.22.post4
Set vram state to: NORMAL_VRAM
Always offload VRAM
Device: cuda:0 NVIDIA GeForce RTX 4070 Ti : native
VAE dtype: torch.bfloat16
Using xformers cross attention
Refiner unloaded.
Running on local URL: http://127.0.0.1:7865
To create a public link, set `share=True` in `launch()`.
model_type EPS
UNet ADM Dimension 2816
Using xformers attention in VAE
Working with z of shape (1, 4, 32, 32) = 4096 dimensions.
Using xformers attention in VAE
extra {'cond_stage_model.clip_g.logit_scale', 'cond_stage_model.clip_l.logit_scale', 'cond_stage_model.clip_l.transformer.text_model.embeddings.position_ids', 'cond_stage_model.clip_g.transformer.text_model.embeddings.position_ids', 'cond_stage_model.clip_l.text_projection'}
Base model loaded: G:\Git\StabilityMatrix\Models\StableDiffusion\juggernautXL_v9Rundiffusionphoto2.safetensors
Request to load LoRAs [] for model [G:\Git\StabilityMatrix\Models\StableDiffusion\juggernautXL_v9Rundiffusionphoto2.safetensors].
Fooocus V2 Expansion: Vocab with 642 words.
Fooocus Expansion engine loaded for cuda:0, use_fp16 = True.
Requested to load SDXLClipModel
Requested to load GPT2LMHeadModel
Loading 2 new models
[Fooocus Model Management] Moving model(s) has taken 0.29 seconds
Started worker with PID 3960
App started successful. Use the app with http://127.0.0.1:7865/ or 127.0.0.1:7865
Additional information
I have not updated my GPU driver recently. The only thing that changed was updating Fooocus to 2.2.0.
This could be prevented by adding `"default_max_lora_number": 5` to the default.json preset, but generally speaking default.json already includes `default_loras`, which is the fallback if `default_max_lora_number` is not set.
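To make that fallback concrete, here is a minimal sketch of the suspected slot-count logic; the variable names and file handling are assumptions for illustration, not the actual Fooocus source:

```python
import json

# Minimal sketch of the suspected fallback: the number of LoRA rows
# rendered in the UI falls back to the length of default_loras when
# default_max_lora_number is absent from the config.
with open("config.txt") as f:
    config = json.load(f)

default_loras = config.get("default_loras", [])
max_lora_number = config.get("default_max_lora_number", len(default_loras))

# With "default_loras": [] and no "default_max_lora_number", this yields 0,
# so zero LoRA rows are created and the section disappears from the UI.
print(f"LoRA slots rendered: {max_lora_number}")
```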
Not sure if this is really a bug or only an individual issue. Let's re-evaluate in a few days.
Thanks for reporting!
@mashb1t Even if it's an edge case, if setting it to [] is not wrong, then Fooocus should handle it. And if the current behavior is intended, the UI should offer a way to re-enable LoRAs. The logical connection from [] to missing LoRAs in the UI is not obvious. I would also argue that for any invalid config, the UI should revert to its default state; so if hiding LoRAs is not intended by any config option, there should be no way it can happen like this. Meaning it's non-critical, but still currently a bug.
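A hedged sketch of what such a revert-to-default guard could look like; the function name and the slot floor of 5 are assumptions (the floor matches the suggestion above), not existing Fooocus code:

```python
FALLBACK_LORA_SLOTS = 5  # assumed UI default

def resolve_lora_slot_count(config: dict) -> int:
    """Return how many LoRA rows to render, never silently dropping to 0."""
    loras = config.get("default_loras")
    if not isinstance(loras, list) or len(loras) == 0:
        # Empty or invalid: revert to the default UI state instead of
        # hiding the LoRAs section entirely.
        return config.get("default_max_lora_number", FALLBACK_LORA_SLOTS)
    return config.get("default_max_lora_number", len(loras))

print(resolve_lora_slot_count({"default_loras": []}))  # -> 5
```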