Issue trying to use MoBY SSL pretrained model with Swin-T backbone #181
I was able to partially load a checkpoint by changing the loading code. After these changes it loads, but some keys are missing; see the log below:
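The loading-code change itself isn't shown in the thread. A minimal sketch of the kind of key remapping that makes a MoBY checkpoint loadable into a plain Swin model — the `encoder.` / `encoder_k.` / `projector.` key names are assumptions for illustration, not confirmed against the repo, and plain dicts stand in for the tensors you would get from `torch.load(path)['model']`:

```python
def remap_moby_keys(state_dict, prefix="encoder."):
    """Keep only backbone weights from a MoBY checkpoint and strip the
    (assumed) online-encoder prefix so keys match the plain Swin model."""
    remapped = {}
    for key, value in state_dict.items():
        if key.startswith(prefix):
            remapped[key[len(prefix):]] = value
    return remapped

# Toy stand-in for torch.load(ckpt_path)['model']; values would be tensors.
ckpt = {
    "encoder.patch_embed.proj.weight": 1,
    "encoder_k.patch_embed.proj.weight": 2,  # momentum branch: dropped
    "projector.0.weight": 3,                 # SSL head: dropped
}
backbone = remap_moby_keys(ckpt)
print(sorted(backbone))  # ['patch_embed.proj.weight']
```

The remapped dict can then be passed to the model with `model.load_state_dict(backbone, strict=False)` so the missing classification-head keys don't raise an error.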
I also found that, after these changes, loading pre-trained models from this repo (swin_tiny_patch4_window7_224.pth), such as the ImageNet ones, produces a similar log:
Ah, these are all re-initialized anyway, so it doesn't matter. I might open a PR to allow use of MoBY pre-trained models.
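Since the mismatched keys are re-initialized anyway, a non-strict load plus a report of which keys differ is usually all that's needed. A small sketch with plain key lists standing in for the two state_dicts (the specific key names are illustrative); in PyTorch, `model.load_state_dict(ckpt, strict=False)` returns these same two lists as `missing_keys` and `unexpected_keys`:

```python
def report_key_mismatch(model_keys, ckpt_keys):
    """Compare model and checkpoint state_dict keys before a non-strict load."""
    missing = sorted(set(model_keys) - set(ckpt_keys))     # stay randomly init'ed
    unexpected = sorted(set(ckpt_keys) - set(model_keys))  # silently ignored
    return missing, unexpected

# Toy key sets: the classification head is absent from an SSL checkpoint,
# and the SSL projector has no counterpart in the fine-tuning model.
model_keys = ["patch_embed.proj.weight", "head.weight", "head.bias"]
ckpt_keys = ["patch_embed.proj.weight", "projector.0.weight"]
missing, unexpected = report_key_mismatch(model_keys, ckpt_keys)
print(missing)     # ['head.bias', 'head.weight']
print(unexpected)  # ['projector.0.weight']
```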
Same issue when using MoBY pretrained models.
When trying to load the checkpoint after SSL pretraining with MoBY (1 epoch, to test), I get this error when using the `--pretrained` flag pointing to the checkpoint (I've tried ckpt_epoch_0.pth and checkpoint.pth). I am trying to transfer the self-supervised pretrained backbone to this architecture.