Description
When trying to load the checkpoint after SSL pretraining with MoBY (1 epoch, just as a test), I get the error below when pointing the --pretrained flag at the checkpoint (I've tried both ckpt_epoch_0.pth and checkpoint.pth). I am trying to transfer the self-supervised weights, with the same backbone, to this architecture.
```
[2022-03-11 20:31:06 swin_tiny_patch4_window7_224](utils.py 47): INFO ==============> Loading weight **********/MOBY SSL SWIN/moby__swin_tiny__patch4_window7_224__odpr02_tdpr0_cm099_ct02_queue4096_proj2_pred2/default/ckpt_epoch_0.pth for fine-tuning......
Traceback (most recent call last):
  File "/content/Swin-Transformer/main.py", line 357, in <module>
    main(config)
  File "/content/Swin-Transformer/main.py", line 131, in main
    load_pretrained(config, model_without_ddp, logger)
  File "/content/Swin-Transformer/utils.py", line 70, in load_pretrained
    relative_position_bias_table_current = model.state_dict()[k]
KeyError: 'encoder.layers.0.blocks.0.attn.relative_position_bias_table'
Killing subprocess 3303
```
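For context, the KeyError suggests the MoBY checkpoint still stores the backbone weights under the `encoder.` prefix from the SSL wrapper, while `load_pretrained` in the fine-tuning code looks the keys up in a plain Swin model without that prefix. A minimal, untested sketch of a workaround, assuming the checkpoint keeps its weights under the `model` key as in the other Swin-Transformer checkpoints (file names here are placeholders):

```python
import torch

# Load the MoBY SSL checkpoint on CPU.
ckpt = torch.load("ckpt_epoch_0.pth", map_location="cpu")
state_dict = ckpt["model"]  # assumption: weights are stored under 'model'

# Keep only the backbone ('encoder.*') weights and strip the 'encoder.'
# prefix so the keys match the plain SwinTransformer used for fine-tuning.
backbone_state_dict = {
    k[len("encoder."):]: v
    for k, v in state_dict.items()
    if k.startswith("encoder.")
}

# Save a checkpoint that can be passed to --pretrained.
torch.save({"model": backbone_state_dict}, "swin_tiny_moby_backbone.pth")
```

This is only a sketch of the key-remapping idea; the repository may provide its own extraction script for converting MoBY checkpoints, which would be the preferred route if it exists.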