This repository was archived by the owner on Oct 31, 2023. It is now read-only.

lower batch size : 2 labeled + 2 unlabeled -----> bad AP #53

Open
Saixiaoma opened this issue Sep 23, 2021 · 2 comments

Comments

@Saixiaoma

Thanks for the great work.
I have been training the code with your instruction in README.

Due to hardware limitations, I have only one GPU with 12 GB of memory. To train at all, I had to reduce the batch size to 2, which leads to very poor final results: an AP of only about 11%.

I would like to ask whether batch size really has such a large impact on the result.

I'm looking forward to your reply!

@hachreak

Did you try to scale the learning rate?
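The usual rule of thumb here is the linear scaling rule: if the total batch size shrinks by a factor of k, shrink the base learning rate by k as well. A minimal sketch, assuming (hypothetically) that the repo's default config was tuned for a batch of 8 images at LR 0.01:

```python
def scale_lr(base_lr: float, base_batch_size: int, new_batch_size: int) -> float:
    """Linear LR scaling rule: scale the learning rate
    proportionally to the change in total batch size."""
    return base_lr * new_batch_size / base_batch_size

# Hypothetical numbers for illustration: if the reference config used
# a batch of 8 at LR 0.01, dropping to a batch of 2 gives LR 0.0025.
print(scale_lr(0.01, 8, 2))  # → 0.0025
```

With very small batches, a longer warmup and more total iterations are often also needed, since each step sees fewer (and noisier) gradients.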

@Saixiaoma
Author

Yes. I ran two more experiments afterwards: one repeated the run without changing any parameters, and the other lowered the learning rate from 0.01 to 0.0025. Both results were much worse than the initial run.
