This repository was archived by the owner on Nov 22, 2022. It is now read-only.
remove move_state_dict_to_gpu, which is causing cuda oom (#1367)
Summary:
Pull Request resolved: #1367
I keep getting CUDA OOM during the load_best_model stage. move_state_dict_to_gpu and model.cuda() are not both needed; using both appears to double GPU memory usage.
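A minimal sketch of why the two calls together roughly double peak GPU memory, using toy byte accounting in plain Python. The names here (`load_best_model`, the parameter sizes) are illustrative stand-ins, not the project's actual API: moving the checkpoint's state dict to GPU allocates one GPU copy of every tensor, and the model's own parameters on GPU account for a second copy.

```python
# Toy GPU-memory accounting: represent each checkpoint tensor by its size in bytes.
state_dict = {"w1": 4_000_000, "w2": 4_000_000}  # bytes per parameter tensor

def load_best_model(move_state_dict_to_gpu_first: bool) -> int:
    """Return peak 'GPU' bytes for the two loading strategies."""
    gpu_bytes = 0
    if move_state_dict_to_gpu_first:
        # move_state_dict_to_gpu: copies every checkpoint tensor to GPU
        gpu_bytes += sum(state_dict.values())
    # model.load_state_dict(...) followed by model.cuda(): the model's own
    # parameters occupy GPU memory as well
    gpu_bytes += sum(state_dict.values())
    return gpu_bytes

print(load_best_model(True))   # both calls: checkpoint copy + model parameters
print(load_best_model(False))  # model.cuda() only: model parameters alone
```

Keeping the state dict on CPU and letting model.cuda() do the single transfer avoids the redundant GPU-resident copy.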
Reviewed By: anchit
Differential Revision: D21725316
fbshipit-source-id: 70b5761a25afb19da7f44a3fead37b36d0e122da