Add a short test method verifying that the single-optimizer case saves checkpoints and that those checkpoints are loadable
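A minimal sketch of such a test, assuming pytest conventions; `train`, its argument names, and the checkpoint layout are hypothetical stand-ins for the real training entry point, not the project's actual API:

```python
import os
import tempfile

import torch

from mypackage.trainer import train  # hypothetical import path


def test_single_optim_saves_loadable_checkpoint():
    with tempfile.TemporaryDirectory() as save_dir:
        args = {
            'save_dir': save_dir,     # hypothetical argument names
            'save_name': 'single.pt',
            'max_steps': 50,          # keep the test fast
            'second_optim': None,     # the single-optimizer case
        }
        train(args)

        # the checkpoint should exist after training...
        checkpoint_path = os.path.join(save_dir, 'single.pt')
        assert os.path.exists(checkpoint_path)

        # ...and load cleanly on CPU
        checkpoint = torch.load(checkpoint_path, map_location='cpu')
        assert 'model' in checkpoint  # hypothetical checkpoint layout
```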
Add a flag that forces the optimizer to switch after a certain number of steps; this is useful for writing tests that check the behavior of the second optimizer
parser.add_argument('--second_optim_start_step', type=int, default=None, help='If set, switch to the second optimizer when stalled or at this step regardless of performance. Normally, the optimizer only switches when the dev scores have stalled for --max_steps_before_stop steps')
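As a quick sanity check, here is how the new flag parses; a minimal sketch with a standalone `argparse` parser, registering only this one argument:

```python
import argparse

parser = argparse.ArgumentParser()
parser.add_argument('--second_optim_start_step', type=int, default=None,
                    help='If set, switch to the second optimizer at this step')

# unset by default, so only the stall-based switching applies
assert parser.parse_args([]).second_optim_start_step is None
# set explicitly (e.g. in a test) to force the switch at step 100
args = parser.parse_args(['--second_optim_start_step', '100'])
assert args.second_optim_start_step == 100
```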
        if global_step - last_best_step >= args['max_steps_before_stop'] or (args['second_optim_start_step'] is not None and global_step >= args['second_optim_start_step']):
            logger.info("Switching to second optimizer: {}".format(args.get('second_optim', None)))
            args["second_stage"] = True
            # if the loader gets a model file, it uses secondary optimizer

@@ -274,7 +276,8 @@ def train(args):
            logger.info('Reloading best model to continue from current local optimum')
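The condition in the diff is easy to pull out into a standalone, testable helper; a minimal sketch, where the helper name is illustrative and `args` is a plain dict with the same keys used above:

```python
def should_switch_optimizer(global_step, last_best_step, args):
    """True when dev scores have stalled for max_steps_before_stop steps,
    or when the forced switch point second_optim_start_step is reached."""
    stalled = global_step - last_best_step >= args['max_steps_before_stop']
    forced = (args['second_optim_start_step'] is not None
              and global_step >= args['second_optim_start_step'])
    return stalled or forced


# stalled: no dev improvement for 2000 steps against a 2000-step limit
assert should_switch_optimizer(5000, 3000, {'max_steps_before_stop': 2000,
                                            'second_optim_start_step': None})
# forced: switches at step 100 even though dev scores have not stalled
assert should_switch_optimizer(100, 90, {'max_steps_before_stop': 2000,
                                         'second_optim_start_step': 100})
```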