
pytorch-optimizer v3.3.4

@kozistr released this on 19 Jan 06:31 · 55c3553

Change Log

Feature

  • Support the OrthoGrad feature for create_optimizer() (see the create_optimizer sketch below). (#324)
  • Enhanced flexibility for the optimizer parameter in Lookahead, TRAC, and OrthoGrad optimizers. (#324)
    • Now supports both torch.optim.Optimizer instances and classes.
    • You can now use the Lookahead optimizer in two ways, as shown in the Lookahead sketch below.
      • `Lookahead(AdamW(model.parameters(), lr=1e-3), k=5, alpha=0.5)`
      • `Lookahead(AdamW, k=5, alpha=0.5, params=model.parameters())`
  • Implement SPAM optimizer. (#324)
  • Implement TAM and AdaTAM optimizers. (#325)
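
A rough sketch of the create_optimizer() integration: the snippet below builds an optimizer by name and enables OrthoGrad. The `use_orthograd` keyword is an assumption based on #324; check the PR for the exact flag name.

```python
import torch
from pytorch_optimizer import create_optimizer

model = torch.nn.Linear(10, 2)

# Build an AdamW optimizer by name and wrap it with OrthoGrad.
# 'use_orthograd' is assumed from #324; the exact keyword may differ.
optimizer = create_optimizer(model, 'adamw', lr=1e-3, use_orthograd=True)
```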
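
The Lookahead sketch below shows the two construction styles from this release side by side. The import paths are the usual pytorch_optimizer ones, and the model is a stand-in.

```python
import torch
from torch.optim import AdamW
from pytorch_optimizer import Lookahead

model = torch.nn.Linear(10, 2)

# Style 1: wrap an already-constructed optimizer instance.
opt_from_instance = Lookahead(AdamW(model.parameters(), lr=1e-3), k=5, alpha=0.5)

# Style 2: pass the optimizer class plus params; Lookahead builds it internally.
opt_from_class = Lookahead(AdamW, k=5, alpha=0.5, params=model.parameters())
```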