Scaling Exponents Across Parameterizations and Optimizers #319


Closed
67372a opened this issue Jan 13, 2025 · 1 comment · Fixed by #321

Labels: feature request

Comments

@67372a

67372a commented Jan 13, 2025

Paper: https://arxiv.org/abs/2407.05872
Code: see the "C.5 ADAM-ATAN2 CODE CHANGE" section of the paper, as well as https://github.com/LoganBooker/prodigy-plus-schedule-free/blob/1d2cfa2fe692a828d46a5a29b9667ec924961ac7/prodigyplus/core_optimiser.py#L412, where the clipping threshold is also scaled for Prodigy's purposes, though that part is not strictly part of the technique.
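
For reference, the core change in Appendix C.5 is to replace Adam's epsilon-stabilized division `m / (sqrt(v) + eps)` with `atan2(m, sqrt(v))`, which removes the epsilon hyperparameter and bounds each update element. A minimal single-step sketch (the function name, signature, and defaults below are illustrative only, not this repository's API, and the paper's optional scale factors are omitted):

```python
import torch


def adam_atan2_step(param, grad, exp_avg, exp_avg_sq,
                    lr=1e-3, beta1=0.9, beta2=0.999, step=1):
    """Illustrative Adam-style step using the atan2 update (arXiv:2407.05872, C.5)."""
    # Standard Adam moment updates.
    exp_avg.mul_(beta1).add_(grad, alpha=1.0 - beta1)
    exp_avg_sq.mul_(beta2).addcmul_(grad, grad, value=1.0 - beta2)

    # Bias correction.
    m_hat = exp_avg / (1.0 - beta1 ** step)
    v_hat = exp_avg_sq / (1.0 - beta2 ** step)

    # Adam:       update = m_hat / (v_hat.sqrt() + eps)
    # Adam-atan2: epsilon-free; each element is bounded in (-pi/2, pi/2).
    update = torch.atan2(m_hat, v_hat.sqrt())

    param.add_(update, alpha=-lr)
```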

@67372a 67372a added the feature request Request features label Jan 13, 2025
@kozistr
Owner

kozistr commented Jan 13, 2025

thanks!
