
Adding Adam optimiser #1460


Merged 7 commits from 1105-adam into master on Aug 17, 2022
Conversation

@MichaelClerx (Member) commented Aug 16, 2022

See #1105

Another adaptive local optimiser

https://doi.org/10.48550/arXiv.1412.6980

The equations aren't obvious, but you can check them against Algorithm 1 in the paper linked above.
The magic numbers in this one could be hyperparameters, but I'm not sure anyone will want to tweak them...
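
For reference, a minimal sketch of the update those equations implement, following Algorithm 1 of Kingma & Ba (the function name and signature here are illustrative, not the PINTS API):

```python
import numpy as np

def adam_step(theta, g, m, v, t, alpha=0.001, b1=0.9, b2=0.999, eps=1e-8):
    """One Adam update (Algorithm 1 in Kingma & Ba, 2014).

    theta: current parameters, g: gradient at theta,
    m, v: first/second moment estimates (start at zero), t: step count (1-based).
    """
    m = b1 * m + (1 - b1) * g           # biased first moment estimate
    v = b2 * v + (1 - b2) * g ** 2      # biased second raw moment estimate
    m_hat = m / (1 - b1 ** t)           # bias-corrected first moment
    v_hat = v / (1 - b2 ** t)           # bias-corrected second moment
    theta = theta - alpha * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v
```

The "magic numbers" mentioned above presumably map onto the paper's defaults shown here (b1, b2, eps); exposing them as hyperparameters would amount to making those arguments settable.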

@MichaelClerx marked this pull request as ready for review on August 16, 2022, 15:56
codecov bot commented Aug 16, 2022

Codecov Report

Merging #1460 (ac1e520) into master (9acb238) will not change coverage.
The diff coverage is 100.00%.

@@            Coverage Diff            @@
##            master     #1460   +/-   ##
=========================================
  Coverage   100.00%   100.00%           
=========================================
  Files           96        97    +1     
  Lines         9370      9449   +79     
=========================================
+ Hits          9370      9449   +79     
| Impacted Files | Coverage Δ |
| --- | --- |
| pints/__init__.py | 100.00% <100.00%> (ø) |
| pints/_optimisers/_adam.py | 100.00% <100.00%> (ø) |
| pints/_optimisers/_irpropmin.py | 100.00% <100.00%> (ø) |
| pints/_transformation.py | 100.00% <0.00%> (ø) |


@MichaelClerx requested a review from @DavAug on August 16, 2022, 16:10
@DavAug (Member) left a comment

Thanks @MichaelClerx, I particularly like the notebook and how it shows the impact of the step size on the result.

I just have some minor comments. I am happy for this to go in ☺️

Slightly off topic: I wonder whether the performance of Adam can be improved for our problems by adding noise to the gradient; the algorithm was originally developed for stochastic objective functions, and the noise might help it avoid getting stuck in tight regions of parameter space.

@MichaelClerx (Member, Author) commented:

Thanks @DavAug!

> Slightly off topic: I wonder whether the performance of Adam can be improved for our problems by adding noise to the gradient; the algorithm was originally developed for stochastic objective functions, and the noise might help it avoid getting stuck in tight regions of parameter space.

Good idea. I tried it, and it does seem to help if you get the noise level just right, but I'm not 100% sure, so I wouldn't recommend it :D
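
For anyone who wants to experiment with this idea, a minimal sketch of what adding noise to the gradient could look like (not part of this PR; grad_fn, sigma, and adam_step are illustrative names, and the right noise scale is problem-dependent):

```python
import numpy as np

rng = np.random.default_rng()

def noisy_gradient(grad_fn, theta, sigma=1e-3):
    # Perturb a deterministic gradient with zero-mean Gaussian noise before
    # passing it to the Adam update; sigma = 1e-3 is an arbitrary choice and,
    # as noted above, results are sensitive to getting it right.
    g = np.asarray(grad_fn(theta))
    return g + rng.normal(0.0, sigma, size=g.shape)

# e.g., combined with the adam_step sketch above:
# theta, m, v = adam_step(theta, noisy_gradient(grad_fn, theta), m, v, t)
```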

@MichaelClerx merged commit f75911e into master on Aug 17, 2022
@MichaelClerx deleted the 1105-adam branch on August 17, 2022, 13:40