cosmez/Emm.MicroGrad


A C# port of micrograd

Following the tutorial "The spelled-out intro to neural networks and backpropagation: building micrograd" by Andrej Karpathy.

Original micrograd README

A tiny Autograd engine (with a bite! :)). Implements backpropagation (reverse-mode autodiff) over a dynamically built DAG and a small neural networks library on top of it with a PyTorch-like API. Both are tiny, with about 100 and 50 lines of code respectively. The DAG only operates over scalar values, so e.g. we chop up each neuron into all of its individual tiny adds and multiplies. However, this is enough to build up entire deep neural nets doing binary classification, as the demo notebook shows. Potentially useful for educational purposes.
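
For illustration, here is a minimal C# sketch of the kind of scalar autograd node the paragraph describes: each operation records its child nodes and a closure that propagates gradients, and Backward() topologically sorts the dynamically built DAG and runs those closures in reverse. The class and member names (Value, Data, Grad, Backward) are assumptions for the sketch, not necessarily the API used in this port.

```csharp
using System;
using System.Collections.Generic;

// Sketch of a scalar autograd node: a value, its accumulated gradient,
// the children that produced it, and a closure for the local backward step.
// Names are illustrative; the port's actual types may differ.
public class Value
{
    public double Data;
    public double Grad;
    private readonly List<Value> _children;
    private Action _backward = () => { };

    public Value(double data, List<Value> children = null)
    {
        Data = data;
        _children = children ?? new List<Value>();
    }

    public static Value operator +(Value a, Value b)
    {
        var result = new Value(a.Data + b.Data, new List<Value> { a, b });
        result._backward = () =>
        {
            // d(a+b)/da = 1 and d(a+b)/db = 1
            a.Grad += result.Grad;
            b.Grad += result.Grad;
        };
        return result;
    }

    public static Value operator *(Value a, Value b)
    {
        var result = new Value(a.Data * b.Data, new List<Value> { a, b });
        result._backward = () =>
        {
            // d(a*b)/da = b and d(a*b)/db = a
            a.Grad += b.Data * result.Grad;
            b.Grad += a.Data * result.Grad;
        };
        return result;
    }

    // Reverse-mode autodiff: topologically sort the DAG rooted at this node,
    // seed the output gradient with 1, then run each node's local backward
    // closure in reverse topological order.
    public void Backward()
    {
        var topo = new List<Value>();
        var visited = new HashSet<Value>();
        void Build(Value v)
        {
            if (visited.Add(v))
            {
                foreach (var child in v._children) Build(child);
                topo.Add(v);
            }
        }
        Build(this);

        Grad = 1.0;
        for (int i = topo.Count - 1; i >= 0; i--) topo[i]._backward();
    }
}

// Usage: for f = a * b + a, backprop gives df/da = b + 1 and df/db = a.
// var a = new Value(2.0);
// var b = new Value(-3.0);
// var f = a * b + a;
// f.Backward();
// Console.WriteLine($"{a.Grad}, {b.Grad}");  // -2, 2
```

A neuron in the small neural-network library is then just a dot product and nonlinearity built from these scalar adds and multiplies, which is why the DAG only ever needs to operate over scalar values.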

License

MIT
