RezvanGrad.jl

A minimal automatic differentiation engine for Julia, inspired by Andrej Karpathy's micrograd.

Installation

using Pkg
Pkg.add(url="https://github.com/rezaarezvan/RezvanGrad.jl")
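
Equivalently, the package can be added from the Pkg REPL (enter it by pressing ] at the julia> prompt):

pkg> add https://github.com/rezaarezvan/RezvanGrad.jl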

Example

using RezvanGrad

# Create scalar values with autodiff tracking
x = Value(2.0)
y = Value(3.0)

# Build a computation graph
z = x^2 * y + y

# Perform backpropagation
backward(z)

# Access gradients
println("dz/dx: $(x.grad)")  # Should be 2 * 2 * 3 = 12
println("dz/dy: $(y.grad)")  # Should be x^2 + 1 = 5
