# backward-propagation

Here are 49 public repositories matching this topic...

I built the Micrograd autograd engine, a functioning neural network with a forward pass, backward propagation, and stochastic gradient descent, all written from scratch. It is derived from the great @karpathy micrograd lecture. Each notebook includes Andrej's lecture code and narration, as well as my own code, anecdotes, and additions; a minimal sketch of the core idea appears below the listing.

  • Updated Jul 21, 2024
  • Jupyter Notebook
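To illustrate what a micrograd-style engine does, here is a minimal sketch (not the repository's actual code): a scalar `Value` class that records its inputs and a local backward rule during the forward pass, then applies the chain rule in reverse topological order, followed by a single stochastic-gradient-descent update. All names are illustrative.

```python
import math

class Value:
    """A scalar that tracks the computation graph for reverse-mode autodiff."""

    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0
        self._prev = set(_children)
        self._backward = lambda: None  # local chain-rule step, set by each op

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():
            self.grad += out.grad       # d(a+b)/da = 1
            other.grad += out.grad      # d(a+b)/db = 1
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():
            self.grad += other.data * out.grad   # d(a*b)/da = b
            other.grad += self.data * out.grad   # d(a*b)/db = a
        out._backward = _backward
        return out

    def tanh(self):
        t = math.tanh(self.data)
        out = Value(t, (self,))
        def _backward():
            self.grad += (1 - t * t) * out.grad  # d tanh(x)/dx = 1 - tanh(x)^2
        out._backward = _backward
        return out

    def backward(self):
        # Build a topological order of the graph, then propagate gradients
        # from the output back to every input (backward propagation).
        topo, visited = [], set()
        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()

# Forward pass, backward pass, and one SGD step on a single neuron.
x, w, b = Value(2.0), Value(-0.5), Value(0.1)
out = (x * w + b).tanh()
out.backward()
lr = 0.01
w.data -= lr * w.grad   # stochastic-gradient-descent update
b.data -= lr * b.grad
```

In a full network the same pattern repeats across layers: every operation registers how to route gradients to its inputs, so one call to `backward()` on the loss fills in the gradient of every parameter.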

