The spelled-out intro to neural networks and backpropagation: building micrograd
Andrej Karpathy
Andrej presents a comprehensive lecture on training deep neural networks, walking through the creation and training of a neural network with his library, micrograd, and comparing it to PyTorch. Understanding and training neural networks requires mastering concepts such as backpropagation, loss functions, and gradient descent. Taught by Andrej Karpathy, a founding member of OpenAI and former senior director of AI at Tesla.
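A minimal sketch of the kind of workflow the lecture builds up to, assuming micrograd is installed from Karpathy's repository (https://github.com/karpathy/micrograd); the toy dataset, targets, and hyperparameters below are illustrative, not taken from the lecture.

```python
from micrograd.engine import Value
from micrograd.nn import MLP

# Backpropagation on a tiny expression graph: gradients of the output
# with respect to the leaf nodes a and b.
a = Value(2.0)
b = Value(3.0)
out = (a * b + a**2).relu()
out.backward()
print(a.grad, b.grad)  # d(out)/da, d(out)/db

# Gradient descent on a small MLP with a squared-error loss.
xs = [[2.0, 3.0, -1.0], [3.0, -1.0, 0.5], [0.5, 1.0, 1.0]]  # illustrative inputs
ys = [1.0, -1.0, -1.0]                                      # illustrative targets
model = MLP(3, [4, 4, 1])  # 3 inputs, two hidden layers of 4, 1 output

for step in range(20):
    preds = [model(x) for x in xs]
    loss = sum((p - y) ** 2 for p, y in zip(preds, ys))
    model.zero_grad()            # reset accumulated gradients
    loss.backward()              # backpropagate through the expression graph
    for p in model.parameters():
        p.data -= 0.05 * p.grad  # gradient descent step
    print(step, loss.data)
```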