In this video, we'll take a deep dive into backpropagation to understand how data flows forward through a neural network and how gradients flow backward. We'll learn how to break functions into individual operations, then use those operations to build a computational graph. At the end, we'll build a miniature PyTorch to implement a neural network.
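To give a flavor of the "miniature PyTorch" idea, here is a minimal sketch of a scalar autograd engine: each value records the operations that produced it, forming a computational graph that the backward pass walks with the chain rule. The `Value` class and its method names are illustrative assumptions, not the exact code from the video.

```python
class Value:
    """A scalar that records the operations used to build it (a computational graph)."""
    def __init__(self, data, children=(), grad_fns=()):
        self.data = data
        self.grad = 0.0
        self._children = children    # upstream nodes in the graph
        self._grad_fns = grad_fns    # local derivative w.r.t. each child

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        return Value(self.data + other.data, (self, other),
                     (lambda: 1.0, lambda: 1.0))

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        return Value(self.data * other.data, (self, other),
                     (lambda: other.data, lambda: self.data))

    def backward(self, grad=1.0):
        # Accumulate the incoming gradient, then pass it to each child
        # scaled by the local derivative (the chain rule).
        self.grad += grad
        for child, local in zip(self._children, self._grad_fns):
            child.backward(grad * local())

x = Value(2.0)
y = Value(3.0)
z = x * y + x   # dz/dx = y + 1 = 4, dz/dy = x = 2
z.backward()
```

Note the naive recursion re-visits shared subgraphs, which is fine for a sketch; a real engine (like PyTorch's) traverses the graph once in reverse topological order.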
Follow along with the code and written explanations here - github.com/VikParuchuri/zero_... .
Chapters:
00:00 Introduction
06:19 Staged softmax forward pass
09:24 Staged softmax backward pass
23:19 Analytic softmax
29:30 Softmax computational graph
42:26 Neural network computational graph
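As a rough sketch of what the softmax chapters cover, here is a forward pass plus the analytic backward pass. The function names are assumptions for illustration; the math (the softmax Jacobian is diag(s) - s sᵀ) is standard.

```python
import numpy as np

def softmax(x):
    # Subtract the max for numerical stability before exponentiating.
    e = np.exp(x - np.max(x))
    return e / e.sum()

def softmax_backward(s, grad_out):
    # Analytic backward pass: the Jacobian of softmax is diag(s) - s s^T,
    # so dL/dx = s * (grad_out - (grad_out . s)).
    return s * (grad_out - np.dot(grad_out, s))

x = np.array([1.0, 2.0, 3.0])
s = softmax(x)
# Gradient of s[0] with respect to x (upstream gradient selects s[0]):
g = softmax_backward(s, np.array([1.0, 0.0, 0.0]))
```

The staged version in the video breaks `softmax` into its component operations (exp, sum, divide) and backpropagates through each one; the analytic form above collapses those stages into a single expression.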
This video is part of our new course, Zero to GPT - a guide to building your own GPT model from scratch. By taking this course, you'll learn deep learning skills from the ground up. Even if you're a complete beginner, you can start with the prerequisites we offer at Dataquest.
If you're dreaming of building deep learning models, this course is for you.
Best of all, you can access the course for free while it's still in beta!
Sign up today!
bit.ly/4016NfK
Jul 11, 2024