
Backpropagation In Depth 

Dataquest
59K subscribers
3K views

In this video, we'll take a deep dive into backpropagation to understand how data flows in a neural network. We'll learn how to break functions into operations, then use those operations to build a computational graph. At the end, we'll build a miniature PyTorch to implement a neural network.
Follow along with the code and written explanations here - github.com/VikParuchuri/zero_... .
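
To give a flavor of what "building a miniature PyTorch" means, here is a rough sketch (my own illustration, not the notebook's code) of a scalar autograd value that records the operation that produced it, so gradients can flow backward through the computational graph:

```python
# A minimal sketch of a computational-graph node: each Value remembers its
# parents and a local backward rule, so the chain rule can be applied in reverse.
class Value:
    def __init__(self, data, parents=(), backward_fn=lambda: None):
        self.data = data          # scalar result of the forward pass
        self.grad = 0.0           # accumulated gradient (dLoss/dself)
        self._parents = parents   # nodes this value was computed from
        self._backward = backward_fn

    def __add__(self, other):
        out = Value(self.data + other.data, (self, other))
        def backward():
            # d(a + b)/da = 1 and d(a + b)/db = 1
            self.grad += out.grad
            other.grad += out.grad
        out._backward = backward
        return out

    def __mul__(self, other):
        out = Value(self.data * other.data, (self, other))
        def backward():
            # d(a * b)/da = b and d(a * b)/db = a
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = backward
        return out

    def backward(self):
        # Topologically sort the graph, then apply the chain rule in reverse order.
        order, seen = [], set()
        def visit(node):
            if node not in seen:
                seen.add(node)
                for p in node._parents:
                    visit(p)
                order.append(node)
        visit(self)
        self.grad = 1.0
        for node in reversed(order):
            node._backward()

# Usage: gradients of y = a * b + a with respect to a and b.
a, b = Value(2.0), Value(3.0)
y = a * b + a
y.backward()
print(a.grad, b.grad)  # 4.0 (= b + 1) and 2.0 (= a)
```

Calling backward() on the output walks the graph in reverse topological order and applies each operation's local derivative, which is the same mechanism PyTorch's autograd applies to tensors.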
Chapters:
00:00 Introduction
6:19 Staged softmax forward pass
9:24 Staged softmax backward pass
23:19 Analytic softmax
29:30 Softmax computational graph
42:26 Neural network computational graph
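
The staged and analytic softmax chapters lend themselves to a short sketch. The NumPy version below is my own illustration (assuming a numerically stable, max-shifted softmax), not the code from the video: the forward pass is broken into small stages, and the analytic backward pass applies the softmax Jacobian without ever materializing it:

```python
import numpy as np

def softmax_staged_forward(x):
    # Stage 1: shift by the max for numerical stability (exp of large values overflows)
    shifted = x - np.max(x, axis=-1, keepdims=True)
    # Stage 2: exponentiate
    exp = np.exp(shifted)
    # Stage 3: sum the exponentials
    total = np.sum(exp, axis=-1, keepdims=True)
    # Stage 4: normalize
    return exp / total

def softmax_analytic_backward(probs, grad_out):
    # Analytic form: dL/dx_i = p_i * (dL/dp_i - sum_j dL/dp_j * p_j),
    # i.e. a Jacobian-vector product without building the full Jacobian.
    dot = np.sum(grad_out * probs, axis=-1, keepdims=True)
    return probs * (grad_out - dot)

# Quick check against a finite-difference estimate for one input component.
x = np.array([0.5, -1.2, 2.0])
grad_out = np.array([1.0, 0.0, 0.0])  # pretend the loss is probs[0]
analytic = softmax_analytic_backward(softmax_staged_forward(x), grad_out)
eps = 1e-5
bumped = x.copy()
bumped[1] += eps
numeric = (softmax_staged_forward(bumped)[0] - softmax_staged_forward(x)[0]) / eps
print(analytic[1], numeric)  # these should agree to several decimal places
```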
This video is part of our new course, Zero to GPT - a guide to building your own GPT model from scratch. By taking this course, you'll learn deep learning skills from the ground up. Even if you're a complete beginner, the prerequisite courses we offer at Dataquest will get you started.
If you're dreaming of building deep learning models, this course is for you.
Best of all, you can access the course for free while it's still in beta!
Sign up today!
bit.ly/4016NfK

Published: Jul 11, 2024

Comments: 4
@vikasparuchuri · 1 year ago
You can see the code and written explanations for this video here - github.com/VikParuchuri/zero_to_gpt/blob/master/explanations/comp_graph.ipynb . And the full course is here - github.com/VikParuchuri/zero_to_gpt .
@anfedoro · 11 months ago
I actually found Andrej Karpathy's 2-hour video on his micrograd library (part of the NN: Zero to Hero course), and it's an absolutely perfect explanation for anyone who wishes to understand how a NN behaves, specifically how backpropagation works and how all the gradients are calculated on the backward pass through the computational graph. With all respect to Vik, I found Andrej's explanation a bit clearer for newbies to understand, and it also covers a bunch of general Python coding techniques I wasn't familiar with.
@Dataquestio · 11 months ago
Of course, Andrej Karpathy makes excellent tutorials, and I'm glad you found it useful. I'll think about how I can improve the backpropagation explanations in this video (maybe I'll make a second video, since Karpathy focuses more on each element of the tensors, whereas I focus more on the operations). I made this video since the knowledge in it is useful later in the Zero to GPT series. Thanks for the pointer.
@hussainsalih3520 · 7 months ago
Keep it up :)
Up next
Math And NumPy Fundamentals For Deep Learning · 43:27 · 12K views
6 ChatGPT Skills You Need to Learn · 0:59 · 889 views
Classification With Neural Networks · 43:19 · 7K views
GEOMETRIC DEEP LEARNING BLUEPRINT · 3:33:23 · 171K views
Let's build GPT: from scratch, in code, spelled out. · 1:56:20
The Forward-Forward Algorithm · 8:52 · 16K views
Project Walkthrough: Building a Power BI App · 43:24 · 4.5K views
Gradient Descent From Scratch In Python · 42:39 · 17K views
Create a Deep Learning API with Python and FastAPI · 34:16