
RNN From Scratch In Python 

Dataquest
We'll build a recurrent neural network (RNN) in NumPy. RNNs can process sequences of data, like sentences. We'll start with the theory of RNNs, then build the forward and backward pass in NumPy.
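As a preview, here is a minimal sketch of the kind of loop the forward pass builds. The names i_weight, h_weight, and o_weight follow the lesson's convention, but the shapes and details here are illustrative assumptions, not the lesson's exact code:

    import numpy as np

    def rnn_forward(x, i_weight, h_weight, h_bias, o_weight, o_bias):
        # x: (seq_len, input_size); one hidden state per time step (assumed shapes).
        seq_len = x.shape[0]
        hiddens = np.zeros((seq_len, h_weight.shape[0]))
        outputs = np.zeros((seq_len, o_weight.shape[1]))
        prev = np.zeros(h_weight.shape[0])  # the hidden state starts at zero
        for j in range(seq_len):
            input_x = x[j] @ i_weight                           # project input into hidden space
            prev = np.tanh(input_x + prev @ h_weight + h_bias)  # mix in the previous hidden state
            hiddens[j] = prev
            outputs[j] = prev @ o_weight + o_bias               # project hidden state to output
        return hiddens, outputs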
You can find a text version of this video here - github.com/VikParuchuri/zero_... .
And all of the previous lessons here - github.com/VikParuchuri/zero_... .
Chapters
0:00 RNN overview
6:32 Step by step forward pass
15:10 tanh activation function
19:23 Full forward pass
22:59 Full backward pass
39:43 Complete implementation
This video is part of our new course, Zero to GPT - a guide to building your own GPT model from scratch. By taking this course, you'll learn deep learning skills from the ground up. Even if you're a complete beginner, you can start with the prerequisites we offer at Dataquest.
If you're dreaming of building deep learning models, this course is for you.
Best of all, you can access the course for free while it's still in beta!
Sign up today!
bit.ly/4016NfK

Published: 11 Jul 2024

Comments: 26
@vikasparuchuri 1 year ago
Hi everyone! You can find the lesson for this video here - github.com/VikParuchuri/zero_to_gpt/blob/master/explanations/rnn.ipynb . And the full list of lessons in this series is here - github.com/VikParuchuri/zero_to_gpt .
@user-pb9nc4by2k 4 months ago
Amazing. Every RNN tutorial I've seen is just an implementation in PyTorch or TensorFlow with a quick, vague picture of a rolled and unrolled diagram (and this includes paid courses). This is the first video where I understand how the RNN could process the incoming hidden-layer data from the previous iteration.
@minkijung3 9 months ago
Thank you for this amazing tutorial. I learned a lot about RNN🙏🏻
@goodmusic284 1 year ago
Thank you so much. This is by far the best explanation of RNNs I have seen.
@Dataquestio 1 year ago
Thanks a lot! I'm planning to release some more deep learning vids soon :) -Vik
@CarlosPfister 1 year ago
Thanks for continuously offering up free content, even to non-students
@wernerpjjacob6499 4 months ago
Very didactic, very good man! I can only thank you
@user-xe2xd2qi4u 5 months ago
Thanks for your high quality tutorial.
@gk4539 4 months ago
Just a note for any subsequent videos: when you were pointing at the screen, it was not visible in the video. It would be helpful if we knew where you were pointing!
@phamquang5535 2 months ago
thanks, literal life saver
@BagusSulistyo 1 year ago
thanks this is awesome 🤟
@AurL_69 4 months ago
thank you !!!!!
@nicolasndour9851 1 year ago
Thanks for this presentation! Could I have a clear explanation of the dimensions of i_weight, h_weight, and o_weight? Thanks in advance.
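For readers with the same question: in a single-layer RNN like this one, the conventional shapes are i_weight (input_size, hidden_size), h_weight (hidden_size, hidden_size), and o_weight (hidden_size, output_size). A quick sketch with illustrative sizes (assumptions, not the lesson's values):

    import numpy as np
    input_size, hidden_size, output_size = 3, 4, 1         # example sizes only
    i_weight = np.random.randn(input_size, hidden_size)    # input -> hidden projection
    h_weight = np.random.randn(hidden_size, hidden_size)   # hidden -> hidden recurrence
    o_weight = np.random.randn(hidden_size, output_size)   # hidden -> output projection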
@usernameispassword4023 24 days ago
Shouldn't it be 1 - hiddens**2 for the tanh derivative?
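For reference, the derivative of tanh(x) is 1 - tanh(x)**2, so if hiddens stores the tanh outputs, 1 - hiddens**2 is the standard local gradient. A quick numerical check (a sketch, not the lesson's code):

    import numpy as np
    x = np.linspace(-2.0, 2.0, 5)
    hiddens = np.tanh(x)
    analytic = 1 - hiddens ** 2                                # derivative via the tanh output
    numeric = (np.tanh(x + 1e-6) - np.tanh(x - 1e-6)) / 2e-6   # central finite difference
    print(np.allclose(analytic, numeric))                      # prints: True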
@vubanchowdhury2204 1 year ago
For multi-layer RNNs, isn't the output of the first layer supposed to be the input to the second layer, and so on? From what I understand, the code is written so that multiple RNN layers all take the same input sequence (from the original data) rather than the output of the previous layer. Could you please elaborate on this?
@Dataquestio 1 year ago
Yeah, you're right - I was using single-layer RNNs in this video, so I didn't consider the multi-layer case closely. You would just need to adjust this loop to take in the previous layer's output instead of x:

    for j in range(x.shape[0]):
        input_x = x[j,:][np.newaxis,:] @ i_weight
        hidden_x = input_x + hidden[max(j-1,0),:][np.newaxis,:] @ h_weight + h_bias
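In full, that adjustment might look like the sketch below; the layers list, tanh placement, and names are illustrative assumptions, not the lesson's exact code. Each layer's hidden states become the next layer's input:

    import numpy as np

    def forward_stacked(x, layers):
        # layers: list of (i_weight, h_weight, h_bias) tuples, one per RNN layer.
        layer_input = x
        for i_weight, h_weight, h_bias in layers:
            hidden = np.zeros((layer_input.shape[0], h_weight.shape[0]))
            for j in range(layer_input.shape[0]):
                input_x = layer_input[j, :][np.newaxis, :] @ i_weight
                prev = hidden[max(j - 1, 0), :][np.newaxis, :]  # still all zeros when j == 0
                hidden[j, :] = np.tanh(input_x + prev @ h_weight + h_bias)
            layer_input = hidden  # the next layer consumes this layer's hidden states
        return layer_input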
@vubanchowdhury2204 1 year ago
@@Dataquestio Thanks for clarifying!
@waisyousofi9139 1 year ago
Thanks! Where can I find the next video? I mean, where is the testing step, where we can provide our own input data?
@anfedoro 1 year ago
Which tool do you use to draw such fancy diagrams? 😀
@Dataquestio 1 year ago
I use a tool called Excalidraw! Highly recommend it.
@anfedoro 1 year ago
@@Dataquestio Thanks! I've installed the Excalidraw extension in VS Code and draw right there, with no need for the online web tool.
@jonathanlowe2552 1 year ago
Can you please indicate where the CSV file is found?
@Dataquestio 1 year ago
It's in the code I linked to - github.com/VikParuchuri/zero_to_gpt/blob/master/explanations/rnn.ipynb . If you check the data folder (the same directory the notebook opens it from), you'll find it - github.com/VikParuchuri/zero_to_gpt/tree/master/data .
@jonathanlowe2552 1 year ago
@@Dataquestio Thank you!
@kadenabet 9 months ago
I found this insightful, but I'm very irritated with Python and its eccentricities. For instance, in the implementation section, what is params doing there? It looks like a completely useless variable. Should you not update the layers?