
LSTM Recurrent Neural Network (RNN) | Explained in Detail 

Coding Lane
26K subscribers · 56K views

LSTM Recurrent Neural Network is a special version of the RNN model. It stands for Long Short-Term Memory. The simple RNN has a problem: it cannot remember context across a long sentence because it quickly loses information, and that is why the simple RNN has only short-term memory.
LSTM has both long-term and short-term memory, so it can retain contextual information for a long time.
LSTM has 2 internal states:
1.) Memory Cell State, which acts as long-term memory
2.) Hidden State, which acts as short-term memory
The main working components in LSTM are gates. There are 3 types of gates in LSTM:
1.) Forget Gate
2.) Input Gate
3.) Output Gate
In the video, we cover the LSTM Recurrent Neural Network in detail.
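To make the two states and three gates above concrete, here is a minimal NumPy sketch of a single LSTM time step. It is illustrative only, not code from the video; the weight and bias names (W_f, b_f, and so on) and the tiny random-weight demo are assumptions made for this example.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x_t, h_prev, c_prev, params):
    """One LSTM time step: returns the new hidden state (short-term) and cell state (long-term)."""
    W_f, b_f, W_i, b_i, W_o, b_o, W_c, b_c = params
    z = np.concatenate([h_prev, x_t])        # previous hidden state stacked with current input

    f_t = sigmoid(W_f @ z + b_f)             # forget gate: what to erase from the cell state
    i_t = sigmoid(W_i @ z + b_i)             # input gate: how much new information to write
    o_t = sigmoid(W_o @ z + b_o)             # output gate: what part of memory to expose
    c_tilde = np.tanh(W_c @ z + b_c)         # candidate values for the cell state

    c_t = f_t * c_prev + i_t * c_tilde       # long-term memory (cell state) update
    h_t = o_t * np.tanh(c_t)                 # short-term memory (hidden state) update
    return h_t, c_t

# Tiny usage example with random weights (hidden size 4, input size 3)
hidden, inp = 4, 3
rng = np.random.default_rng(0)
params = []
for _ in range(4):                           # forget, input, output, candidate (W, b) pairs
    params += [rng.standard_normal((hidden, hidden + inp)) * 0.1, np.zeros(hidden)]
h, c = np.zeros(hidden), np.zeros(hidden)
h, c = lstm_step(rng.standard_normal(inp), h, c, params)
print(h.shape, c.shape)                      # (4,) (4,)

The element-wise update c_t = f_t * c_prev + i_t * c_tilde is what lets the cell state carry context over many time steps: the forget gate decides what to keep, and the input gate decides what to add.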
➖➖➖➖➖➖➖➖➖➖➖➖➖➖➖
Timestamps:
0:00 Intro
1:36 Problem with RNN
5:30 LSTM Overview
7:42 Forget Gate
10:39 Input Gate
13:39 Equations and other details
16:41 Summary of LSTM
18:23 LSTM through different times
19:01 End
➖➖➖➖➖➖➖➖➖➖➖➖➖➖➖
📕📄 Quiz: forms.gle/no29DhL1pF1dsFw28
➖➖➖➖➖➖➖➖➖➖➖➖➖➖➖
Follow my entire playlist on Recurrent Neural Networks (RNN):
📕 RNN Playlist: • What is Recurrent Neur...
➖➖➖➖➖➖➖➖➖➖➖➖➖➖➖
✔ CNN Playlist: • What is CNN in deep le...
✔ Complete Neural Network: • How Neural Networks wo...
✔ Complete Logistic Regression Playlist: • Logistic Regression Ma...
✔ Complete Linear Regression Playlist: • What is Linear Regress...
➖➖➖➖➖➖➖➖➖➖➖➖➖➖➖
If you want to ride on the Lane of Machine Learning, then Subscribe ▶ to my channel here: / @codinglane

Published: Aug 1, 2024

Comments: 68
@ikraamhanif7966 · 13 days ago
I wish I had studied this earlier; I have my exam today, and watching your videos makes it so easy to understand. What can I even say... Thank you so much for providing us knowledge like this.
@CodingLane · 11 days ago
Hi, I am elated by your words. Glad it was helpful.
@ivana_ftn · 1 year ago
You are better than my professor, thank you
@rajkamal1705 · 1 year ago
Very easy to understand. You are better than many professors. Thanks bro.
@MurodilDosmatov · 13 days ago
Thousands of thanks for your effort in making this video tutorial
@CodingLane · 11 days ago
Happy that it was valuable. Thanks for the compliment!
@jayhu6075 · 1 year ago
I am very glad to have found your channel. You make this topic so understandable for a beginner like me. Hopefully a follow-up will show how to write this in Python code. Many thanks.
@pavangoyal6840 · 1 year ago
Excellent. Thanks for this video and for explaining a complex concept like LSTM in a very short and crisp way.
@muhammadumarwaseem · 5 months ago
Thank you for the detailed, to-the-point explanation.
@KulkarniPrashant · 3 months ago
Clear and concise explanation, thank you!
@Thing1Thing11 · 8 months ago
Thank you so much! I was so lost and you really helped me get to grips with what is going on
@Bunches_of_Entertaiment · 1 year ago
Superb explanation, brother... thank you so much 😍. I got a very clear understanding of LSTM as well as RNN
@eng.mohamedemam6489 · 1 year ago
From a man from Egypt, a big thanks to you ❤❤
@CodingLane · 1 year ago
You're welcome
@rezamohammadi1140 · 6 months ago
Very helpful because of the mathematical explanation and the summary at the end
@srishti6637 · 7 months ago
Best explanation, with no unnecessary rambling; it made the topic crystal clear
@CodingLane · 7 months ago
Thanks! 😁
@s8x. · 3 months ago
Brother, you made learning machine learning so easy. When I have money, I will be sure to show my thanks
@kavoshgar9733 · 4 months ago
You describe everything very well ✌️
@user-st1gw5qc5b · 1 year ago
Worthy explanation!
@user-rr6ed2ng8b · 4 months ago
Amazing, great work
@suhaibfarooq8253 · 24 days ago
Very good explanation
@tridibeshmisra9426 · 8 months ago
Best explanation... it helped me for my end-sem exam. Thank you, sir. Keep creating... let's get riding 🙂🙂
@arvinflores5316 · 2 years ago
That was a good binge, man. Hopefully attention/transformers will be covered too!
@CodingLane · 2 years ago
Thanks for the suggestion... I will try to cover those too
@user_userovich · 1 year ago
Thank you very much!
@7aanusha885 · 1 year ago
Thank you for this video
@juditmaymo9714 · 1 year ago
Love this video!!
@vinayakmane7569 · 1 year ago
Remarkable explanation, keep bringing good content. Just one little suggestion: try to write key points on the board while explaining, so that we can copy them; it will help us while revising
@dmg8529 · 1 year ago
I am so happy now. Thanks!
@abhinavbuddala4620 · 1 year ago
😂
@Animelover-oo7cz · 3 months ago
You are the best, thank you
@slingshot7602 · 4 months ago
Thanks a lot
@deepurachakonda28 · 1 year ago
You made my day... thanks a lot
@CodingLane · 1 year ago
My pleasure 😊
@saatvikkool · 2 years ago
You're doing a great job bro ✌️❤️
@CodingLane · 2 years ago
Thank you so much 😊
@this_is_the_baat · 21 days ago
Thanks my dear bro
@VenomOmpi · 2 months ago
Thanks a lot!
@user-iv9bc1fs9c · 8 months ago
Thank you
@harshitraj8409 · 2 years ago
Great explanation
@CodingLane · 2 years ago
Thank you
@evildead3734 · 2 years ago
Great resources 🙌
@CodingLane · 2 years ago
Thanks
@pothusandeepkomal1006 · 2 years ago
Thanks man... very helpful... cheers!!!
@CodingLane · 2 years ago
Cheers!!
@pothusandeepkomal1006 · 2 years ago
@@CodingLane Could you please make videos on GRUs, seq2seq, attention models, and Transformers?
@CodingLane · 2 years ago
@@pothusandeepkomal1006 I am going to upload those videos, but it will take some time. Right now, I am busy with my exams, so I will upload them when I get time. 😇
@shaveta01 · 4 months ago
Good video. Please use a white marker 😊 instead of red
@pavangoyal6840 · 1 year ago
For time-series data with around 60 input variables and two output variables, which deep learning model would be best: LSTM? Here accuracy matters; training time does not. And how should an LSTM model be designed for 2 output variables?
@charanm1773 · 1 year ago
Arigato (thank you)
@tahsinulkarim · 1 year ago
❤❤❤
@martinant44 · 2 months ago
I love you very much
@mainakmukherjee3444 · 9 months ago
Hello brother, thanks for the informative content. But can you please explain why you say f_t (forget gate) is a matrix? From the formula, it is just the output of a sigmoid function, which I think is a scalar value for each time step. Please explain this part. 🙏🙏
@user-pj9ue2ct8l · 5 months ago
Hi! I want to learn text detection from images using RNNs. Please help if you can.
@achreflegend4335 · 7 days ago
5:00 The green formula: shouldn't it be a sum of products over 0..t instead of just a plain sum?
@rohitpol4963 · 2 months ago
What are the Bc, Bf, Bi, Bo added everywhere? Are those biases?
@forsakennawfal546 · 24 days ago
Biases, or as we can say, the constant terms added in every gate function.
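For reference, here is the standard form in which these bias terms appear (standard LSTM notation, not quoted from the video): each gate applies a sigmoid to a weighted combination of the previous hidden state and the current input plus its bias, e.g. for the forget gate

f_t = \sigma(W_f [h_{t-1}, x_t] + b_f)

with b_i, b_o, and b_c playing the same role in the input gate, output gate, and candidate cell state.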
@vishnusit1 · 1 year ago
At 8:45 you are wrong about the matrix multiplication
@beshosamir8978 · 1 year ago
Hi, I hope one day you explain Transformers, because your explanations are great. But I really need your help with something, because I am tired of searching for the answer, got stuck, and it seems like no one can help. I studied LSTM and Bi-LSTM and understood them well, but I read some blogs saying that bidirectional LSTM is good for sentiment analysis and time series, and that really confused me. How can it be useful? It would only help if my current prediction depended on what happens in the future. So how can it be useful in sentiment analysis, where I make the final prediction only at the last word, so there is no "future" left? I know it can be useful in some applications like named entity recognition, where there are many outputs, so the current output may depend on what comes later. I really hope you can help, because I didn't find any reason after 2 hours of searching on Google.
@CodingLane · 1 year ago
Hi, a bidirectional LSTM looks at all the words in an input sentence from both directions, front to back and back to front. So you can assume that a bidirectional LSTM can be useful for any application involving sentences as input. I don't have any more reason for this, for now.
@beshosamir8978 · 1 year ago
@@CodingLane So, does my question make sense? Or is there something about the intuition that I am still not understanding?
@tanvirtanvir6435 · 1 year ago
3:22 problem with RNN
@aitoquantum · 10 months ago
The summation formula is wrong! There should be a product term along with the plain summation; that is what unfolds the longer terms...
@slingshot7602 · 4 months ago
GRU???
@shantanusingh2198 · 6 days ago
Probably the worst video on LSTM I have seen
@iftikharullah3616 · 7 months ago
I watched many videos on this topic but couldn't understand it. You made every point clear in a beautiful way 🫡