
Neural Networks from Scratch - P.4 Batches, Layers, and Objects 

sentdex
1.3M subscribers
316K views

Neural Networks from Scratch book: nnfs.io
NNFSiX Github: github.com/Sentdex/NNfSiX
Playlist for this series: • Neural Networks from S...
Neural Networks IN Scratch (the programming language): • Neural Networks in Scr...
Python 3 basics: pythonprogramming.net/introdu...
Intermediate Python (w/ OOP): pythonprogramming.net/introdu...
Mug link for fellow mug aficionados: amzn.to/2KFwsWn
Channel membership: / @sentdex
Discord: / discord
Support the content: pythonprogramming.net/support...
Twitter: / sentdex
Instagram: / sentdex
Facebook: / pythonprogramming.net
Twitch: / sentdex
#nnfs #python #neuralnetworks

Published: 9 Jun 2024

Comments: 955
@Yguy 4 years ago
I swear I am addicted to these more than Netflix.
@MasterofPlay7 4 years ago
lol you are weird xD
@usamanadeem7974 4 years ago
I literally wait for YouTube's notification telling me Sentdex dropped a new tutorial 😂
@ossamaganne5851 4 years ago
@@usamanadeem7974 me too brother ahahahah
@NicolasKaniak 4 years ago
same here
@Blahcub 3 years ago
cringe
@AaditDoshi 4 years ago
I don't even look at my calendar anymore. My week ends when sentdex drops a video.
@akiratoriyama1320 4 years ago
My week starts when he drops a video!!
@pushkarajpalnitkar1695 4 years ago
@@akiratoriyama1320 The countdown starts today for the next video.
@josephastrahan6403 4 years ago
Agreed, I'm waiting patiently also :)
@shauryapatel8372 4 years ago
I think you might have to make new days, cuz it is taking 2 weeks for his next vid.
@soroushe6394 4 years ago
I'm glad I'm living at a time when people like you share their knowledge in such quality for free. Thank you 🙏🏻
@slamsandwich19 1 year ago
I was going to say the same thing
@MeinGoogleAccount 7 months ago
Yes, absolutely. I come from a time when programming meant you bought a book that was actually outdated the moment you bought it. Thank you 🙂
@sayanguha5570 4 years ago
Every time I see a neural network tutorial it starts with "import tensorflow as tf" without giving a shit about the basics... but this is a very detailed, basics-clearing video, truly from scratch... THANK YOU FOR THE GOOD WORK
@lucygaming9726 4 years ago
I agree with you, although you can check out Deeplearning.ai on Coursera. It's pretty good.
@aleksszukovskis2074 3 years ago
@@lucygaming9726 No thanks. I'm too poor for that.
@janzugic6798 3 years ago
@@aleksszukovskis2074 It's free and by Andrew Ng, the legend.
@aleksszukovskis2074 3 years ago
@@janzugic6798 thanks
@supernova6553 2 years ago
@@janzugic6798 You need a Coursera subscription ($49/mo) after the 7-day trial period, regardless of the course being free.
@sentdex 4 years ago
Errata: 16:17: initially this anim was incorrect when I recorded. We fixed the anim, but not the audio, resulting in me reading the first row of values incorrectly. We're adding row vectors here, so the anim is correct, the words are not. =]
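A minimal NumPy sketch of the row-vector addition the errata describes: the biases form a row vector, and broadcasting adds that row to each row of the batch's dot-product result. The numbers below are purely illustrative, not the exact values from the animation.

    import numpy as np

    # Purely illustrative numbers, not the ones from the animation.
    dot_product = np.array([[1.0, 2.0, 3.0],    # one row per sample in the batch
                            [4.0, 5.0, 6.0]])
    biases = np.array([2.0, 3.0, 0.5])          # one bias per neuron, as a row vector

    # NumPy broadcasting adds the bias row to EACH row of the dot product:
    print(dot_product + biases)
    # [[3.  5.  3.5]
    #  [6.  8.  6.5]]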
@usejasiri 4 years ago
Please clarify the concept of the Gaussian distribution that you introduced when talking about np.randn.
@anjali7778 4 years ago
If I draw a neural network of 12 inputs mapping into 3 outputs and connect each neuron to the outputs, there will be 36 lines in total. That means there have to be about 36 weights, but the weights you took had only 12 weights in the array; how is that possible?
@mayaankashok2604 3 years ago
@@anjali7778 He has only 4 inputs to the output layer, therefore the number of weights = 4*3 = 12. If instead you have 12 inputs, you will get 12*3 = 36 weights.
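A rough sketch of those shapes, in the spirit of the layer object introduced in this part (the variable names and the 0.10 scaling are assumptions, not necessarily the exact code): np.random.randn samples from a standard normal (Gaussian) distribution, and a layer with 4 inputs and 3 neurons ends up with 4*3 = 12 weights; 12 inputs would give 36.

    import numpy as np

    # Assumed shapes/scaling for illustration: 4 inputs feeding 3 neurons.
    n_inputs, n_neurons = 4, 3
    weights = 0.10 * np.random.randn(n_inputs, n_neurons)  # Gaussian-distributed starting values

    print(weights.shape)  # (4, 3)
    print(weights.size)   # 12 -- one weight per input/neuron connection
    # With 12 inputs instead: np.random.randn(12, 3) -> 36 weights.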
@fincrazydragon 11 months ago
Am I wrong, or is there something missing around the 9:08 point?
@dragonborn7152 10 months ago
Question: why did we need to transpose weights2, since they are both 3x3 matrices? Index 1 of one would equal index 0 of the other, right?
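A hedged sketch (illustrative numbers only, not the ones from the video) of why the transpose still matters even when the weight matrix is square: the shapes line up either way, but without the .T each sample gets dotted with the weight columns instead of each neuron's row of weights, so the values come out different.

    import numpy as np

    # Illustrative 3-sample batch of layer-1 outputs (3 neurons wide).
    layer1_out = np.array([[0.5, -1.0, 2.0],
                           [1.5,  2.0, 3.0],
                           [-0.5, 1.0, 0.0]])
    # Hypothetical weights2: one row of weights per layer-2 neuron.
    weights2 = np.array([[0.1, -0.2, 0.3],
                         [0.4,  0.5, -0.6],
                         [-0.7, 0.8, 0.9]])

    with_t = np.dot(layer1_out, weights2.T)   # samples dotted with each neuron's weight row
    without_t = np.dot(layer1_out, weights2)  # samples dotted with weight COLUMNS instead

    print(with_t.shape == without_t.shape)    # True: both are (3, 3)
    print(np.allclose(with_t, without_t))     # False: same shape, different values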
@Blendersky2 1 year ago
Just imagine if we had tutorials like these on all the AI and machine learning topics, and also on probability and statistics... man, every few minutes in the video I try to scroll the video list up and down with the hope that there will be 700 more videos like these, but it shows only 7 videos. Amazing work, I will order your book now. Appreciate your dedication and hard work.
@amogh3275 4 years ago
Bruh this visualisation... It's unreal 🔥
@Saletroo 4 years ago
ASMR for the eyes, thanks Daniel!
@amogh3275 4 years ago
@@Saletroo ikr 😂😂
@usamanadeem7974 4 years ago
The thing I love about you is just how beautifully you explain concepts, with immaculate animations, and then literally make such complex tasks seem so easy! Gonna make my kids watch your tutorials instead of cartoons one day ♥️😂
@ramyosama8088 4 years ago
Please continue with this playlist. This is hands down the best series on YouTube right now!!!
@sentdex 4 years ago
No plans to stop any time soon!
@knowit3887 4 years ago
U r just ... God for teaching programming... I am glad to have u as a teacher... 💪
@aoof6742 4 years ago
I really appreciate you doing this, mate. I really wanted to learn neural networks and you are explaining this so well.
@sentdex 4 years ago
Glad to hear it!
@prathamkirank 4 years ago
These are the online classes we all deserve.
@theoutlet9300 4 years ago
Better than most Ivy League schools.
@Gorlung 3 years ago
This is actually the first NN tutorial during which I haven't fallen asleep... PS: thank you for explaining some of the things twice!
@kaustubhkulkarni 4 years ago
I'd kind of given up on understanding ML and NN. Then I saw Neural Networks from Scratch, and Sentdex CANNOT make this any easier. Loving this series.
@7Trident3 4 years ago
I banged my head on numerous videos too. They assume a level of knowledge that was hard to piece together. This series is filling lots of gaps for me. The concepts are starting to gel; this whole field is fascinating!! Kind of empowering.
@asdfasdfasdf383 4 years ago
You have created one of the best series on this topic I have found on the internet. The explanations include everything, yet you still proceed at a fast, steady pace.
@ambarishkapil8004 4 years ago
I know I have said this before, but I am going to say it again, and keep on saying it as long as you continue to make such awesome tutorials. Thank you!
@prathamprasoon2535 4 years ago
This is awesome! Finally, a series on neural nets I can understand easily.
@ginowadakekalam 4 years ago
This channel is so good that you'll never find any negative comments.
@sentdex 4 years ago
They are there sometimes :) but yes, fairly rare.
@usejasiri 4 years ago
-comment
@bas_kar_na_yar 4 years ago
I wish anyone had ever taught me any concept the way you do...
@chaosmaker781 2 years ago
This is better explained, and with more quality, than any neural network video where the concept is mostly shown just through the code.
@codiersklave 1 year ago
Still one of the best series on YouTube to learn the basics of neural networks... fast!
@rrshier 3 years ago
At about 14:51, where you present the matrix multiplied by the vector, the proper mathematical notation would be to have the vector as a column vector, as well as the output vector being a column vector. This is truly how the matrix multiplication is able to work, because a vector is really just a matrix where one of the dimensions is equal to 1. Other than that, I have to admit, these are my FAVORITE AI/ML videos yet!!!
@pensivist 9 months ago
I was looking for this comment. Thanks for pointing that out!
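A small NumPy illustration of the column-vector point above, using the weights and input recalled from earlier in the series (treat the exact numbers as assumptions): a 1D array works directly, and reshaping it into an explicit (4, 1) column gives the same numbers back as a column.

    import numpy as np

    # Weights/input as recalled from the series; the exact values are incidental here.
    weights = np.array([[0.2, 0.8, -0.5, 1.0],
                        [0.5, -0.91, 0.26, -0.5],
                        [-0.26, -0.27, 0.17, 0.87]])
    inputs = np.array([1.0, 2.0, 3.0, 2.5])

    print(np.dot(weights, inputs).shape)   # (3,)   -- 1D in, 1D out
    column = inputs.reshape(4, 1)          # explicit column vector, shape (4, 1)
    print(np.dot(weights, column).shape)   # (3, 1) -- same numbers, written as a column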
@kenbinner 4 years ago
I'm really glad you took the time to break down this concept step by step; it will surely reduce the number of headaches in the future! Thank you for your great content, looking forward to the next one. 😄
@Alex-ol9dk 1 year ago
I have never bought a book from YouTube before, but you will be the first. You've earned it. Absolutely love this work. Please keep it up.
@aav56 2 years ago
I've never learned linear algebra and I'm astounded at how simple you made matrix multiplication out to be!
@keshavtulsyan7515 4 years ago
Feels like learning all day; it never felt so simple before... thanks a lot 🙏🏻
@chaks2432 3 years ago
This is my first time learning about neural networks, and you're doing a great job at explaining things in an easy-to-understand way.
@carlossegura403 3 years ago
Back when I was learning the concepts behind building a network, most tutorials went straight into the maths; while that is fine, what I wanted to understand was the different compositions from the input to the output. This video was what I was looking for back then, before going deep into the theory and methodology. Great content!
@jonathantribble7013 4 years ago
Friend: "So what do you do in your free, unwind, leisure time?" Me: "Neural Networks From Scratch" Friend: "..."
@alexgulewich9670 3 years ago
Sister: "If that's informative, then what's educational?" Me: "Glad you asked!" *starts to explain neural networks and basic QP* Sister: "NO! Make it stop!" *Never asks again*
@stevenrogersfineart4224 3 years ago
Story of my life 😁
@Alfosan2010 4 years ago
Last time I was this early, Corona was just a beer brand...
@littlethings-io 3 years ago
Just ordered the book - can't wait to dive into it. Thank you, this is good stuff and a priceless contribution to the evolution of this area of science.
@jsnadrian 4 years ago
I can't believe you created this course - absolutely fantastic and wonderfully thoughtful in its layout - thanks so much.
@yabdelm 4 years ago
This is the best series by far I've ever seen. Just what I was looking for. I wonder if you'll get into explaining the why also. For instance, oftentimes when I'm watching I do wonder: "Why do we even have biases? What function do they serve? How do they enhance predictions? What sort of history/science/neuroscience underlies that, and where do AI and neuroscience part ways, if so? Why does all of this work at all?"
@asongoneal28 4 years ago
Youssef, I really hope @sentdex reads this ;)
@naseemsha3010 4 years ago
I think it was explained in a previous video how biases help in making predictions. Check out the last video, guys.
@carloslopez7204 4 years ago
He explained that in previous videos, but not all your questions.
@yabdelm 4 years ago
@@carloslopez7204 I agree it was explained a bit, but I really didn't feel the explanation gave me a deep understanding of the why, unfortunately, just a very rough surface level and a vague hint of what might be going on.
@liadinon1134 4 years ago
I think that right now some things, like the biases, don't make sense. But when you get into training (the learning process) it all starts to make sense.
@lemoi6462 4 years ago
The interesting part will be the backward propagation; I'm really looking forward to this.
@hasneetsingh 3 years ago
Your explanations are so clear. I really appreciate the hard work you've put in to design this series and make such complex topics so much fun to learn :). Enjoying it a lot.
@devinvenable4587 9 months ago
I'm watching this as a refresher, as I studied this topic a few years ago, and I find the context you provide really useful. Thanks!
@classicneupane6196 4 years ago
Understood batch size finally.
@sentdex 4 years ago
Glad we could help!
@JackSanRio 4 years ago
I pre-ordered the book because this is interesting and I am eager to learn more.
@peppep4426 4 years ago
This reminds me of the best TV series... You finish one episode and look forward to the next... Good job!
@bradley1995 10 months ago
I just want to again say thank you so much for these videos. They are top notch. They truly have helped me get a deep understanding compared to what many other "tutorials" have. Plus, all this information is being provided for free. I feel blessed!
@shubhamdamani1057 4 years ago
Can you please provide a visual representation of how the batches pass along? I mean using animation with bubbles and lines like you did in the initial videos.
@patrickvieira9200 4 years ago
Well, finally it looks like my linear algebra class was not a waste of time at all.
@clementsiow176 4 years ago
Never have I been so excited for a new YouTube video; you have earned my respect.
@yuwankumar 1 year ago
After many searches I found this playlist! Thank you for making this gold.
@harikalatheeswaran9206 4 years ago
For people watching this video... remember this golden rule: say we have two matrices A and B. In order to multiply A with B, i.e. A.B, the number of columns of matrix A should be equal to the number of rows of matrix B. That's also why A.B != B.A in general. Amazing video 👍! Thanks a lot! Keep up the amazing work!
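A quick NumPy check of that rule, with arbitrarily chosen shapes: A.B is defined when A's column count matches B's row count, and flipping the order here is not even defined.

    import numpy as np

    A = np.random.randn(3, 4)   # 3 rows, 4 columns
    B = np.random.randn(4, 2)   # 4 rows, 2 columns

    print(np.dot(A, B).shape)   # (3, 2): the inner dimensions (4 and 4) match

    try:
        np.dot(B, A)            # B has 2 columns, A has 3 rows -- no match
    except ValueError as err:
        print(err)              # shapes (4,2) and (3,4) not aligned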
@thenotsogentlecat5847 3 years ago
Sentdex: we're arriving at the sexy parts... Python: Oh, yes I am ;)
@kelpdock8913 3 years ago
x = we're arriving at the sexy parts... print(x)
@tymion2470 24 days ago
I'm very thankful for this series; I've just learned so many new things because you're so good at explaining, and there are still 5 more videos to watch!
@lonnie776 4 years ago
You are doing a great job explaining these concepts in a way that is easy to understand. I can't wait for the next part, so I am ordering the ebook. Great job.
@DRIP-DRIP-DRIP 4 years ago
Never clicked on a video so quickly.
@franky0226 4 years ago
Notification => nnfs P4. Me: clicks on the button faster than the speed of light.
@gurns681 2 years ago
Mate, this series is unreal! Love your work.
@TNTeon 8 months ago
Hey, just to let you know, this video 3 years later continues to help and encourage new programmers! I'm in my freshman year of high school doing all gen-ed courses, but I started working on this tutorial in my free time and I'm having a blast and actually understanding everything perfectly. Just wanted to say thank you so much for really helping people like me in our learning of computer science and machine learning! These are awesome and super enjoyable!
@dippy9119 3 years ago
6:09 what's a fitment line? Google isn't helping me.
@amogh3275 4 years ago
16:19 you said it the other way around by mistake... shouldn't it be 2.8+2, 6.8+2, -0.59+2...
@fl7977 4 years ago
Yeah, that really confused me more than it should have.
@bipanbhatta2736 4 years ago
Yes. It is called broadcasting.
@3alabo 4 years ago
One of the best tutorials I have seen on the topic. Greetings from Argentina!
@merth17 4 years ago
I can't wait to see the implementation of backpropagation with the chain rule; it's so simple when you teach it. Tysm.
@time2learn123 1 year ago
Why does the dot product switch inputs and weights when working with batches? E.g. when the input is a 1D array, the calculation in the code is np.dot(weights, inputs), but for a batch it is np.dot(inputs, transposed_weights). Why doesn't it work when we transpose the inputs instead? I'm sure I'm missing something simple. Thanks for the videos, they are amazing!
@joelgerlach9406 1 year ago
Because matrix multiplication is not commutative.
@MrGeordiejon 1 year ago
I think it is the nature of what we are doing: we are taking inputs, applying weights and biases, and delivering outputs, or entering/exiting a decision, so we can't use an entrance to a neuron to exit another neuron. I think the demonstrations by Harrison are to cement the concept and an awareness of the ValueError about shapes... and he also showed how multiplication works between an array and a vector. I went back to lesson 3 for 2 things: 1. I like inputs being the first entry so 'my doors' are labelled correctly. 2. Use the np.array().T in that example. If he had not shown us 3 before 4, I would have found it harder to appreciate transpose(). I don't think I will ever just reverse the args when I am coding this stuff. import this
@413blaze 1 year ago
Something that I find interesting, and that I think might have to do with this, is that specifically in the case of 1-dimensional arrays the shape is different. I am used to thinking of a matrix as rows by columns. For example, [[2,2],[3,3]] would be a 2 by 2 matrix: 2 rows and 2 columns. However, take the example [1,2,3,4]: I would have expected its shape to be 1 by 4 (1 row and 4 columns), but it is not: the shape of [1,2,3,4] is (4,). So the way to think about it is by elements in a list of lists: the first entry x in shape (x, y) is how many lists are in the list, and the second entry y is how many entries are within each of those lists. In his example, for the first inputs [1,2,3,4] the shape is (4,), and when he put [[1,2,3,4],[1,2,3,4],[1,2,3,4]] the shape became (3,4); if you are thinking about this purely as rows and columns, that wouldn't be obvious. I hope that made some modicum of sense lol
@ollie6989 1 year ago
You could perform the same operation by transposing the inputs; however, keep in mind the matrix rule (A.B)' == B'.A', e.g. inputs.(weights.T) == (weights.(inputs.T)).T, i.e. the output of inputs.weights_transposed will be equal to the transposed output of weights.inputs_transposed. The issue with the values probably comes from adding the biases without first either transposing them or transposing this output matrix back, as they would be added in a completely different order.
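A sketch of the single-sample vs. batch cases discussed in this thread, with the input/weight/bias values recalled from the series (treat them as assumptions): putting the batch of inputs first and transposing the weights keeps one sample per output row, which is also the orientation the bias row vector broadcasts over.

    import numpy as np

    inputs = np.array([[1.0, 2.0, 3.0, 2.5],     # batch of 3 samples
                       [2.0, 5.0, -1.0, 2.0],
                       [-1.5, 2.7, 3.3, -0.8]])
    weights = np.array([[0.2, 0.8, -0.5, 1.0],   # one row of weights per neuron
                        [0.5, -0.91, 0.26, -0.5],
                        [-0.26, -0.27, 0.17, 0.87]])
    biases = np.array([2.0, 3.0, 0.5])

    single = np.dot(weights, inputs[0]) + biases   # (3,4)·(4,)  -> (3,)
    batch = np.dot(inputs, weights.T) + biases     # (3,4)·(4,3) -> (3,3), one row per sample

    print(np.allclose(single, batch[0]))           # True: the first output row is the single-sample result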
@ahmedyamany5065 2 years ago
Great explanation and animation, but at 14:47 [1,2,3,2.5] in Python is an array acting as a vector, i.e. a (4,1) matrix, yet when you write it on paper or in the animation you should write it in vertical form as a column, not a row, because [1 2 3 2.5] in the animation is a (1,4) matrix, not (4,1). So we can say every element in the array [1,2,3,2.5] is a row: 1 is the 1st row, 2.5 is the 4th row.
@Voyagedudimanche 4 years ago
Hello! I've been following you for more than 2 years and this is the best course for me! With those explanations of the math it is really cool. Thank you for this work :)
@aamirkhanmaarofi9705 3 years ago
Watching this playlist is awesome; it made my task very easy. I had been stuck on the implementation of the multilayer perceptron for two days. Thanks.
4 years ago
0th! Finally!
@thewild2334 4 years ago
Daniel!
@dunder9444 4 years ago
So op
@user-ns8dl3vm5z 4 years ago
U R AWESOME
@afafssaf925 4 years ago
You are wayyyyy more buff than it seems from just your face.
@sentdex 4 years ago
I'll keep that in mind.
@NikhilSandella 4 years ago
This is the best channel with the best content, with amazing animation and clear explanation. I'm in love with this man. :)
@DMBalchemy 4 years ago
Incredible as always. This one struck a few lightbulbs. Thanks again. Eagerly anticipating #5; I'll have to work through the draft to prep.
@thomasnevolianis8616 4 years ago
import neural_networks_from_scratch as nnfs
from nnfs import moments
best_moments = moments(channel='Sentdex')
print(best_moments[0])
''The SEXY part of deep learning''
@sharanbabu2001 4 years ago
Loving the effectiveness! The batch size explanation was amazing!
@sentdex 4 years ago
Glad you liked it!!
@rakshitjoshi823 1 month ago
High-quality animations. Much respect!
@garymdmd 1 year ago
I am on lesson 4 now - you are such a great instructor, I love learning this stuff.
@saisiddhanthgujjari8954 4 years ago
Amazing content sentdex, the visualizations are just top notch and aid a much clearer explanation.
@FagunRaithatha 1 year ago
This content is really good. Thanks for making this simple. I have been binge-watching your videos.
@frederick3524 4 years ago
I have been looking forward to this all week!
@asu4908 2 years ago
Doing god's work; ordered the book a while ago and finally have time to actually dive into this now. Thank you so much, bro.
@jedisenpei855 4 years ago
Apart from explaining neural networks, you just explained the matrix dot product in the most intuitive way I have ever seen. I know how the dot product works by now, but I also remember how much work I had to put in to understand the concept given the lectures and texts I had at university. I had to read through some difficult math equations and really think about what the book was trying to tell me, and I also had to go through a lot of exercises to really get a grasp of it and remember it; and then you just explained it in 10 minutes and it makes perfect sense, although I had almost forgotten what it was all about. So easy. I wish my teacher had an animation like the one you show at 9:10. Then I wouldn't have had to struggle through the math classes as much as I did in my education as an electrical engineer.
@realbingus 4 years ago
At the point where I had a question, I had not fully watched the video yet, so I commented my question. Literally five seconds later in the video you answered my question. I love the series, thanks for doing this!
@Gazarodd 4 years ago
I think this tutorial series will explode. Atm it's really clear; you're fantastic.
@hemanthkotagiri8865 4 years ago
I can't be more thankful for anyone than you and Daniel. Thank you so much!
@sentdex 4 years ago
Happy to do it!
@keshan-spec 4 years ago
I love this series and I always look forward to the next one. Thank you ❤
@ryangao3564 4 years ago
Hey sentdex, such addictive content in your videos. Couldn't wait for the next release any longer, so I just pre-ordered the e-book.
@sentdex 4 years ago
Woo! Hope you enjoy!
@Mayank25 3 years ago
This is the best tutorial I have ever watched... Kudos 👍🙌🙌🙌
@chuckf5540 3 years ago
Great explanation and very clear. I look forward to all the videos. What a learning process!!
@horseman3253 3 years ago
Wooow, this is how all subjects in school should be explained: amazing visualization, very clear!
@dinarakhaydarova4898 1 year ago
I thought I understood all of these concepts until I watched your tutorials. It's amazing!
@johnyeap7133 1 year ago
Made the batch learning benefits really clear, thank you.
@fanasisangweni8539 4 years ago
I'm glad I found your channel, man. I swear to god your videos are awesome; I'm only starting to understand ANNs after watching your videos.
@violinplayer7201 4 years ago
Best Python neural networks video, for sure.
@super7ace 1 year ago
God-level series on neural networks. Good job, and always proud of you, buddy!!
@andrescontrol2866 3 years ago
Very useful video and very well explained throughout the series. Thanks a lot, Harry!
@brendensong8000 3 years ago
Thank you for the clear explanation! I was completely lost after several videos! You made it so clear!
@sciWithSaj 3 years ago
Thanks a lot. This will be my first object-oriented programming. It was kind of daunting for me, but you made it so simple.
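A minimal sketch of the kind of dense-layer object this part builds, written from memory; the class name, the 0.10 weight scaling, and the zero-initialised biases are assumptions and may differ slightly from the video/book.

    import numpy as np

    class Layer_Dense:
        def __init__(self, n_inputs, n_neurons):
            # Small Gaussian starting weights, one column per neuron; biases start at zero.
            self.weights = 0.10 * np.random.randn(n_inputs, n_neurons)
            self.biases = np.zeros((1, n_neurons))

        def forward(self, inputs):
            # A batch of samples in, a batch of per-neuron outputs out.
            self.output = np.dot(inputs, self.weights) + self.biases

    X = np.array([[1.0, 2.0, 3.0, 2.5],
                  [2.0, 5.0, -1.0, 2.0],
                  [-1.5, 2.7, 3.3, -0.8]])

    layer1 = Layer_Dense(4, 5)
    layer2 = Layer_Dense(5, 2)
    layer1.forward(X)
    layer2.forward(layer1.output)
    print(layer2.output.shape)   # (3, 2): 3 samples, 2 neurons in the second layer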
@josephyu2110 6 months ago
Wow, your videos are just amazing; the clarity in explaining complex things is just incredible.
@minhtuecung5418 3 years ago
Now that's what I call real teaching: triggering curiosity! Thank you so much, sentdex! Math rules!
@accounttwo5114 4 years ago
Fantastic, I'm really excited about the following videos!
@anilsarode6164 3 years ago
I think the single array of biases at 16:16 getting added to the individual rows of the dot-product matrix is due to NumPy broadcasting. Thanks a lot for this video series.
@bartosz13 11 months ago
This is nuts. Crazy good quality.
@benjaminsteakley 1 year ago
It took me five years to find something like your videos, in 2022. I dropped out of college from stress, and I can finally sit down and try to understand this math. I hope the video which explains linear regression is as good as these four so far.
@danielbardsen4101 4 years ago
Hi Sentdex, thanks for making my engineering/programming career so much more interesting! You really are the best.
@HJ-jr7zd 4 years ago
Great video, Sentdex. Looking forward to reading the book when it's out.
@unionid3867 2 years ago
Honestly, I had almost given up looking for a tutorial on building a neural network for beginners; luckily I found your video. Thank you very much.
@tuhinmukherjee8141 4 years ago
This series is totally amazing! Thanks, man.
@raccoon_05 1 year ago
Thx so much for this series. You're really helping me understand the basic concepts behind this 👍👍👍