Neural Network From Scratch In Python 

Dataquest
35K views
We'll learn the theory of neural networks, then use Python and NumPy to implement a complete multi-layer neural network. We'll cover the forward pass, loss functions, the backward pass (backpropagation and gradient descent), and the training loop. At the end, we'll use our neural network to predict the weather.
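For reference, the whole pipeline described above (forward pass, MSE loss, backpropagation and gradient descent, training loop) can be sketched in a few dozen lines of NumPy. This is a simplified illustration, not the video's exact code; the toy data, layer sizes, and learning rate here are made up:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: y = 2*x1 - x2 + noise
X = rng.normal(size=(100, 2))
y = (2 * X[:, 0] - X[:, 1] + rng.normal(scale=0.1, size=100)).reshape(-1, 1)

# One hidden layer with ReLU; sizes are illustrative
w1 = rng.normal(scale=0.1, size=(2, 8))
b1 = np.zeros((1, 8))
w2 = rng.normal(scale=0.1, size=(8, 1))
b2 = np.zeros((1, 1))
lr = 0.05

def forward(X):
    z1 = X @ w1 + b1        # hidden linear layer
    a1 = np.maximum(z1, 0)  # ReLU activation
    out = a1 @ w2 + b2      # output layer
    return z1, a1, out

losses = []
for epoch in range(200):
    # Forward pass and loss
    z1, a1, pred = forward(X)
    losses.append(np.mean((y - pred) ** 2))  # MSE

    # Backward pass
    grad_out = 2 * (pred - y) / len(X)       # dL/dpred
    grad_w2 = a1.T @ grad_out
    grad_b2 = grad_out.sum(axis=0, keepdims=True)
    grad_a1 = grad_out @ w2.T
    grad_z1 = grad_a1 * (z1 > 0)             # ReLU derivative
    grad_w1 = X.T @ grad_z1
    grad_b1 = grad_z1.sum(axis=0, keepdims=True)

    # Gradient descent update
    w1 -= lr * grad_w1; b1 -= lr * grad_b1
    w2 -= lr * grad_w2; b2 -= lr * grad_b2
```

The training loss should fall steadily over the epochs as both layers' weights are adjusted.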
You can find the text version of this lesson here - github.com/VikParuchuri/zero_...
And the complete lesson list for the zero to gpt series here - github.com/VikParuchuri/zero_...
Chapters
00:00:00 Neural network introduction
00:10:05 Activation functions
00:12:10 Multiple layers
00:15:18 Multiple hidden units
00:23:52 The forward pass
00:32:46 The backward pass
00:48:08 Layer 1 gradients
00:56:24 Network training algorithm
01:00:13 Full network implementation
01:06:44 Training loop
This video is part of our new course, Zero to GPT - a guide to building your own GPT model from scratch. By taking this course, you'll learn deep learning skills from the ground up. Even if you're a complete beginner, the prerequisite courses we offer at Dataquest will get you started.
If you're dreaming of building deep learning models, this course is for you.
Best of all, you can access the course for free while it's still in beta!
Sign up today!
bit.ly/4016NfK

Published: 10 Jul 2024
Comments: 33
@vikasparuchuri · A year ago
Hi everyone! The code and explanations behind this video are here - github.com/VikParuchuri/zero_to_gpt/blob/master/explanations/dense.ipynb . You can also find all the lessons in this series here - github.com/VikParuchuri/zero_to_gpt .
@SALESENGLISH2020 · A year ago
You are a fantastic teacher! Subscribing. Love your pace and explanation of what and why you are doing something.
@paulohss2 · A year ago
Great content as usual!
@chk2899 · A year ago
Beautiful video, Vik! Starting my term project this week, and NNs are a main method I'll be using! Thank you!
@Dataquestio · A year ago
Thanks, Cade! Good luck :)
@broncos720z · A year ago
These videos have so much value! Thank you!!
@Dataquestio · A year ago
Glad you like them!
@cule219 · 4 months ago
My dude, I love you! Peace! ❤
@agushendra · 7 months ago
Thank you for your thorough explanation. I have a question: how do you decide which matrix to transpose during backpropagation?
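On the transpose question, a shape-based rule of thumb (general backprop bookkeeping, not the video's specific code): each gradient must have the same shape as the array it is the gradient of, and the transposes are whatever makes the matrix multiplications produce those shapes.

```python
import numpy as np

batch, n_in, n_out = 4, 3, 2
X = np.ones((batch, n_in))
W = np.ones((n_in, n_out))
Z = X @ W                  # forward: (batch, n_out)

grad_Z = np.ones_like(Z)   # upstream gradient, shape (batch, n_out)

# dL/dW must match W's shape (n_in, n_out):
# (n_in, batch) @ (batch, n_out) works -> transpose X
grad_W = X.T @ grad_Z
assert grad_W.shape == W.shape

# dL/dX must match X's shape (batch, n_in):
# (batch, n_out) @ (n_out, n_in) works -> transpose W
grad_X = grad_Z @ W.T
assert grad_X.shape == X.shape
```

Checking gradient shapes against parameter shapes like this catches most transpose mistakes immediately.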
@theparten · 8 months ago
Could you share a link to the dataset you used, please?
@Irak1995 · 6 months ago
Thank you for your video. I believe there is an error at 31:25 where you define the MSE function. Shouldn't you be taking the mean of the squared error?
@siljageorge1178 · A year ago
Does this code work if there is no hidden layer? Only an input and output layer?
@syedmuqtasidalibscs0434 · A year ago
Please share a link to the dataset you are using in this code; I'd like to download it. Thank you.
@utti_12c · 5 months ago
Wow, thanks ❤
@guglielmodesantis423 · 5 months ago
How would you calculate the bias?
@user-lw5hb2fi7c · A year ago
I'm also getting ModuleNotFoundError: No module named 'csv_data' at the very beginning.
@user-lv7vu9os7i · 3 months ago
Does anyone know why, for the mse function, he does (actual - predicted) ** 2, but for mse_grad he writes predicted - actual? Wouldn't it matter whether you use (predicted - actual) or (actual - predicted) in mse_grad, since this changes how you update your parameters?
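The sign does matter. A quick finite-difference check (independent of the video's code) confirms that the gradient of mean((actual - predicted) ** 2) with respect to the predictions is 2 * (predicted - actual) / n, so predicted - actual has the correct sign for gradient descent:

```python
import numpy as np

actual = np.array([1.0, 2.0, 3.0])
predicted = np.array([1.5, 1.0, 2.0])

def mse(actual, predicted):
    return np.mean((actual - predicted) ** 2)

# Analytic gradient w.r.t. the predictions
analytic = 2 * (predicted - actual) / len(actual)

# Numerical gradient by central finite differences
eps = 1e-6
numeric = np.zeros_like(predicted)
for i in range(len(predicted)):
    p_hi = predicted.copy(); p_hi[i] += eps
    p_lo = predicted.copy(); p_lo[i] -= eps
    numeric[i] = (mse(actual, p_hi) - mse(actual, p_lo)) / (2 * eps)

assert np.allclose(analytic, numeric, atol=1e-4)
```

Using actual - predicted instead would flip the sign and make the update climb the loss rather than descend it. (Codebases often fold the factor of 2 or the 1/n into the learning rate; only the sign is non-negotiable.)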
@nickst2797 · A month ago
Thanks! One question, is this video complete?
@saisureshmacharlavasu3116 · 6 months ago
1:06:17 We should not update the weights until we have found the grads for all layers. There's a mistake in your code; please correct it.
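On the ordering point: the invariant to preserve is that the gradient propagated to earlier layers uses the weights from the forward pass, not already-updated ones. One way to guarantee that is to compute every layer's gradient first and only then apply the updates, as in this sketch (plain linear layers, illustrative names, not the video's code):

```python
import numpy as np

rng = np.random.default_rng(0)
layers = [rng.normal(size=(3, 4)), rng.normal(size=(4, 2))]
X = rng.normal(size=(5, 3))

def forward(X, layers):
    # Keep every layer's input around for the backward pass
    activations = [X]
    for W in layers:
        activations.append(activations[-1] @ W)
    return activations

def backward(activations, layers, grad_out):
    # First pass: compute all gradients using the *current* weights
    grads = [None] * len(layers)
    grad = grad_out
    for i in reversed(range(len(layers))):
        grads[i] = activations[i].T @ grad
        grad = grad @ layers[i].T  # propagate with the pre-update weights
    return grads

activations = forward(X, layers)
grad_out = np.ones_like(activations[-1])
grads = backward(activations, layers, grad_out)

# Second pass: only now apply the updates
lr = 0.01
for i, g in enumerate(grads):
    layers[i] -= lr * g
```

Updating layer i in place is also safe as long as the gradient has already been propagated past it; the bug arises only when the propagated gradient uses already-updated weights.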
@FirstLast-tx7cw · 6 months ago
@41:49 Could you share the link to the document, please?
@luckyman912 · A year ago
Hi Vik! Please help with the football video. Can you tell me how to start predicting future matches? I prepared the schedule for next week and added it to the "matches.csv" file. In the data, I have moving averages for the following columns: "xg", "xga", "gls", "sh", "sot", "g/sh", "g/sot", "dist", "fk", "pk", "pkatt", "npxg", "npxg/sh". How can I run a prediction now? Thanks in advance for your reply. P.S. I am writing through Google Translate; I hope you understand what I mean.
@Dataquestio · A year ago
I think you want help on this video, right? - ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-0irmDBWLrco.html You basically take all of the training data up to the last day (today), then generate a prediction. The prediction will be for the next match. You'll need to do it without backtesting, and without dropping any rows from the end of the training data. I talk a little bit about the steps at the very end of this video - ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-egTylm6C2is.html
@luckyman912 · A year ago
Vik, thanks for the reply. Yes, I watched the NBA video and noticed that at the very end there is information about what I am trying to figure out 😊 But I watch the video through a translator, and it may be translating the speech incorrectly, so this process is not clear to me yet. Would it be possible for you to record a short video demonstrating how to do this? And what would it cost? Thanks for the info 👍
@abhijeetjha6357 · 3 months ago
What happened at 5:09? mse has only two arguments - how is it taking weight and bias as input?
@Leonhart_93 · A month ago
If you went as far as doing it from scratch, you might as well have done it in anything other than Python. All Python has in this field is libraries. It's not like Python has any advantage when it comes to linear algebra; if anything, it will be slower than most alternatives.
@trustlesss · A year ago
Where did the bias come from? 11
@emeebritto · 10 months ago
It's like a weight; it's adjusted during training. The 11 is just an example (for the video).
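To illustrate: the bias is a trainable parameter like any weight. It starts at some value (the 11 in the video is just an example) and is nudged by its own gradient each step. A minimal sketch (made-up data) fitting y = 3x + 11 with a single weight and bias:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(200, 1))
y = 3 * x + 11          # true weight 3, true bias 11

w = np.zeros((1, 1))
b = np.zeros((1, 1))    # bias starts at 0 here; any value works
lr = 0.1

for _ in range(500):
    pred = x @ w + b
    grad = 2 * (pred - y) / len(x)              # dL/dpred for MSE
    w -= lr * (x.T @ grad)                      # weight update
    b -= lr * grad.sum(axis=0, keepdims=True)   # bias gets its own gradient
```

After training, w converges to about 3 and b to about 11, even though b started at 0.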
@steamedtech · 12 days ago
i love math
@thechoosen4240 · 9 months ago
Good job bro, JESUS IS COMING BACK VERY SOON; WATCH AND PREPARE
@jackaubrey8614 · A year ago
Many thanks for this very comprehensive course, but I'm having a problem. When I run the program I get the following output (truncated example):
Epoch: 0 Train MSE: nan Valid MSE: nan
Epoch: 1 Train MSE: nan Valid MSE: nan
Epoch: 2 Train MSE: nan Valid MSE: nan
Printing 'loss' and 'epoch_loss' separately, the following is output (after approx. 4500 lines of numeric output):
epoch_loss: 4789.425964748963
loss: [[-69.19245634] [-62.18411289] [nan] [nan] [nan] [-63.21175768] [-63.18508556] [-62.17785529]]
epoch_loss: nan
loss: [[nan] [nan]
Running both my own code from following this video and your code from GitHub gives the same results. Any ideas?
@MrM-br1ke · A year ago
Hi! Have you solved it?
@Dataquestio · A year ago
There can be a lot of potential reasons for nan loss, so it's hard to know for sure. Basically, some value (weight, prediction, gradient) is too large for the numpy data type. Things I would try:
- Lower the learning rate.
- It's possible your system defaults to a float format with a lower range - check the dtype of the numpy arrays, and switch to float64 if the dtype is something else.
- Are you initializing the weights the same way I am? You could try initializing them to smaller values than I did to see if anything changes.
- Make sure you're using mse_grad as the loss gradient, not mse.
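That checklist can be turned into a quick diagnostic script. This is a sketch with made-up thresholds; the function name and cutoffs are illustrative, not from the video:

```python
import numpy as np

def nan_risk_report(weights, X, lr):
    """Rough heuristics for nan/exploding losses; thresholds are illustrative."""
    issues = []
    # float16/float32 overflow far sooner than float64
    if weights.dtype != np.float64:
        issues.append("cast weights to float64")
    if X.dtype != np.float64:
        issues.append("cast inputs to float64")
    # Large initial weights make activations and gradients explode
    if np.max(np.abs(weights)) > 1.0:
        issues.append("initialize weights with smaller values")
    # A high learning rate is the most common cause of divergence
    if lr > 0.01:
        issues.append("try a lower learning rate")
    return issues

# Example: float32 weights and a large learning rate trip two checks
weights = np.random.default_rng(0).normal(scale=0.5, size=(4, 8)).astype(np.float32)
X = np.ones((10, 4))
report = nan_risk_report(weights, X, lr=0.1)
```

Running the checks before the training loop makes "which value blew up first" much easier to answer than staring at nan losses after the fact.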