
Introduction to Coding Neural Networks with PyTorch and Lightning 

StatQuest with Josh Starmer
1.2M subscribers · 60K views

Although we've seen how to code a simple neural network with PyTorch, we can make our lives a lot easier if we add Lightning to the mix. It makes the code easier to write, makes it portable to different computing environments, and can even find the learning rate for us! TRIPLE BAM!!!!
NOTE: You can download the code here: lightning.ai/lightning-ai/stu...
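The video's toy model boils down to a tiny fixed network with one trainable parameter (the final bias). As a hedged sketch in plain PyTorch, here is roughly the kind of manual training loop that Lightning later automates; the weight and bias values below are made-up placeholders for illustration, not necessarily the ones used in the video:

```python
import torch
import torch.nn.functional as F
from torch import nn

class TinyReLUNet(nn.Module):
    """Two ReLU hidden nodes with frozen weights and one trainable final bias.
    All numeric values are illustrative placeholders, not the video's values."""
    def __init__(self):
        super().__init__()
        self.w00 = nn.Parameter(torch.tensor(1.7), requires_grad=False)
        self.b00 = nn.Parameter(torch.tensor(-0.85), requires_grad=False)
        self.w01 = nn.Parameter(torch.tensor(-40.8), requires_grad=False)
        self.w10 = nn.Parameter(torch.tensor(12.6), requires_grad=False)
        self.b10 = nn.Parameter(torch.tensor(0.0), requires_grad=False)
        self.w11 = nn.Parameter(torch.tensor(2.7), requires_grad=False)
        self.final_bias = nn.Parameter(torch.tensor(0.0), requires_grad=True)

    def forward(self, x):
        top = F.relu(x * self.w00 + self.b00) * self.w01
        bottom = F.relu(x * self.w10 + self.b10) * self.w11
        return top + bottom + self.final_bias

model = TinyReLUNet()
inputs = torch.tensor([0.0, 0.5, 1.0])
labels = torch.tensor([0.0, 1.0, 0.0])
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

losses = []
for epoch in range(100):
    optimizer.zero_grad()
    loss = ((model(inputs) - labels) ** 2).sum()  # sum of squared residuals
    loss.backward()       # gradient flows only to final_bias
    optimizer.step()
    losses.append(loss.item())
```

With Lightning, this hand-written loop (zero_grad / backward / step) disappears into the Trainer; the module only has to define the forward pass, the training step, and the optimizer.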
Spanish
This video has been dubbed into Spanish with an artificial voice via aloud.area120.google.com to increase accessibility. You can change the audio track language in the Settings menu.
Portuguese
This video has been dubbed into Portuguese with an artificial voice via aloud.area120.google.com to improve accessibility. You can change the audio language in the Settings menu.
For a complete index of all the StatQuest videos, check out...
app.learney.me/maps/StatQuest
...or...
statquest.org/video-index/
If you'd like to support StatQuest, please consider...
Patreon: / statquest
...or...
RU-vid Membership: / @statquest
...a cool StatQuest t-shirt or sweatshirt:
shop.spreadshirt.com/statques...
...buying one or two of my songs (or go large and get a whole album!)
joshuastarmer.bandcamp.com/
...or just donating to StatQuest!
www.paypal.me/statquest
Lastly, if you want to keep up with me as I research and create new StatQuests, follow me on Twitter:
/ joshuastarmer
0:00 Awesome song and introduction
1:04 Review of basic PyTorch
2:34 Coding a pretrained neural network with PyTorch + Lightning
7:52 Training a neural network with PyTorch + Lightning
14:05 Using Lightning to find a good Learning Rate
17:25 Taking advantage of GPU acceleration with Lightning
#StatQuest #DubbedWithAloud #PyTorch #Lightning

Published: Jul 26, 2024

Comments: 196
@statquest · 1 year ago
NOTE: Lightning 2.0 changed the way the learning rate tuner is accessed. This has been updated in the Jupyter notebook that you can download here: lightning.ai/lightning-ai/studios/statquest-introduction-to-neural-networks-with-pytorch-lightning To learn more about Lightning: lightning.ai/ Support StatQuest by buying my book, The StatQuest Illustrated Guide to Machine Learning, or a Study Guide or Merch!!! statquest.org/statquest-store/
@reasonerenlightened2456 · 1 year ago
ANNs are about 60 years old, but only now that they've become profitable do we see stuff like this video. No wonder we still do not have general AI: there is just not enough profit in it yet.
@andrea-mj9ce · 1 year ago
I don't see any code in the second link.
@statquest · 1 year ago
@andrea-mj9ce Fill out the form and you'll get it in an email.
@xinfangmin3665 · 1 year ago
You are such an amazing guy! Thanks a lot! Love your videos!
@statquest · 1 year ago
@salilgupta9427 Thanks for the heads up! It should be working now. :)
@airpeguiV2 · 1 year ago
Hi Josh, I am trying to get my PhD in EEE and, as such, my background is in electrical and electronic engineering, not machine learning or data science. Somehow, in my PhD I've ended up doing more ML and DS than EEE, and I can only infinitely thank you for these resources that you post on the internet for free, for the effort you put into them, and for your dedication. You are a marvel, and you have helped me understand and apply concepts and models in my research, which hopefully will one day help society and the environment through a more efficient power grid, capable of accommodating more renewable energy sources and electrical machines. Thank you ∞!
@statquest · 1 year ago
Hooray!!! Thank you so much. I'm so glad to hear that my videos are helping you out. :)
@anthonyashwin3457 · 1 year ago
Triple Bam 💥
@rickymort135 · 4 months ago
Embrace, Extend, Extinguish? That's terrible
@rizkykiky7721 · 1 year ago
This channel is evolving from statistics and math to coding! I didn't expect that, and I absolutely love it!
@statquest · 1 year ago
BAM! :)
@charlesrios8542 · 1 year ago
Well, he's already covered all the stats over the years, lol. This is what's next.
@3wcdev878 · 4 months ago
How are you not mainstream? This is the best DX I've seen in ML so far... so focused on the important parts that need to be coded; it's like the FastAPI of deep learning.
@statquest · 4 months ago
Thank you!
@CHERKE_JEMA5575 · 1 year ago
On my way to finishing your book... I would definitely recommend it to everyone! Love from Ethiopia, Africa
@statquest · 1 year ago
Awesome! Thank you!
@massimoc7494 · 16 days ago
I thought I had finished watching your videos after I passed my statistics exam, and here we go again!
@statquest · 16 days ago
bam! :)
@macknightxu2199 · 1 year ago
This NN series is tremendously amazing: easy to understand while teaching a lot of concepts and processes. The best thing I find is the rhythm it keeps by using bam, double bam, triple bam, and tiny bam, because normally learners lose their minds when learning for a long time, puzzling over where they are, what they know, and where to go. Good job! BR
@statquest · 1 year ago
Thank you! :)
@protovici1476 · 1 year ago
The comic approach and extremely good content make this the best Lightning video I've ever seen.
@statquest · 1 year ago
Thanks!
@ClaseS-1010 · 1 year ago
@statquest BAM!
@Luxcium · 1 year ago
Wow 😮 I didn't know I had to watch *The StatQuest Introduction To PyTorch* before I can watch *Introduction to coding neural networks with PyTorch and Lightning* 🌩️ (it's something related to the cloud, I understand). I am genuinely so happy to learn about this stuff with you, Josh. I will go watch the other videos first, and then I will backpropagate to this video...
@statquest · 1 year ago
Thanks!
@exxzxxe · 1 year ago
Josh, you are a math genius in addition to being an outstanding singer!
@statquest · 1 year ago
Thank you so much! :)
@jonahturner2969 · 1 year ago
This video will really blow up in just a few months, I think. The newest scene text recognition model I'm trying to implement uses Lightning extensively; more and more people will pick it up soon. Thank you for making such a clear explanation.
@statquest · 1 year ago
Awesome!!! Thank you! :)
@brockjohnson312 · 1 year ago
Yes, Jonah.
@ruchiraina2215 · 1 year ago
Finally I understood the basic code structure of neural networks using PyTorch. Thanks for that. I have a request: would you please create the same model using TensorFlow? That would be very helpful for comparing these frameworks.
@statquest · 1 year ago
I'll keep that in mind.
@zainabkhan2475 · 1 year ago
Thanks for all your videos; they are all precious.
@statquest · 1 year ago
Thank you!
@AslEroglu · 1 year ago
Love your content; it helps me a lot! Very clear explanations, thank you. If anyone is struggling to import the lightning package, I wrote "import pytorch_lightning" instead of "import lightning" and the problem was solved.
@statquest · 1 year ago
BAM! Thank you! :)
@AslEroglu · 1 year ago
@statquest Warning from the Lightning installation page: "pip install pytorch-lightning has been deprecated and will stop being updated June 2023. Use pip install lightning instead." When you use the latter, you can "import lightning".
@nadavnesher8641 · 1 year ago
Totally awesome!! Great explanations! I love your channel 🚀 Thanks so much for your videos 🦾
@statquest · 1 year ago
Thank you!
@fizipcfx · 1 year ago
Thank you for this video; I would love to see more videos from the PyTorch ecosystem.
@statquest · 1 year ago
More to come!
@ArchithaKishoreSings · 1 year ago
Love the PyTorch content ❤
@statquest · 1 year ago
Thank you! :)
@shaktishivalingam3880 · 1 year ago
You are amazing. Thank you for helping us out with your videos; they have helped me a lot.
@statquest · 1 year ago
Thanks!
@younesselhamzaoui6783 · 1 year ago
Excellent. Thank you so much!
@statquest · 1 year ago
Thanks!
@TechAbabeel · 1 year ago
This channel is awesome 😎
@statquest · 1 year ago
Thank you!
@bhagatpandey369 · 1 year ago
Thank you so much...!!!
@statquest · 1 year ago
You are welcome!
@romanemul1 · 1 year ago
Thanks for this video.
@statquest · 1 year ago
You bet!
@bobotran7792 · 1 year ago
Our savior returns
@statquest · 1 year ago
bam! :)
@divelix2666 · 1 year ago
Great video, as always. Thank you, Josh, for your hard work! Btw, while following the video instructions I found some things that should be clarified:
- 3:11 - instead of `lightning` there should be `pytorch_lightning` (I installed it with `conda install pytorch_lightning -c conda-forge`)
- 15:10 - after we change the lr from 0.1 to 0.00214, we need many more than 34 epochs to get the desired -16 (more than 1000 epochs, so I can't understand how this lr can be considered better than the initial 0.1)
@statquest · 1 year ago
Did you use my code or your own? In the free Jupyter notebook, I give instructions on how to install lightning (not pytorch_lightning, which is legacy and could be deprecated soon): "pip install lightning". And I just re-ran my notebook and, after changing the learning rate, it converged in 19 epochs.
@divelix2666 · 1 year ago
@statquest My own code (I tried to follow the video step by step myself). You are right; after I reinstalled with pip, it works as `import lightning`. Btw, the learning rate point is still valid.
@computerconcepts3352 · 1 year ago
Ooo0Oooo new video! Noice 👍
@statquest · 1 year ago
bam! :)
@Celbe · 6 months ago
Hi Josh, first of all I would like to express my gratitude for the excellent material you have made available. I have a question: what are the criteria for some classes to have an audio track in another language? I love it when they exist in Portuguese. 😊😊😊 Greetings from Brazil!!! 👋
@statquest · 6 months ago
I'm trying to create Portuguese tracks for all of my neural network videos. It takes a lot of time, but I hope to finish sometime soon.
@ZahidHasan-cc8tf · 1 year ago
Triple Bam!!! Hooray!!
@statquest · 1 year ago
:)
@0807tanguy · 4 months ago
Great video Josh, you helped me learn PyTorch and Lightning a LOT :) A note at 16:25: "And then it [the trainer] calls training step again and repeats for each epoch that we requested" --> didn't you mean each batch for every epoch?
@statquest · 4 months ago
Sure
@clearwavepro100 · 1 year ago
Nice!
@statquest · 1 year ago
Thanks!
@AndyMyers · 1 year ago
I'm hoping the next in the series shows us an example of optimising all the things and not just the final bias.
@statquest · 1 year ago
Yep! That's exactly what we do in the next one.
@karag4487 · 1 year ago
More of these, please
@statquest · 1 year ago
I'm working on them :)
@James-hb8qu · 1 year ago
Maybe it's just me, but I found the model wouldn't train fast enough to work, so I compared my "type along" code with the code in the repository. The difference was at 10:24 and was fixed when I added the 'times 100' to the input and label tensors, as in inputs = torch.tensor([0., 0.5, 1.] * 100) and labels = torch.tensor([0., 1., 0.] * 100)
@statquest · 1 year ago
Nice! The code in the repository should be updated to have the * 100 multiplier. When did you download it?
@James-hb8qu · 1 year ago
@statquest Ah, my comment was ambiguous. The repository code works. I like to code along as I watch your videos, and the video didn't have the *100, so I had the initial problem.
@statquest · 1 year ago
@James-hb8qu Hooray! That makes me feel a little better.
@giligili9923 · 1 year ago
I got the same problem. Could someone explain the *100? Also, I can see that we need more epochs to arrive at the optimum value, but each epoch runs much slower than before. Is this worth it in another context, or what was the problem?
@statquest · 1 year ago
@giligili9923 Are you using my notebook or your own typed-in code? The *100 tricks the NN into thinking it has more data than it really has and, as a result, it runs smoother.
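The *100 trick discussed above can be sketched in a few lines: Python list repetition turns the 3 training points into 300, so the DataLoader yields many batches per epoch instead of just one. This is a minimal sketch assuming the default DataLoader batch size of 1:

```python
import torch
from torch.utils.data import TensorDataset, DataLoader

# List repetition replicates the 3 points 100 times each epoch sees,
# tricking the training loop (and the learning rate finder) into
# thinking there is more data than there really is.
inputs = torch.tensor([0.0, 0.5, 1.0] * 100)
labels = torch.tensor([0.0, 1.0, 0.0] * 100)

dataset = TensorDataset(inputs, labels)
dataloader = DataLoader(dataset)  # default batch_size=1 -> 300 batches per epoch
```

With 34 epochs, the trainer then takes 34 × 300 gradient steps rather than 34 × 3, which is why a small learning rate like 0.00214 can still converge.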
@Kevin-to7uy · 1 year ago
Will you be doing an introduction video on coding neural networks in R? Thanks for your videos!
@statquest · 1 year ago
I don't have immediate plans for that, but that could change.
@ThinAirElon · 1 year ago
Infinite BAMS!
@statquest · 1 year ago
Yes!
@barberaTP · 1 year ago
As always, great video! So, does this also work fine with AMD GPUs and solve problems like TensorFlow only working (well) with Nvidia graphics cards?
@statquest · 1 year ago
I'm pretty sure this will work well for any GPUs that PyTorch can work with.
@barberaTP · 1 year ago
@statquest OK, I will check the documentation. Thanks 👍
@Bulgolgii · 1 year ago
Hi Josh, will you be doing a walkthrough of how TabNet works in the future? Thank you!
@statquest · 1 year ago
I'll keep it in mind.
@ramiwehbi-ni9kw · 22 days ago
Hi Josh, thank you for these learning tutorials. A question: isn't there another way to build a more complex model, rather than making all the links between the neuron nodes manually?
@statquest · 22 days ago
There are lots and lots of easier ways to create neural networks. This was just an introduction. To learn other methods, check out the "coding neural networks" links on this page: statquest.org/video-index/
@ramiwehbi-ni9kw · 20 days ago
@statquest thank you
@pappoos2 · 1 month ago
Hi Josh, it took me 3000 epochs to get the final bias value to -16 when using Lightning. Anything to take note of in this case?
@statquest · 1 month ago
Were you using my code or did you type it in yourself?
@mahammadodj · 1 year ago
Thank you very much, but how do we initialize those weights and biases?
@statquest · 1 year ago
We'll talk about that in future videos.
@kaanzt · 11 months ago
In the "lr_find_results" part, I wrote the exact same code that you wrote, but when I type "trainer.", it does not recognize "tuner", and when I run the code it says the Trainer object has no attribute "tuner". I am sure that I have everything updated. I also checked the documentation but couldn't find any solution. Can you help solve this issue?
@statquest · 11 months ago
Please download the Jupyter notebook that is paired with this video. Lightning has updated how this works, so I updated the notebook: lightning.ai/pages/education/introduction-to-coding-neural-networks-with-pytorch-lightning/?
@flippityflop6243 · 29 days ago
I'm confused at 14:35. Why does lr_find() create 100 different learning rates if you set the epochs to 34? Is it just convention? If so, how can it test them all within the 34 epochs?
@statquest · 29 days ago
The number of learning rates tested (100) is independent of the number of epochs. Each learning rate is tested for a few steps to see if things are improving, how much, etc.
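One way to picture "100 candidate rates between min_lr and max_lr" is that the candidates are spaced evenly on a log scale. This pure-Python sketch illustrates the idea of the sweep; it is an illustration of geometric spacing, not Lightning's actual implementation:

```python
import math

def candidate_learning_rates(min_lr=0.001, max_lr=1.0, num=100):
    """Generate `num` candidate learning rates spaced geometrically
    (evenly on a log scale) from min_lr to max_lr, mirroring in spirit
    the exponential sweep a learning-rate finder performs."""
    ratio = (max_lr / min_lr) ** (1.0 / (num - 1))
    return [min_lr * ratio ** i for i in range(num)]

rates = candidate_learning_rates()
```

Each candidate is then tried for a handful of training steps, and the rate whose loss curve drops fastest (before blowing up) is suggested.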
@dimabear · 11 months ago
When you calculate the optimal learning rate, what is it that you're actually maximizing/minimizing? I get that when you're minimizing loss, you're finding the optimal weights/biases that minimize the loss. But when you try to find the optimal learning rate, what are you maximizing/minimizing, and what are you calculating with respect to? For example, the first parameter to lr_find is model, which is BasicLightningTrain(). And BasicLightningTrain() has fixed parameters, as well as the final bias, which was changed from -16 to 0. So does this mean that lr_find() used the fixed parameters? If so, I'm assuming that if we had set final bias to a random value (instead of 0), we'd arrive at a different optimal learning rate? Thanks!
@statquest · 11 months ago
For each candidate learning rate, we do a few iterations of backpropagation to see which one reduces the loss in a better way.
@shamshersingh9680 · 3 months ago
Hi Josh, can you please, please, please make a video on autoencoders and variational autoencoders, especially from the anomaly detection perspective? I have searched YouTube and other channels enough but could not find an explanation on par with yours. I am pretty sure you must have a really busy schedule, but if you can find time and make a video, I will be honestly obliged to you.
@statquest · 3 months ago
I'll keep that in mind, but I can't promise anything in the near future.
@SaschaRobitzki · 6 months ago
What's the new way of doing seed_everything(seed=42)? The old way throws the error ImportError: cannot import name 'seed_everything' from 'pytorch_lightning.utilities.seed'.
@SaschaRobitzki · 6 months ago
Maybe it's not needed anymore; the Lightning documentation for the latest version (2.1.3) recommends using just torch.manual_seed(42).
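The suggestion above is easy to verify: reseeding PyTorch's global generator replays the same random draws. This is a minimal sketch using plain torch.manual_seed, which is standard PyTorch and independent of Lightning:

```python
import torch

# Seed PyTorch's global random number generator...
torch.manual_seed(42)
a = torch.randn(3)

# ...then reseed with the same value: the random stream replays exactly.
torch.manual_seed(42)
b = torch.randn(3)
```

This is enough for reproducible weight initialization and SGD shuffling in a single-process setup; seed_everything additionally covered Python's and NumPy's generators.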
@zhancao7909 · 9 months ago
I tried to change another parameter to require training:
self.w00 = nn.Parameter(torch.tensor(1), requires_grad=True)
The result was not correct; even final_bias is not close to -16 now after training. What did I do wrong?
@statquest · 9 months ago
In this example, you can only optimize the final bias.
@adh921ify · 6 months ago
How do I optimize multiple parameters? It does not seem to work if I just set another parameter to "requires_grad=True". Is there something else I am missing???
@statquest · 6 months ago
This model is so simple that it's actually very difficult to train. So, once we get to more complicated models, it will be easier to train more parameters.
@dylanlebrun-laurent668 · 1 year ago
Hi Josh, I'm currently following the video, and unfortunately the code at "14:05 Using Lightning to find a good Learning Rate" is no longer good for the job; Lightning says a new version of it has been released... and that makes this whole part of the lesson really hard to follow. Can you please help with this one? Thanks a lot for what you're doing with your channel; it's awesome, BTW, and has helped me a lot more than you could ever imagine!
@statquest · 1 year ago
I've updated the Jupyter notebook. Just download the latest version and you should be good to go: lightning.ai/pages/education/introduction-to-coding-neural-networks-with-pytorch-lightning/?
@dylanlebrun-laurent668 · 1 year ago
@statquest Thanks a lot! You're a lifesaver, and quick to respond too! Thanks again! Keep up the good work.
@mhalton · 11 days ago
16:21 I don't see the 'batch_idx' parameter/argument of the 'training_step' function being used at all within the function.
@statquest · 11 days ago
True. We're not using it.
@imtim1243 · 5 months ago
Thanks Josh for the wonderful content! Btw, does anyone get tensor(-2.2926) as the final result for the final bias instead of -16? I did follow along with the code...
@statquest · 5 months ago
Are you using my code or did you type it in yourself? Mine is here: lightning.ai/lightning-ai/studios/statquest-introduction-to-neural-networks-with-pytorch-lightning
@imtim1243 · 5 months ago
@statquest I typed it myself, but let me try yours. Thank you so much!
@Erezavnil · 4 months ago
When creating the sample data, multiply by 100: torch.tensor([0.0, 0.5, 1.0] * 100)
@heeheehaha45 · 8 months ago
Dear Josh, thank you for your amazing video. I have an observation about the code: inside class BasicNN_train(nn.Module), I changed the requires_grad parameter of w11 to True:
#self.w11 = nn.Parameter(torch.tensor(2.7), requires_grad=False)
self.w11 = nn.Parameter(torch.tensor(2.7), requires_grad=True)
After this change, the graph of the training result becomes a flat line and the total loss becomes 1.0. And it seems that changing any variable other than a bias of the NN gives this result. Isn't that strange? Thank you!
@statquest · 8 months ago
This is such a simple model with such a small training dataset that, believe it or not, training it was super hard to do. I had to try about a million different starting conditions to get it to converge. Once I did, I wrote down all of the optimized weights and biases, but I forgot to keep track of the original, random starting values. So, unfortunately, I can't recreate the initial conditions that allow all of the weights and biases to be trained.
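The requires_grad behavior at the center of this thread can be seen in isolation: after backward(), only parameters created with requires_grad=True receive a gradient, so only they move when the optimizer steps. A minimal sketch with made-up values, not the video's model:

```python
import torch
from torch import nn

# A frozen parameter keeps grad=None after backward(); a trainable one
# accumulates a gradient and will be updated by the optimizer.
frozen = nn.Parameter(torch.tensor(2.7), requires_grad=False)
trainable = nn.Parameter(torch.tensor(0.0), requires_grad=True)

x = torch.tensor([1.0, 2.0])
loss = ((x * frozen + trainable) ** 2).sum()
loss.backward()
```

Flipping requires_grad to True is all that is mechanically needed to train a parameter; whether training then converges depends on the data and the starting values, which is Josh's point above.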
@katetanjin · 1 year ago
Hi Josh, thanks a lot for the super informative video! I found that with 34 epochs, learning_rate=0.002 results in final_bias=-2, which is still far from correct, while learning_rate=0.1 with 34 epochs gives final_bias=-16. That means 0.1 is a better learning rate value than 0.002, so why does the tuner give 0.002? A side observation is that the downloaded notebook says the dataloader has 3 data points repeated 100 times, which is different from what's shown in the video; there, the dataloader simply has 3 data points.
@statquest · 1 year ago
Hmmm... that's strange. So, when you ran my code, which sets the learning rate to 0.002, it didn't work?
@katetanjin · 1 year ago
@statquest Hi Josh, thanks for checking! The code in the video didn't work out. More specifically, without *100 in the dataloader, a 0.002 learning rate and 34 epochs ends up with final_bias = -2, while a 0.1 learning rate and 34 epochs ends up with final_bias = -16; with *100 in the dataloader, both 0.1 and 0.002 learning rates end up with final_bias = -16. I feel like *100 in the dataloader is somewhat cheating, because with 34 epochs the model trainer in fact saw the data 3400 times? With such a large number of iterations, I suspect most learning rate values will end up with final_bias = -16.
@statquest · 1 year ago
@katetanjin I think the learning rate finder needs a lot of data in order to work. That's why we multiplied the data by 100: to trick it into thinking we had more data than we really had. Anyway, it's just supposed to demonstrate how to use the tool and, in practice, you will probably have more than 3 data points and will not need to trick the learning rate finder.
@terp830 · 1 year ago
I tried setting all of the weight and bias parameters to 0 and changed requires_grad to True, but the results turned out to be 0 for all outputs in the loop. Why?
@statquest · 1 year ago
In this case, we have so little data that it's actually quite hard to converge on all of the optimal weights and biases. However, I show how to optimize everything in this video: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-RHGiXPuo_pI.html
@user-oq1cq3us6w · 10 months ago
Hello Josh! I'm not receiving any emails to download the code! Can you help, please? Thank you so much!
@statquest · 10 months ago
Did you check your "spam" folder? I just tested it myself and the email went to my spam folder, so check there.
@user-oq1cq3us6w · 10 months ago
@statquest That's exactly it! Thank you
@anitasalamon9958 · 1 year ago
Your NN videos are a gold mine; I watched them all in 2 days. I only wish I had a mentor like you during my PhD journey. I've got an idea and would love your insights. Here's the brief: I want to train a NN on a fully annotated scRNA-seq dataset (A) with multi-dimensional inputs (genes) and 15 outputs: 15 different cell types/annotations/labels. Then I want to take a new scRNA-seq dataset (B) and use the trained model to annotate it (transfer the labels). Now I want to "improve" this model by adding an additional feature from dataset B. This feature contains information about the origin of each input: origin X or non-X. I would end up with 30 outputs: 15 different cell types x 2 (origin X or non-X). From here, I would like to take this new "improved" model and use it to annotate scRNA-seq dataset (C). Do you think this is feasible, and do you have any advice on how to "improve" the model? Thanks again for the amazing content!
@statquest · 1 year ago
What you want to do sounds reasonable. I did a quick search for "transformer genome annotation" and found this: www.nature.com/articles/s41467-023-35923-4, which might be interesting to you. My video on Transformers should come out soon, so that might help as well.
@anitasalamon9958 · 1 year ago
@statquest Would you suggest building the neural network for dataset A? If so, which model? And then using transfer learning for dataset B?
@statquest · 1 year ago
@anitasalamon9958 Start by building a transformer model for dataset A.
@gauravthakur9386 · 10 months ago
Hi Josh, the link to the code in the description isn't opening for me. Is there a workaround?
@statquest · 10 months ago
Oops! I wonder what happened. I'll look into it.
@statquest · 10 months ago
OK. Try it again. I think it's working now.
@gauravthakur9386 · 10 months ago
@statquest It works now, thanks!
@goncalofernandes1845 · 1 year ago
Is anyone having problems with the final cells of the code? lr_find_results.suggestion() does converge to 0.00214, but then trainer.fit() predicts a final_bias value of -2.1706. I've messed around with the code, so it might just be me; still, I can't seem to understand what's going on :/ Anyway, great work as usual, Josh! Thank you for all the hard work!
@statquest · 1 year ago
Try downloading a fresh copy and then running it without any changes. Does it work, or are you running into the same problem?
@junyuzhang4627 · 1 year ago
I have the same problem as you.
@GG-fb1kz · 1 year ago
Having the same problem; the final bias I got is -2.17, not -16. I'd appreciate it if anyone else could try it and shed some light. Thanks.
@statquest · 1 year ago
@GG-fb1kz I'll take a look and let you know if I update the code.
@statquest · 1 year ago
So, I just reran everything and I get -16... so this is a mystery. However, I added a few lines to seed the random number generators, so this should take care of any oddities that result from SGD. So, please download the new code here: github.com/StatQuest/pytorch_lightning_tutorials/raw/main/building_nns_with_pytorch_and_lightning_v1.1.zip
@Rhine_e71 · 4 months ago
Sorry, but it seems that the tuner and trainer have been decoupled, and the library has had a lot of changes. Could you show us the updated code for that? Really appreciate it.
@statquest · 4 months ago
One of the benefits of downloading my code, rather than typing it in yourself, is that you get the updates. In case you missed the link for the code, here it is: lightning.ai/lightning-ai/studios/statquest-introduction-to-neural-networks-with-pytorch-lightning?view=public&section=all
@giovannimeono8802 · 1 year ago
Can we get one with Keras and TensorFlow?
@statquest · 1 year ago
Unfortunately, probably not in the near future. I'm going to go through a whole PyTorch series (from simple to super fancy and in the cloud) first.
@MyChannel-xe5dl · 1 year ago
I am facing a problem while installing Lightning in my conda environment; it's taking a lot of time. Can you please help me?
@statquest · 1 year ago
I think you figured out a workaround.
@anshvashisht8519 · 1 year ago
Where is the link to the notebook for this video?
@statquest · 1 year ago
bit.ly/3S9VdLu
@studynotslack · 1 year ago
Hi Josh, can you also teach Keras? It's a lot more beginner friendly.
@statquest · 1 year ago
I'll keep that in mind.
@Rahul-oy1vo · 1 year ago
Want more on PyTorch, Josh 😭.
@statquest · 1 year ago
Working on it! :)
@Rahul-oy1vo · 1 year ago
@statquest A little bit quicker please, Josh; you're the only savior we've got.
@RandyKumamoto · 1 year ago
I think you missed one thing: the pip install pytorch-lightning command :)
@statquest · 1 year ago
That's a valid point! To be honest, I've always just assumed people would download the Jupyter notebooks, which contain all of the installation instructions, but it's become clear that I should include them in the video as well.
@macknightxu2199 · 1 year ago
Hi, at 16:40, how do you set the epoch number? BR
@statquest · 1 year ago
I talk about setting epochs and steps in my video on coding Long Short-Term Memory neural networks: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-RHGiXPuo_pI.html
@macknightxu2199 · 1 year ago
@statquest Looks good; this video was just added to the series. Cheers.
@burakalkan4137 · 1 year ago
I think 34 epochs is not enough for learning rate 0.00214; in my tests I had to increase it to many more epochs to get final_bias right.
@statquest · 1 year ago
Did you download my code or did you type in your own? I'll look into this, because they've updated Lightning since I created this video.
@burakalkan4137 · 1 year ago
@statquest I didn't download it, just typed it on my own. I had to give it 1000 epochs to get to the -16 value. What do you think the reason is? (BTW, great fan of your work, keep it up!)
@burakalkan4137 · 1 year ago
OK, after setting max_epochs to 1k I get -15.789505004882812, which is still not there; with 2k it gives the correct value (-16.xx). If I set the lr to 0.1, 34 epochs are enough; maybe they really changed something.
@statquest · 1 year ago
@burakalkan4137 There could be a lot of things going on. I'll look into it.
@hsstp · 1 year ago
Please provide the link to download the code. Thanks.
@statquest · 1 year ago
The link to the code is in a pinned comment, but here it is as well: lightning.ai/pages/education/introduction-to-coding-neural-networks-with-pytorch-lightning/?
@hsstp · 1 year ago
@statquest Thanks very much. Another request: it would be great if you uploaded a video about the diffusion model, with code.
@vladmirbc8712 · 6 months ago
I've got something strange here :) First of all, I added this to the forward method, because I got an error without it:
input_to_final_relu = scaled_top_relu_output + scaled_bottom_relu_output + self.final_bias
output = F.relu(input_to_final_relu)
return output
Then with:
model = BasicLightningTrain()
trainer = L.Trainer(max_epochs=34)
tuner = L.pytorch.tuner.Tuner(trainer)
I got model.final_bias.data = -2.1706. Then I changed max_epochs=5000, and only after that did I get the correct model.final_bias.data = -16.0098, and I can't figure out why. The learning rate is the same as yours: "lr_find() suggests 0.00214 for the learning rate". However, an interesting fact is that I'm always suggested the same learning rate, regardless of whether I change the number of epochs or not.
@vladmirbc8712 · 6 months ago
I've figured it out. I really couldn't understand how 34 epochs could be sufficient for training with a learning rate (lr) of 0.00214. It seemed like you might not have applied model.learning_rate = 0.00214. With model.learning_rate = 0.1, 34 epochs are indeed sufficient, but it seems nearly impossible to find the global optimum with a learning rate of only 0.00214 over 34 epochs.
@statquest · 6 months ago
Are you using my code or did you type it in yourself?
@vladmirbc8712 · 6 months ago
@statquest Sorry, you're right! I was typing in the code from the video, but now I've checked your notebook and found some differences with my code (for example, here: inputs = torch.tensor([0., 0.5, 1.] * 100)). So everything is correct. Thanks!
@bryanmccormack2836 · 1 year ago
Is anyone getting the following error: "module 'lightning' has no attribute 'LightningModule'"?
@statquest · 1 year ago
Which version of Lightning are you using? Also, feel free to contact me directly via my website: statquest.org/contact/
@bryanmccormack2836 · 1 year ago
@statquest PyTorch: '1.12.1' and the most recent Lightning: 3.2.0
@statquest · 1 year ago
@bryanmccormack2836 Hmm... the version for Lightning seems a little strange, since the latest version is 2022.10.7. Try "pip install lightning --upgrade" to see if you can get the new version.
@fustigate8933 · 1 year ago
First
@statquest · 1 year ago
BAM! :)
@arhammehmood9963 · 3 months ago
@statquest The updated code for finding the new learning rate is not working.
@statquest · 3 months ago
Did you type in your own code or follow the link to my code? lightning.ai/lightning-ai/studios/statquest-introduction-to-neural-networks-with-pytorch-lightning I just re-ran my code and it worked fine.
@itsbxntley2970 · 1 year ago
trainer = L.Trainer(max_epochs=34, accelerator="auto", devices="auto")

1) ## Now let's find the optimal learning rate
tuner = L.pytorch.tuner.Tuner(trainer)
lr_find_results = tuner.lr_find(model,
                                train_dataloaders=dataloader, # the training data
                                min_lr=0.001, # minimum learning rate
                                max_lr=1.0, # maximum learning rate
                                early_stop_threshold=None) # setting this to "None" tests all 100 candidate rates

2) # lr_find_results = trainer.tuner.lr_find(model,
   #                                         train_dataloaders=dataloader, # the training data
   #                                         min_lr=0.001, # minimum learning rate
   #                                         max_lr=1.0, # maximum learning rate
   #                                         early_stop_threshold=None) # setting this to "None" tests all 100 candidate rates

For some reason, finding the learning rate with method 2 brings up the error AttributeError: 'Trainer' object has no attribute 'lr_find'. What could be the issue? I've already tried updating Lightning.
@statquest · 1 year ago
So, they just released Lightning 2.0, which reorganized the code. I've updated the notebook to reflect this change. However, updating the video is much harder. I'll put a note in a pinned comment.