
Let's build GPT: from scratch, in code, spelled out. 

Andrej Karpathy
468K subscribers
4.4M views

We build a Generatively Pretrained Transformer (GPT), following the paper "Attention is All You Need" and OpenAI's GPT-2 / GPT-3. We talk about connections to ChatGPT, which has taken the world by storm. We watch GitHub Copilot, itself a GPT, help us write a GPT (meta :D!). I recommend people watch the earlier makemore videos to get comfortable with the autoregressive language modeling framework and basics of tensors and PyTorch nn, which we take for granted in this video.
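As a quick taste of the autoregressive setup the video assumes, the character-level tokenizer built early on can be sketched in a few lines (a minimal sketch; the toy `text` below stands in for the tiny shakespeare file):

```python
# Minimal character-level tokenizer in the spirit of the video.
text = "hello world"                          # stand-in for the full dataset
chars = sorted(set(text))                     # vocabulary: distinct characters
stoi = {ch: i for i, ch in enumerate(chars)}  # char -> integer id
itos = {i: ch for ch, i in stoi.items()}      # integer id -> char

def encode(s):
    """Map a string to a list of integer token ids."""
    return [stoi[c] for c in s]

def decode(ids):
    """Map a list of token ids back to a string."""
    return "".join(itos[i] for i in ids)

print(decode(encode("hello")))  # prints "hello"
```

Real GPTs (GPT-2/3) tokenize into subword chunks with BPE rather than single characters, a point the video also makes.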
Links:
- Google colab for the video: colab.research.google.com/dri...
- GitHub repo for the video: github.com/karpathy/ng-video-...
- Playlist of the whole Zero to Hero series so far: • The spelled-out intro ...
- nanoGPT repo: github.com/karpathy/nanoGPT
- my website: karpathy.ai
- my twitter: / karpathy
- our Discord channel: / discord
Supplementary links:
- Attention is All You Need paper: arxiv.org/abs/1706.03762
- OpenAI GPT-3 paper: arxiv.org/abs/2005.14165
- OpenAI ChatGPT blog post: openai.com/blog/chatgpt/
- The GPU I'm training the model on is from Lambda GPU Cloud, which I think is the best and easiest way to spin up an on-demand GPU instance in the cloud that you can ssh to: lambdalabs.com. If you prefer to work in notebooks, I think the easiest path today is Google Colab.
Suggested exercises:
- EX1: The n-dimensional tensor mastery challenge: Combine the `Head` and `MultiHeadAttention` into one class that processes all the heads in parallel, treating the heads as another batch dimension (answer is in nanoGPT).
- EX2: Train the GPT on your own dataset of choice! What other data could be fun to blabber on about? (A fun advanced suggestion if you like: train a GPT to do addition of two numbers, i.e. a+b=c. You may find it helpful to predict the digits of c in reverse order, as the typical addition algorithm (that you're hoping it learns) would proceed right to left too. You may want to modify the data loader to simply serve random problems and skip the generation of train.bin, val.bin. You may want to mask out the loss at the input positions of a+b that just specify the problem using y=-1 in the targets (see CrossEntropyLoss ignore_index). Does your Transformer learn to add? Once you have this, swole doge project: build a calculator clone in GPT, for all of +-*/. Not an easy problem. You may need Chain of Thought traces.)
- EX3: Find a dataset that is very large, so large that you can't see a gap between train and val loss. Pretrain the transformer on this data, then initialize with that model and finetune it on tiny shakespeare with a smaller number of steps and lower learning rate. Can you obtain a lower validation loss by the use of pretraining?
- EX4: Read some transformer papers and implement one additional feature or change that people seem to use. Does it improve the performance of your GPT?
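For EX1, the refactor can look roughly like this (a sketch of one way to fold the heads into a batch dimension; class and variable names here are illustrative, not copied from nanoGPT):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BatchedMultiHeadAttention(nn.Module):
    """All heads computed in parallel by treating the head count as an extra
    batch dimension, in the spirit of EX1."""
    def __init__(self, n_embd, n_head, block_size):
        super().__init__()
        assert n_embd % n_head == 0
        self.n_head = n_head
        self.qkv = nn.Linear(n_embd, 3 * n_embd, bias=False)  # fused q,k,v projection
        self.proj = nn.Linear(n_embd, n_embd)
        self.register_buffer("tril", torch.tril(torch.ones(block_size, block_size)))

    def forward(self, x):
        B, T, C = x.shape
        hs = C // self.n_head                         # head size
        q, k, v = self.qkv(x).split(C, dim=2)         # each (B, T, C)
        # reshape so heads become a batch dimension: (B, n_head, T, hs)
        q = q.view(B, T, self.n_head, hs).transpose(1, 2)
        k = k.view(B, T, self.n_head, hs).transpose(1, 2)
        v = v.view(B, T, self.n_head, hs).transpose(1, 2)
        att = (q @ k.transpose(-2, -1)) * hs**-0.5    # scaled attention scores
        att = att.masked_fill(self.tril[:T, :T] == 0, float("-inf"))  # causal mask
        att = F.softmax(att, dim=-1)
        out = att @ v                                 # (B, n_head, T, hs)
        out = out.transpose(1, 2).contiguous().view(B, T, C)  # re-assemble heads
        return self.proj(out)

x = torch.randn(4, 8, 32)  # (batch, time, channels)
mha = BatchedMultiHeadAttention(n_embd=32, n_head=4, block_size=8)
print(mha(x).shape)        # torch.Size([4, 8, 32])
```

The payoff is that one fused matmul replaces a Python loop over `Head` modules, which is what makes the model fast on a GPU.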
Chapters:
00:00:00 intro: ChatGPT, Transformers, nanoGPT, Shakespeare
baseline language modeling, code setup
00:07:52 reading and exploring the data
00:09:28 tokenization, train/val split
00:14:27 data loader: batches of chunks of data
00:22:11 simplest baseline: bigram language model, loss, generation
00:34:53 training the bigram model
00:38:00 port our code to a script
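The baseline section culminates in the bigram model; a minimal sketch of that model, following the video's structure (the vocab size 65 matches tiny shakespeare's character set):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BigramLanguageModel(nn.Module):
    """Simplest baseline: each token's next-token logits come straight
    from an embedding-table lookup on that token alone."""
    def __init__(self, vocab_size):
        super().__init__()
        self.token_embedding_table = nn.Embedding(vocab_size, vocab_size)

    def forward(self, idx, targets=None):
        logits = self.token_embedding_table(idx)  # (B, T, vocab_size)
        loss = None
        if targets is not None:
            B, T, C = logits.shape
            loss = F.cross_entropy(logits.view(B * T, C), targets.view(B * T))
        return logits, loss

    def generate(self, idx, max_new_tokens):
        for _ in range(max_new_tokens):
            logits, _ = self(idx)
            probs = F.softmax(logits[:, -1, :], dim=-1)       # next-token distribution
            idx_next = torch.multinomial(probs, num_samples=1)
            idx = torch.cat((idx, idx_next), dim=1)
        return idx

vocab_size = 65                          # tiny shakespeare's character vocabulary
m = BigramLanguageModel(vocab_size)
xb = torch.randint(vocab_size, (4, 8))   # dummy batch of token ids
yb = torch.randint(vocab_size, (4, 8))
logits, loss = m(xb, yb)
# compare against the uniform-predictor baseline of -ln(1/65) ≈ 4.17
```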
Building the "self-attention"
00:42:13 version 1: averaging past context with for loops, the weakest form of aggregation
00:47:11 the trick in self-attention: matrix multiply as weighted aggregation
00:51:54 version 2: using matrix multiply
00:54:42 version 3: adding softmax
00:58:26 minor code cleanup
01:00:18 positional encoding
01:02:00 THE CRUX OF THE VIDEO: version 4: self-attention
01:11:38 note 1: attention as communication
01:12:46 note 2: attention has no notion of space, operates over sets
01:13:40 note 3: there is no communication across batch dimension
01:14:14 note 4: encoder blocks vs. decoder blocks
01:15:39 note 5: attention vs. self-attention vs. cross-attention
01:16:56 note 6: "scaled" self-attention. why divide by sqrt(head_size)
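Versions 1 through 3 above condense to a few lines (a toy-shape sketch of the masking trick):

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
B, T, C = 1, 4, 2
x = torch.randn(B, T, C)

# A lower-triangular mask plus softmax turns matrix multiplication into a
# causal weighted average over past tokens.
tril = torch.tril(torch.ones(T, T))
wei = torch.zeros(T, T)
wei = wei.masked_fill(tril == 0, float("-inf"))  # future positions get -inf
wei = F.softmax(wei, dim=-1)                     # each row sums to 1 over the past
out = wei @ x                                    # (T, T) @ (B, T, C) -> (B, T, C)

# With all-zero scores the rows are uniform, so position t holds the mean of
# x[:, :t+1] — exactly the for-loop average of version 1.
# Version 4 (self-attention) replaces the zero scores with data-dependent
# affinities: wei = (q @ k.transpose(-2, -1)) * head_size**-0.5, then the same
# mask + softmax; the sqrt(head_size) scaling keeps the softmax from saturating.
```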
Building the Transformer
01:19:11 inserting a single self-attention block to our network
01:21:59 multi-headed self-attention
01:24:25 feedforward layers of transformer block
01:26:48 residual connections
01:32:51 layernorm (and its relationship to our previous batchnorm)
01:37:49 scaling up the model! creating a few variables. adding dropout
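The Transformer-building chapters above assemble into a block like the following (a pre-norm sketch in the video's style; the attention module is left abstract so any (B, T, C) → (B, T, C) self-attention slots in):

```python
import torch
import torch.nn as nn

class FeedForward(nn.Module):
    """Position-wise MLP; the 4x inner expansion follows the paper."""
    def __init__(self, n_embd, dropout=0.1):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_embd, 4 * n_embd),
            nn.ReLU(),
            nn.Linear(4 * n_embd, n_embd),
            nn.Dropout(dropout),
        )

    def forward(self, x):
        return self.net(x)

class Block(nn.Module):
    """Transformer block in the pre-norm arrangement used in the video:
    layernorm is applied before each sub-layer, with residual connections
    around both self-attention and the feed-forward network."""
    def __init__(self, n_embd, self_attn):
        super().__init__()
        self.sa = self_attn
        self.ffwd = FeedForward(n_embd)
        self.ln1 = nn.LayerNorm(n_embd)
        self.ln2 = nn.LayerNorm(n_embd)

    def forward(self, x):
        x = x + self.sa(self.ln1(x))    # residual around self-attention
        x = x + self.ffwd(self.ln2(x))  # residual around feed-forward
        return x

# shape smoke test with identity "attention" standing in for a real head
blk = Block(n_embd=32, self_attn=nn.Identity())
x = torch.randn(4, 8, 32)
print(blk(x).shape)  # torch.Size([4, 8, 32])
```

The residual paths are what let gradients flow when the model is scaled up to many stacked blocks.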
Notes on Transformer
01:42:39 encoder vs. decoder vs. both (?) Transformers
01:46:22 super quick walkthrough of nanoGPT, batched multi-headed self-attention
01:48:53 back to ChatGPT, GPT-3, pretraining vs. finetuning, RLHF
01:54:32 conclusions
Corrections:
00:57:00 Oops "tokens from the future cannot communicate", not "past". Sorry! :)
01:20:05 Oops I should be using the head_size for the normalization, not C

Science

Published: 8 Jun 2024

Comments: 2.4K
@fgfanta · 1 year ago
Imagine being between your job at Tesla and your job at OpenAI, being a tad bored and, just for fun, dropping on YouTube the best introduction to deep learning and NLP from scratch so far, for free. Amazing people do amazing things, even as a hobby.
@crimpers5543 · 1 year ago
he's probably bored at both of those jobs. once people get to high level director positions, they are far removed from the trenches of code. Lots of computer scientists have passion in actually writing and explaining code, not just managing things.
@aaronhpa · 1 year ago
and yet people still say socialism isn't viable when most of the great stuff in the internet was done for free/ without expectation of compensation
@shyvanatop4777 · 1 year ago
@@aaronhpa free market developed the skills but sure man
@aaronhpa · 1 year ago
@@shyvanatop4777 did it? I think the hard work and dedication of all these people did, not the ability to sell it.
@jayakrishnankr7501 · 1 year ago
@@aaronhpa, it's all about incentives. Why would you ever do anything if you could get anything without effort? In a fictional utopia, socialism might be viable, but human beings don't work like that. For example, this platform came about because of capitalism. I think achieving a balance between the two would be best: the platform came from capitalism, the content here is socialism, maybe something like that.
@8LFrank · 1 year ago
Living in a world where a world-class top guy posts a 2-hour video for free on how to make such cutting-edge stuff. I barely started this tutorial but at first I just wanted to say thank you mate!
@FobosLee · 1 year ago
Wait. It's him! I didn't understand at first. Thought it was a random IT YouTuber.
@DavitBarbakadze · 1 year ago
How did it go?
@Tinjinladakh · 1 year ago
hey jake, what should I do before learning programming? Are the basics of all languages the same or different? Should I learn only Python?
@ChrisSmith-lk2vq · 1 year ago
Totally agree!!
@atlantic_love · 1 year ago
"Cutting edge"? The only cutting will be your job. Think before getting your panties all wet. The only people excited for this crap are investors, employers and failed programmers looking for some sort of edge.
@jamesfraser7394 · 1 year ago
Wow! I knew nothing and now I am enlightened! I actually understand how this AI/ML model works now. As a near 70 year old that just started playing with Python, I am a living example of how effective this lecture is. My humble thanks to Andrej Karpathy for allowing to see into and understand this emerging new world.
@user-ks8xf3ie3g · 1 year ago
Good for you youngster. 75 and will be doing this kind of thing till I drop ... Still run my technology company and doing contract work. Cheers.
@mrcharm767 · 1 year ago
what makes u learn these at age of 70?
@jamesfraser7394 · 11 months ago
@@mrcharm767 Want to analyze more stocks , the way I would, in a shorter time. ;)
@fawzishafei5565 · 11 months ago
@@mrcharm767 The sky is the limit.....!
@fmailscammer · 11 months ago
I’m always excited to learn new things, hope I’m still learning at 70!
@BAIR68 · 6 months ago
I am a college professor and learning GPT from Andrej. Every time I watch this video, I not only learn the contents but also how to deliver any topic effectively. I would vote him the "Best AI teacher on YouTube". Salute to Andrej for his outstanding lectures.
@noadsensehere9195 · 5 months ago
which university?
@bohanwang-nt7qz · 4 months ago
Hey, I'd like to introduce you to my AI learning tool, Coursnap, designed for youtube courses! It provides course outlines and shorts, allowing you to grasp the essence of 1-hour in just 5 minutes. Give it a try and supercharge your learning efficiency!
@ocanehauncanedichieilcane · 19 days ago
please don't
@softwaredevelopmentwiththo9648
Thank you for taking the time to create these lectures. I am sure it takes a lot of time and effort to record and cut these. Your effort to level up the community is greatly appreciated. Thanks Andrej.
@davidananias8239 · 1 year ago
Emphasis on appreciation.
@photosone7160 · 1 year ago
ditto 🙂
@evanacharya4153 · 1 year ago
Thank you
@MarkTimeMiles · 1 year ago
🙏 You're 🙏 a 🙏 mensch 🙏 Andrej 🙏💪
@DennisXiloj · 1 year ago
Thank you! for real. You are an awesome person Andrej.
@JainPuneet · 1 year ago
Andrej, I cannot comprehend how much effort you have put into making these videos. Humanity is thankful to you for making these publicly available and educating us with your wisdom. One thing is to know the stuff and apply it in a corp setting, and another is to use it to educate millions for free. This is one of the best kinds of charity a CS major can do. Kudos to you and thank you so much for doing this.
@vicyt007 · 1 year ago
Making this video is super simple for a specialist like him. It’s like creating a Hello World program for a computer scientist.
@JainPuneet · 1 year ago
@@vicyt007 I beg to differ. I am from the area and I can imagine how much time he must have spent offline to come up with the right abstraction.
@vicyt007 · 1 year ago
@@JainPuneet I agree that it took him some time to make this video, but I don’t believe it was a tough task.
@hpmv · 1 year ago
@@vicyt007 People who have expertise in an area aren't always good teachers. Being able to show others how it works in an organized, easy-to-understand manner is very tricky. On the surface it looks easy, but if you try doing a video like this yourself, chances are you'll find it much harder than you think.
@vicyt007 · 1 year ago
@@hpmv I know it was not an easy task, but at least he knows what he is saying; it's just a matter of explaining concepts. He was a teacher for a long time, so it's his job, which he is doing for free here! But in my opinion, this video did not target people with zero knowledge of maths / ML / AI / Python, because in that case you must admit it is quite hard to understand. Yet it was watched by nearly 2M people, and most of them are not skilled enough to follow it. Briefly, I think this video targeted skilled people but was watched by everybody. Why not?
@fslurrehman · 1 year ago
I knew only Python, math, and the definitions of NN, GA, ML and DNN. In 2 hours, this lecture has not only given me an understanding of the GPT model, but also taught me how to read AI papers and turn them into code, how to use PyTorch, and tons of AI definitions. This is the best lecture and practical application on AI, because it not only gives you an idea of DNNs but also gives you code directly from research papers and a final product. Looking forward to more lectures like these. Thanks Andrej Karpathy.
@rafaelsouza4575 · 1 year ago
I was always scared of Transformer's diagram. Honestly, I never understood how such schema could make sense until this day when Andrej enlightened us with his super teaching power. Thank you so much! Andrej, please save the day again by doing one more class about Stable Diffusion!! Please, you are the best!
@antopolskiy · 1 year ago
It is difficult to comprehend how lucky we are to have you teaching us. Thank you, Andrej.
@gokublack4832 · 1 year ago
Wow! Having the ex-lead of ML at Tesla make tutorials on ML is amazing. Thank you for producing these resources!
@SzTz100 · 1 year ago
I know, I couldn't believe it.
@VultureGamerPL · 1 year ago
Can you believe it? God bless this man and I'm not even religious!
@cane870 · 1 year ago
@@VultureGamerPL cringe
@lookupverazhou8599 · 1 year ago
@@cane870 Cope.
@learnomics · 1 month ago
@@VultureGamerPL Not only ex-lead of ML at Tesla. He is also a co-founder of OpenAI.
@aojiao3662 · 6 months ago
Most clear and intuitive and well explained transformer video I've ever seen. Watched it as if it were a tv show and that's how down-to-earth this video is. Shoutout to the man of legend.
@user-co4op9ok4b · 10 months ago
I cannot thank you enough for this material. I've been a spoken language technologist for 20 years and this plus your micro-grad and make more videos has given me a graduate level update in less than 10 hours. Astonishingly well-prepared and presented material. Thank you.
@amazedsaint · 1 year ago
All other YouTube videos: "There is this amazing thing called ChatGPT." Andrej: "Hold my beer 🍺." Seriously, we really appreciate your time and effort to create this, Andrej. This will do a lot of good for humanity, by making the core concepts accessible to mere mortals.
@syedshoaibshafi4027 · 1 year ago
u can do it more easily using lstm
@zuu2051 · 1 year ago
@@syedshoaibshafi4027 are you really saying that out loud? dude is still living in 2010 🤣
@kevinremmy5812 · 1 year ago
lit😅
@redsnflr · 1 year ago
Mere mortals with at least basic programming and python knowledge, but yes.
@kemalware4912 · 1 year ago
🍺
@yusufsalk1136 · 1 year ago
The best notification ever.
@ninadgandhi9040 · 1 year ago
Indeed!
@TTTrouble · 1 year ago
Literally took the words out of my mouth. It’s been a while since I’ve instaclicked and watched a 2hr long video. Very much worth it.
@andrewm4894 · 1 year ago
Ohhhj sheeeeet, clear my schedule!
@Shaunmcdonogh-shaunsurfing · 1 year ago
Absolutely agree
@ChrisSmith-lk2vq · 1 year ago
True!!
@I_am_who_I_am_who_I_am · 2 months ago
I did something like this in 1993. I took a long text and calculated the probability of one word (I worked with words, not tokens) following another by parsing the full text. And I successfully created a single-layer perceptron parrot which could spew almost meaningful sentences. My professors told me I should not pursue the neural network path because it was practically abandoned. I never trusted them. I'm glad to see neural networks' glorious comeback. Thank you Andrej Karpathy for what you have done for our industry and humanity by popularizing this.
@Grey_197 · 11 months ago
Broke my back just to finish this video in a single sitting. It's a lot to take in at once; I think I'll have to implement it bit by bit over a day to actually assimilate everything. I am very happy with the lecture/tutorial, waiting for more. The time and effort put into making this video are highly admirable and respectable. Thank you Andrej.
@NicholasRenotte · 1 year ago
This is AMAZING! You're an absolute legend for sharing your knowledge so freely like this, Andrej! I'm finally getting some time to get into transformer architectures and this is a brilliant deep dive; going to spend the weekend walking through it!! Thank you🙏🏽
@varunahlawat9013 · 1 year ago
Waiting for your take on this too!
@eliotharreau7627 · 1 year ago
Hi Nicholas, I don't understand all this code. I just have one question: is it working?? And is it like ChatGPT? Thnx Bro.
@kyriakospelekanos6355 · 1 year ago
@@eliotharreau7627 This is a demonstration of HOW chatgpt works
@eliotharreau7627 · 1 year ago
@@kyriakospelekanos6355 I think it is not only how ChatGPT works, but code that can do the same as ChatGPT. That's why I'm surprised!!! Thank you anyway.
@satoshinakamoto5710 · 1 year ago
bro can't wait for your video on this!
@meghanaiitb · 1 year ago
What a feeling ! Just finished sitting on this for the weekend, building along and finally understanding Transformers. More than anything, a sense of fulfilment. Thanks Andrej.
@ShihgianLee · 1 year ago
This lecture answers ALL my questions from the 2017 Attention Is All You Need paper. I was always curious about the code behind the Transformer. This lecture quenched my curiosity with a colab to tinker with. Thank you so much for your effort and time in creating the lecture to spread the knowledge!
@thegrumpydeveloper · 1 year ago
So happy to see Andrej back teaching more. His articles before Tesla were so illuminating and distilled complicated concepts into things we could all learn from. A true art. Amazing to see videos too.
@JoseLopez-ox7sq · 1 year ago
This is simply fantastic. I think it would be beneficial for people learning to see the actual process of training, the graphs in W&B and how they can try to train something like this.
@AndrejKarpathy · 1 year ago
makes sense, potentially the next video, this one was already getting into 2 hours so I wrapped things up, would rather not go too much over movie length.
@jdejota1029 · 1 year ago
@@AndrejKarpathy Please don't worry about going over movie length; I enjoyed every minute of the video. It's the first time I attended an in-depth class on what's under the hood of a model.
@nikitaandriievskyi3448 · 1 year ago
@@AndrejKarpathy I think people would watch these videos even if they were 10 hours long, so don't worry about making them too long :)
@patpearce8221 · 1 year ago
@@AndrejKarpathy don't listen to these sycophants. Size matters.
@Marius12358 · 1 year ago
I'm enjoying this whole series so much, Andrej. It has made me understand neural networks much better than anything so far in my Bachelor's. As an older student with a large incentive to be time efficient, this has been a godsend. Thank you so much!! :D
@nazgulizm · 10 months ago
Thank you for taking the time and effort to share this, Andrej! This is of great help to lift the veil of abstractions that made it all seem inaccessible and opening up that world to ML/AI uninitiated like me. I don’t understand all of it yet but I’m now oriented and you’ve given me a lot of threads I can pull on.
@rangilanaoermajhi1820 · 1 year ago
Just gone through all of his videos, MLP, gradients and of course the backprop :), finally finishing with the transformer model (decoder part). As we all know, Andrej is a hero of deep learning and we are very blessed to get this much rich content for free on YouTube, and from a teacher like him. Fascinating stuff from a fascinating contributor to the field of AI 🙏
@zechordlord · 1 year ago
Thanks so much for making this! I could grasp about 80% of everything with my programming/little bit of university-level machine learning background, but it does not feel like magic anymore. This format of hands-on coding along with the thought process behind it is way better than reading a paper and trying to piece things together.
@rcuzzy · 1 year ago
Andrej, I know there are probably a million other things you could be working on or efforts you could put your mind towards, but seriously, thank you for these videos. They are important, they matter, and they are providing many of us with a foundation from which to learn, build, and understand AI, and to develop these models further. Thank you again and please keep doing these.
@reinhodl7377 · 1 year ago
Seriously, Andrej is just so very kind in his way of explaining things. His shakespeare LSTM article way back ("The Unreasonable Effectiveness of Recurrent Neural Networks") was what got me seriously into ML in the first place. And while I've since (professionally) moved to development work unrelated to ML/AI, this is the exact kind of thing that hooks me back in. Andrej knows people watching this are not idiots and doesn't treat them as such, but at the same time fully understands how opaque even basic AI concepts can be if all you ever really interact with is pre-trained models. There's tons of value in explaining this stuff in such a practical way.
@IllIl · 1 year ago
Dude, thank you so much for this. It was a seriously awesome dive into the implementation with great explanations along the way. I've read/watched a lot of ML content and this has got to be one of the clearest lectures I've come across - even better than the usual famous online uni lectures. Thank you! (And I'll be rewatching it too! :)
@mmedina · 1 year ago
Just wanted to thank you for your efforts. The video is great! Clear, concise, and very understandable. The way you start from scratch, and little by little start building every block of the paper is just awesome. Thank you very much!
@lkothari · 1 year ago
This was incredible Andrej! Really appreciate how you intersperse teaching a concept with coding and building step-by-step. This is the first of your videos that I have watched and I can't wait to watch all the others.
@muhajerAlSabil1 · 1 year ago
"Andrej, your willingness to share your knowledge and insights on YouTube is truly inspiring. Your passion for teaching and helping others understand complex concepts is evident in your videos, and it's clear that you have a drive to make a positive impact in the field of AI. Keep up the amazing work, and thank you for making this knowledge accessible to all!" ps this comment was generated using GPT
@ProductivityMo · 1 year ago
Thank you Andrej! I can't imagine the amount of time and effort it took to put this 2 hour video together! Very very educational in breaking down how GPT is constructed. Would love to see a follow-up on tuning the model to answer questions on small scale!
@karanacharya18 · 1 year ago
Absolutely amazing lecture. Thank you so much Andrej! I finally understand attention and Transformers. "Code is the ultimate truth," and the way you set the stage and explain the concepts and the code is brilliant.
@curatorsshelf393 · 1 year ago
Andrej, Thank you so much for sharing your knowledge and expertise. I've been following your video series and it has been truly amazing. I remember you were saying in one of the interviews that to prepare 1hour video, it takes more than 10hrs. I cannot thank you enough for what you are doing!
@miladaghajohari2308 · 1 year ago
These videos are awesome. I have been doing DL research for 3 years, but the way you explain things is so pleasing that I sat through the whole 2 hours. Kudos to you Andrej.
@artukikemty · 1 year ago
Amazing. Watching these videos I can still believe in humankind; seeing a guy like Andrej sharing his knowledge and his time with the rest of the world is something that we do not see every day. Thanks for posting it!
@jwalk121 · 1 year ago
He's a very good teacher, but there are still islands
@sampsonleo7475 · 26 days ago
This is truly a step-by-step tutorial for building a Transformer system. So impressed by the way you teach! Very clear and very easy to follow. You are a highly talented educator!
@chung-shienwang6248 · 1 year ago
Can't be more grateful. We're literally living in the best of times because of you! Thank you so much
@khalobert1588 · 1 year ago
I think this man is a singularity, because the world has not seen such a combination of talent and good character. Thanks mate 🙏
@allnewjient7651 · 1 year ago
slime
@sr3090 · 1 year ago
Thank you Andrej for this wonderful session. I am a tech enthusiast who wanted to understand how GPT works and came across your video. I have always found the research papers difficult to comprehend and never understood how they actually get implemented. Your video completely changed that. You are such a good teacher and make things so easy to understand. Your fan club just got a new member!! :)
@scottsun345 · 1 year ago
Wow, this video and everything it covered are just amazing! There are no other words except, thank you, Andrej, for all the efforts it took to make this! Really look forward to more of your great ideas and contents!
@PrakharSrivastav · 1 year ago
Truly phenomenal to live in an age where we can learn all this for free from experts like you. Thank you so much Andrej for your contribution. What a gift you have given.
@rockapedra1130 · 1 year ago
This is fantastic. I am amazed that Andrej takes so much of his time to impart this incredibly valuable knowledge for free to all and sundry. He is not only a top researcher but also a fantastic communicator. We have gotten used to big corporations hoarding knowledge and talent to become exploitative monopolies but every so often, humanity puts forth a gem like Mr. Karpathy to keep us all from going head first into the gutter. Thank you!!!
@armaankhokhar7651 · 1 year ago
Your playlist has been instrumental to my learning and incredibly motivating. Please keep posting!
@footfunk510 · 1 year ago
This was amazing. Thank you, Andrej! I've read about the transformer architecture but watching this code walk-through really helped me understand what this looks like in an applied way. Pulling together code and the paper helped bring the theory and practice together.
@coemgeincraobhach236 · 1 year ago
Day 2 of implementing this down, about one more evening to go I think. Thanks so much for this! I spent so long down the rabbit hole of CNNs that it's really refreshing to try a completely different type of model. No way I could have done it without a lecture of this quality! Legend
@rw-kb9qv · 1 year ago
I think this style of teaching is much better than a lecture with powerpoint and whiteboard. This way you can actually see what the code is doing instead of guessing what all the math symbols mean. So thank you very much for this video!
@13thbiosphere · 1 year ago
By 2030 this will be the dominant method of learning... vastly more efficient... Any university failing to embrace this method will crumble.
@petervogt8309 · 1 year ago
Nothing new in this comment. Just want to say 'thank you!' for this amazing tutorial, ...and all the others! The completeness, the information density and pace, the choice of examples and language.... Everything is *just right* , delivered right from the heart and the mind!! Thank you so much Andrej, for taking your time to educate and inspire all of us.
@clamr6122 · 1 month ago
I've watched a lot of explanations of Transformers and this is easily the best. You are a gifted teacher.
@WannabeALU · 1 year ago
I don't have words to describe how grateful I am to you and the work you are doing. Thank you!
@klauszinser · 1 year ago
The world has got a very good teacher back. Very appreciated.
@RKELERekhaye · 1 year ago
Fantastic video Andrej, you're the best and so nice.😊
@pastrop2003 · 1 year ago
Thank you, Andrej, this is awesome! This is the best hands-on tutorial on the transformer-based language model I ever came across. It is very gracious of you to share your knowledge and experience.
@nikolaMKD95 · 1 year ago
Wow. I thought you were going to use the Transformers library, but you essentially built the entire transformer architecture from scratch. Well done!!
@gokulakrishnanr8414 · 3 months ago
Thanks! Yeah, it was a fun challenge building the Transformer from scratch. Glad you're enjoying the video!
@matteofogliata21 · 8 months ago
I've just started approaching the transformer architecture in the last two days, and I think this is by far the best explanation: well thought out, giving all the hints, intuitions and demonstrations with simple code. Thank you Andrej!
@lipingxiong1376 · 1 year ago
Thank you so much for creating such valuable content. A few years ago, I watched your 2016 Stanford computer vision course, which was instrumental in helping me understand backpropagation and other important neural network concepts. Andrew Ng's courses initially led me into the world of machine learning, but I find your videos to be equally educational, focused on fundamental concepts, and presented in a very accessible way. I've also been following your blog and was thrilled to learn about your new YouTube channel. Your dedication to creating these resources is truly appreciated. Growing up in rural China, I didn't have many opportunities to learn outside of textbooks. But now, thanks to people like you, I find myself swimming in a sea of knowledge. Thank you for making such a significant impact on my learning journey. BTW, I edited this with ChatGPT to make me sound more like a native speaker. :)
@eva__4380 · 1 year ago
Similar experience here. I too watched Stanford's computer vision and NLP and a few other courses a while back. I also did lectures on linear algebra, calc, probability and stats etc. from MIT OCW to have a strong grasp of the fundamentals. Without YouTube it wouldn't be possible for me to have access to such high quality education.
@raghulponnusamy9034 · 8 months ago
can you please share me that link @eva__4380
@redfordkobayashi6936 · 1 year ago
You just know someone has a deep grasp on the subject matter when they start dishing out "build X from scratch" on a regular basis. Thank you Karpathy for sharing your knowledge with the world. You are more than amazing.
@michaeldimattia9015 · 5 months ago
MIND = BLOWN! Not only is this incredible content, but the way everything was presented, coded, and explained is so crystal clear, my mind felt comfortable with the complexity. Amazing tutorial, and incredibly inspiring, thanks so much!
@iantaggart3064 · 5 months ago
The first ten minutes alone taught me more than a quick google search could. You're good at this.
@aureliencobb199 · 1 year ago
Giving us these lectures for free. I do not know how to thank you. Great job explaining to us NN so clearly.
@juxyper · 1 year ago
I have some experience in understanding the maths behind all this stuff but I kind of had problems with advancing to creating and training models, these videos are a godsend. Big thanks
@DJ-lo8qj · 1 year ago
The students at Stanford who had Andrej as a professor are incredibly lucky; he’s an excellent teacher, breaking down complex topics with high precision and fluidity.
@Kirby-Bernard 3 months ago
We are grateful that talented people like you believe in teaching and helping! This is an amazing video: clear, precise, and it brings a tough topic within reach of a layperson. So much to learn here on how to make technical videos.
@SchultzC a year ago
From CS231n and RL Pong to this… there is something special about the way you break down and explain things. I have benefited immensely from it, and I'm obviously not the only one. Thank you!
@ayushsrivastava3879 a year ago
Thank you for taking the time to create these lectures. I'll be the first to buy if you ever want to do a subscription plan. Honestly, I learned so much more from this playlist alone than from any other documentation or blogs combined. Working with NLP is now entirely different for me. I'll work hard to work with you one day.
@RemKim a year ago
I suggest watching this video multiple times in order to understand how transformers work. This is by far the best hands-on explanation + example.
@HazemAzim a year ago
Wow, very comprehensive and smooth. You went through almost every detail in an excellent educational manner; this surely took a lot of effort. I have seen many videos on transformers, some of them really good at explaining the concepts and the math behind them, but in terms of software implementation, on how transformers work from a code perspective, this is by far the best I have seen. Thank you.
@realaliarain a year ago
A bundle of thanks for this one. This means so much to us; the community is thankful to you. Taking the time to actually record these masterpieces. Thanks, Andrej.
@tamilselvan9942 10 months ago
This is an "insane amount of knowledge packed into a 2-hour video". Hats off, man!!
@fooger a year ago
As always, fantastic video and sharing. It would be really cool to have a part II on this and on how we could use PPO/RL to do the fine-tuning part for some basic interactive flow; it doesn't have to be like ChatGPT (Q/A). Thank you so much, Andrej, for such an amazing video!
@haleemaramzan a year ago
I built this same thing alongside watching the lecture, and loved it! I'm trying to get better at understanding and coding these concepts, and this was extremely helpful. Thank you so much :)
@JamesBradyGames a year ago
What a wonderful gift to the world. Amazing tutorial. Again. Thank you!
@AlexanderEgeler a year ago
James! So funny to see your comment here :-) Hope all is well ...
@JamesBradyGames a year ago
@@AlexanderEgeler small world! 🙂
@travelwithoutmoving5422 a year ago
Thanks so much for your time. Your contribution is invaluable, and the way you explain things in small steps and great detail is unique, which is so precious when dealing with complex topics like neural networks, especially for non-native English speakers like me. Can't wait for your next vids. Big hug.
@GPTBot1123 a year ago
I've watched this 3 times and I only understand about 80% of it 😂, a testament to how great Andrej is at explaining these models. I'm not a programmer by trade, so a lot of this is totally foreign to me.
@TheNewton a month ago
Yeah, there is some good explanation and build-up in this video, but some of it gets really dense really quickly, and it goes back to feeling like reading an inscrutable math research paper.
@8eck a year ago
Reward model and reinforcement learning using that reward model would be super cool to learn. Thank you for the current lecture!
@jcmorlando a year ago
Simply amazing, thank you Andrej! Hands down the best resource I've consumed to understand how a Transformer is built, and get understanding of how it technically relates to GPT and ChatGPT. I feel like I'm taking my first step into real cutting-edge ML :)
@gonzalocordova5934 a year ago
Without a doubt the best video I've seen on transformers. Simply THANK YOU for your talent and humility teaching random people
@starbuck5043 a year ago
We live in a time where we can get free lessons on hot topics from one of the best engineers in the business. This is amazing. Thanks, Andrej !
@MikeCairns1 a year ago
Send the man some ☕
@AIlysAI a year ago
I don't usually comment on videos, but Andrej simplifies these concepts so they are easy to understand. It just shows how well he grasps transformers, and the hundreds of papers he summarized in one video. It comes from years of experience and a beautiful mind!
@jasonrothfuss1631 9 months ago
This video deserves two thumbs up (or more)! I spent a lot of time watching and rewatching parts of this, coding the model "the hard way", and it was totally worth it. Thank you!
@thedark3612 a year ago
Please keep doing what you are doing! You are an absolute gem of an educator
@aistamp a year ago
Welcome to YouTube in 2023, where one of the top AI researchers is just casually making videos explaining in detail how to build some of the best ML models. Seriously though, these videos are amazing!
@alexandrechikhaoui659 a year ago
Amazing content, I was on a quest for exactly this! I'm really grateful for your time and qualifications. Thank you, Sir!
@johnini a year ago
Sir, you have all our respect! You are a legend!! And anyone who has had the chance to share a beer or coffee with you is a really lucky person!! Great video, mega clear, and I hope to see more soon about fine-tuning and further steps of training in the future :)
@deutschWallah 4 days ago
He is by far the best teacher for neural networks, and AI/ML in general. I highly appreciate your effort, Andrej 🙂
@ArunKumar-iz8bi a year ago
Thanks a lot, Andrej, for making such good videos that explain the core concepts of neural nets. It would be really helpful if you could make a tutorial/video on the entire workflow and the structured thought process you would follow to train a neural network end to end (to arrive at the final model to be used for production). I mean, given a problem statement, how would you train a neural network to solve it, how do you design the experiments to choose the right set of hyperparameters, and so on. A hands-on tutorial video demonstrating this process would definitely help a lot of practitioners trying to use neural networks to solve interesting problems.
@mcnica89 a year ago
Just finished watching this (at 2x speed). I love how hands-on this is... every other tutorial I have seen always has a step where they say "it's roughly like this...", but this one really shows you what is actually needed to make it work. Looking forward to trying this on some fun problems!
@chrisw4562 3 months ago
Thank you so much Andrej for your generosity, spending your valuable time on these lectures. This is absolutely amazing.
@nicorauseo5478 3 months ago
Just finished all the lectures so far, from the makemore series to this one, and my knowledge grew drastically. I went from knowing the basics to being really comfortable looking under the hood. Definitely going to use this knowledge now to build useful projects. Thank you so much, and I'm excited to keep learning from you. 🔥🔥🔥
@mlock1000 4 months ago
I only just noticed that this is set up in a perfect 2 column layout so a person can have the script/notebook they are working on side by side with yours and not have to jump around at all. And it's clean and clutter free. That is some classy action, my deepest respect and gratitude.
@Milark a month ago
now that's a level of detail I hadn't noticed
@1gogo76 11 months ago
Andrej is pure genius wrapped in a humble person 🙌
@ComPuPur 13 days ago
Grateful for the times we are living in and the easy access to information that we can enjoy. Thanks for sharing your knowledge, much appreciated!
@senatorpoopypants7182 a year ago
By far the best video on deep learning application I've come across. For someone brand new in the space, I'm shocked at how much I'm following along with such advanced ideas. Thank you so much for putting this out there. It has been tremendously helpful.
@christianhetling3793 a year ago
Hey Andrej, I greatly appreciate you making these videos. Next semester I am taking the course Machine Learning for NLP. I think these kinds of implementation videos are incredible for learning a subject deeply.
@linkin543210 a year ago
Andrej is single-handedly putting the open in OpenAI
@NishankSingla a year ago
This is the best educational video on self-attention and transformers. Before this, I struggled to understand the paper's diagram, and now it's clear as water. Your style of teaching is just amazing. It shows your vast experience in this field and your passion for teaching. In fact, it was only your session on computer vision during the Bay Area Deep Learning School at Stanford in 2016 that I was able to understand during that 2-day workshop, and it motivated me to pursue AI and Deep Learning in my career. Thank you for leaving Tesla to pursue your teaching passion and making these videos.
@iansthings2521 a year ago
Brilliant. Great presentation, perfectly explained, with great supporting collateral. 2 hours that I recommend to everyone.
@carykh a year ago
Thanks for posting this lesson so freely on the internet, Andrej! Man, all this AI educational content on RU-vid recently makes me want to get back into doing AI experiments
@midnightwa4261 a year ago
Well, I have watched all your videos... I think it's time for more 😆
@JohnVanderbeck a year ago
ChatGPT feels like more than just a large language model to me. It seems to have, or at least projects, an understanding of concepts that I wouldn't expect a pure language model to have.
@abhisekpanigrahi-qx3dg a month ago
The explanation of such difficult concepts is so simple! You deserve a lot of attention to your channel.
@user-vb2zw3gg2x 8 months ago
Thanks for making such a logically smooth tutorial! It helps to see why we use such a structure. It's also cool that you explain almost everything that appears in the model, even though it might be considered classic in the field. Very nice job, bravo!
@kelele4266 a year ago
A follow-up video on the fine-tuning stage would be priceless indeed!! I've heard multiple NLP friends say that the key thing that enabled ChatGPT was the curated dataset internal to OpenAI. Super curious to hear what people think. I'd imagine it was the dataset + fine-tuning (much more so than pre-training, since it's a much smaller model vs. GPT-3, and most models use some kind of Transformer architecture). Thank you so much, Andrej!
@ankile a year ago
It would be incredibly cool to see a very simple implementation of the second fine-tuning phase! Good lessons in RL to be had for sure :)