Soheil Feizi
Comments
@ruijiaxu4035 9 days ago
I really enjoy your teaching.
@ruijiaxu4035 9 days ago
Thanks a lot for sharing. ❤
@AishKhan-le7xq 1 month ago
What is the true distribution?
@AishKhan-le7xq 1 month ago
Thank you, Sir.
@freerockneverdrop1236 2 months ago
The formula for the neural network in this video should be a two-level summation instead of a single-level one.
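For readers skimming the thread, the commenter's point presumably concerns a one-hidden-layer network, which is written as a two-level (nested) summation; a sketch in assumed notation, not the video's own:

```latex
f(x) \;=\; \sum_{k=1}^{m} a_k \,\sigma\!\Bigl(\sum_{j=1}^{d} w_{kj}\, x_j + b_k\Bigr)
```

The inner sum runs over the input coordinates and the outer sum over the hidden units, so both levels are needed to express the hidden layer followed by the output layer.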
@KittyCat-lp3zy 2 months ago
Long live, treasure of Azerbaijan ❤
@MonkkSoori 4 months ago
At 20:20, why does Phi(Q_i) not cancel out in the numerator and denominator?
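For context, the question presumably refers to linear (kernelized) attention, where Phi(Q_i) appears in both the numerator and the denominator; a sketch of the standard form, with symbols assumed rather than taken from the video:

```latex
\mathrm{out}_i \;=\; \frac{\Phi(Q_i)^\top \sum_{j} \Phi(K_j)\, V_j^\top}{\Phi(Q_i)^\top \sum_{j} \Phi(K_j)}
```

Phi(Q_i) cannot be cancelled because it sits inside two different inner products: the numerator dots it with a matrix-valued sum and the denominator with a vector-valued sum, so it is not a common scalar factor of both.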
@janesun9008 4 months ago
Thank you for sharing this lecture, Prof. Great quality and easy to understand!
@NavaAbdolalipour 4 months ago
I was in Ghalamchi classes with you in Ardabil; after all these years I came across your name while in nearby countries. I'm glad to see the successes of my old classmates.
@simaranjbari 5 months ago
Your explanation was very nice and easy to understand. Thank you!
@fierydino9402 5 months ago
Wonderful lecture!! Thank you for sharing
@PradeepKumar-zy6cd 6 months ago
Can you please share the slides?
@prabhavkaula9697 6 months ago
Thank you for the lecture! ☺️
@ax5344 6 months ago
@1:58:57 You said you would explain the different procedures for generating different responses later, but I couldn't find it before you started discussing Step 3. Could you elaborate?
@ax5344 6 months ago
@2:15:30 Found it. Thanks!
@ax5344 6 months ago
When you upload the video, could you set the default speed to 1.5x? Right now I'm playing it at 2x, and it is still very, very slow.
@sabujchattopadhyay 6 months ago
Can you please share the slides?
@miquelnogueralonso2576 6 months ago
Can you please share the slides?
@bardiasafaei457 6 months ago
Thank you, Soheil, for the great content and the clear explanations! Could you also share the final written notes of each session for download?
@ai__76 6 months ago
Massive lesson! Thanks!
@amiltonwong 6 months ago
Thanks a lot for providing such an excellent lecture. Would it be possible to release the notes for study? Thanks~
@parhamsalar3826 6 months ago
Many thanks for your excellent lectures, particularly those on diffusion models. I have a few questions about conditional diffusion models. In cross-attention, can we treat the text vectors as the query (Q) and the image vectors as the key (K) and value (V), instead of using the image vectors as the query (Q)?
@parisaemkani5730 7 months ago
Hi, could you please recommend a good course on the basics of machine learning and deep learning for beginners?
@Stealph_Delta_3003 7 months ago
Thanks for sharing.
@sdiabr6792 7 months ago
Real quality content
@mozhganmomtaz8169 7 months ago
I just want to thank you 🤗
@INSTIG8R 7 months ago
This is the best video on Swin Transformers.
@MrNoipe 7 months ago
The handwriting is difficult to read; maybe write more slowly or with a different brush?
@naeemkhoshnevis 7 months ago
Thanks for uploading these lectures.
@miladkhademinori2709 7 months ago
@naeemkhoshnevis 7 months ago
Thanks for uploading the lectures.
@Nerraruzi 7 months ago
Thanks so much for sharing this updated version of the course!!
@shayanmohammadizadeh172 7 months ago
It's minute 30 of the video and I have already watched 8+ ads. Attention really is all we need!
@Umar-Ateeq 6 months ago
You can use the "adblock for youtube" extension to avoid ads.
@junqi7050 7 months ago
Thanks, Soheil, for sharing the updated deep learning theory course. I followed Soheil's earlier lectures in 2020, where I learned the theory of deep learning in terms of representation, generalization, and optimization. I noticed that this year's course schedule has shifted substantially toward state-of-the-art Transformer-based technologies, such as large language models. I plan to catch up with the updated deep learning foundations course this year, and I really appreciate the new lecture videos.
@mohammadshahbazhussain2029 7 months ago
Thank you for sharing it
@BowenXie-b7b 7 months ago
Hi Professor, I was also wondering whether you plan to add some content related to distanglement learning, like nonlinear ICA, which I think is theoretically very interesting and important.
@BowenXie-b7b 7 months ago
Sorry, there's a typo. It should be 'disentanglement'.
@BowenXie-b7b 7 months ago
Thanks for updating this really amazing course. I've read this semester's syllabus and find it really interesting, especially the parts on generative models and multi-modal models. I hope to see more of the latest course videos. Thanks a lot for your effort in sharing the contents of this amazing course.
@hesamce 7 months ago
Thank you for sharing the updated version of the course🙏
@AyushSharma-ie7tj 1 year ago
Really nice lecture with a very even pace. Thank you for sharing.
@StratosFair 1 year ago
Great lecture. Thank you for sharing
@hamedgholami261 1 year ago
explanation of: "Loss landscapes and optimization in over-parameterized non-linear systems and neural networks"
@mojtabakolahdoozi2418 1 year ago
Great lecture on largely overlooked ground! Thanks.
@quanguyenang1615 1 year ago
Thanks for the great lectures, Prof. Soheil.
@sylus121 1 year ago
25:00 (Bookmark)
@Thaumast 1 year ago
24:18 The loss function is sometimes denoted by an L and sometimes by a calligraphic L; are they the same? Thank you very much!
@bryanbocao4906 1 year ago
42:30 one option could be KL divergence loss?
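As background for the KL suggestion, a minimal sketch of the discrete KL divergence; the function name and the example distributions are hypothetical, not from the lecture:

```python
import math

def kl_divergence(p, q):
    # D_KL(P || Q) = sum_i p_i * log(p_i / q_i)
    # Assumes q_i > 0 wherever p_i > 0; terms with p_i = 0 contribute nothing.
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Identical distributions give zero divergence.
print(kl_divergence([0.5, 0.5], [0.5, 0.5]))  # 0.0
# Mismatched distributions give a positive divergence.
print(kl_divergence([0.5, 0.5], [0.9, 0.1]))
```

KL(P‖Q) is zero exactly when the two distributions match and positive otherwise, which is what makes it usable as a training loss between a model distribution and a target.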
@mskang009 1 year ago
This is the best lecture on self-supervised learning I've seen on YouTube. So many thanks!
@sinaasadiyan 1 year ago
Great explanation, just subscribed!
@sumitsah6092 1 year ago
How can we guarantee that w_t lies within the ball? If that is not the case, we can't apply the PL inequality. Please comment.
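For readers following this question, the local PL (Polyak–Łojasiewicz) condition presumably under discussion is only assumed to hold on a ball around the initialization, which is why the iterates must stay inside it; a sketch in assumed notation:

```latex
\frac{1}{2}\,\bigl\|\nabla L(w)\bigr\\|^{2} \;\ge\; \mu\,\bigl(L(w) - L^{\ast}\bigr)
\qquad \text{for all } w \in B(w_0, R)
```

In over-parameterized analyses this is typically handled by bounding the total length of the gradient-descent path in terms of the initial loss, which keeps every iterate w_t inside B(w_0, R).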
@zonghua 1 year ago
The handwriting is unclear.
@박재성학생물리·천문 2 years ago
Amazing video. Thanks!