Graham Neubig
Comments
@AbidineVall 5 days ago
Hello, when will the new course be available?
@KeChen-Peter 9 days ago
Thank you very much!!!!
@PreciousToyo-dw2nj 10 days ago
I'll give it a try after watching your videos. I even subscribed to your channel. I had my doubts about these AI agents, but I'll give OpenHands (formerly called OpenDevin) and maybe SWE-agent from Princeton a try.
@PreciousToyo-dw2nj 10 days ago
Thanks for this. I've been looking for a video on coding and solving GitHub issues.
@AdityaSharma-ld2ed 16 days ago
Hi Graham, when will the new video of the course come out?
@amitabhachakraborty497 19 days ago
The board is not visible.
@harshdeepsingh3872 26 days ago
Could you kindly upload the recitations as well? It would be greatly beneficial.
@xiebowen9796 26 days ago
Thanks a lot for making these great materials open to the public.
@harshdeepsingh3872 2 months ago
Thanks, Prof.
@atwolin2401 2 months ago
Thanks for sharing. It helps me a lot!
@mikadu1057 3 months ago
Hi Professor, thank you for the insightful lecture. I have one quick question. In the lecture, you mentioned that Llama 2 is safer than Mixture-of-Experts (MoE) models such as Mixtral. If the heavy context distillation or other tuning methods that Meta used were applied to an MoE architecture, would an MoE model such as Mixtral achieve comparable safety scores? Or are there fundamental differences between the architectures that make dense models inherently safer than MoE models?
@420_gunna 3 months ago
Soooo quiet
@420_gunna 3 months ago
I love the lectures by PhD students. This and the Prompting lecture (iirc) were fantastic
@420_gunna 3 months ago
The prompt tuning explanation at 58:50 isn't very clear, IMO.
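For anyone else who found that part unclear, here is a minimal sketch of the core idea of prompt tuning as I understand it (my own PyTorch-style illustration, not the lecture's code; all names and sizes are made-up assumptions): the pretrained LM stays frozen, and the only trainable parameters are a few continuous "soft prompt" vectors prepended to the input embeddings.

import torch
import torch.nn as nn

class SoftPrompt(nn.Module):
    # Prompt tuning: the LM itself stays frozen; the only trainable parameters
    # are n_prompt_tokens continuous embeddings prepended to the input.
    def __init__(self, n_prompt_tokens, d_model):
        super().__init__()
        self.prompt = nn.Parameter(torch.randn(n_prompt_tokens, d_model) * 0.02)

    def forward(self, token_embeddings):
        # token_embeddings: (batch, seq_len, d_model) from the frozen LM's embedding layer
        batch = token_embeddings.shape[0]
        prompt = self.prompt.unsqueeze(0).expand(batch, -1, -1)
        return torch.cat([prompt, token_embeddings], dim=1)

# Toy usage: 20 trainable prompt vectors, frozen-model embeddings of width 512.
soft_prompt = SoftPrompt(n_prompt_tokens=20, d_model=512)
x = torch.randn(2, 7, 512)
print(soft_prompt(x).shape)  # torch.Size([2, 27, 512])

Only soft_prompt.prompt receives gradient updates during fine-tuning; the pretrained model downstream is left untouched.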
@420_gunna 3 months ago
This was an awesome lecture, thanks! Certainly the best in the course so far.
@benjaminevanoff1856 3 months ago
Thank you for sharing this knowledge publicly :)
@shumiking 4 months ago
Hello~ How do I install OpenDevin? (Which one: MacBook or Windows?) I can't get it installed on either Windows or my MacBook. Please explain the installation method in detail.
@Galleons358 4 months ago
A very comprehensive overview of RAG-related technologies, thanks!
@yiwang1982 4 months ago
Which LLM and which code agent are you using?
@argh44z 4 months ago
Thanks for the very useful vids!! I'm wondering where lecture 19 is.
@whoami6821 4 months ago
Same here😂
@argh44z 4 months ago
Lol. "OpenAI has been doing sketchy things for a long time, and look where they are." Yeah, so true. Thanks for the vids; they are amazing content!
@mincoffee1995 4 months ago
Thanks, Professor. Tuition is too expensive, lol.
@faisalIqbal_AI 4 months ago
Thanks
@faisalIqbal_AI 4 months ago
Thanks
@anasnouri8944 4 months ago
This is a great channel for learning about NLP and all the concepts related to NLP, like Transformers, etc.
@jaeboumkim1213 4 months ago
Thank you for introducing interesting papers!
@Galleons358 5 months ago
Thanks, Professor! This is the most up-to-date and complete series of courses on large models available anywhere on YouTube right now 😃😃
@yimingcui946 5 months ago
Thank you so much, Professor! Could you also release the rest of the videos, including the 16th one? Thanks in advance!
@Galleons358 5 months ago
great lecture!
@Galleons358 5 months ago
I've really enjoyed the 14 episodes of this series so far - each one has been incredibly insightful and helpful. I noticed that there haven't been any new uploads recently, and I'm eagerly looking forward to the next episodes. Can you please let us know if you plan to continue the series? Many of us are excited to learn more and can't wait to see what's next. Thanks for all your hard work!
@Galleons358 5 months ago
Thanks for sharing!
@mahmoud.hegab0 5 months ago
There is no sound?
@BrunsterCoelho 5 months ago
It's hard to see the board. What is the commonly used version of bilinear attention mentioned at 54:25?
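For reference (this may or may not be the exact variant meant at 54:25), the most common bilinear, a.k.a. multiplicative or "general", attention score between a query q and a key k uses a single learned matrix W:

a(\mathbf{q}, \mathbf{k}) = \mathbf{q}^\top W \mathbf{k}

Plain dot-product attention is the special case W = I, and the scaled dot-product form used in Transformers corresponds to W = I / \sqrt{d_k}.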
@ziqiyang2558 6 months ago
Hi Professor Neubig, I really like your Advanced NLP series. Would you upload the latest lecture videos? Thank you.
@pritioli8429 6 months ago
Thank you for sharing!
@samson6707 6 months ago
There's a mistake on the slide at 3:58 in the interpolation smoothing: the index should be i-n+2, not 1-n+2.
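For readers following along, the corrected formula would then read as follows (a sketch assuming the standard two-level linear interpolation, with \lambda the interpolation weight and P_{ML} the maximum-likelihood n-gram estimate; the notation is mine, not copied from the slide):

P(x_i \mid x_{i-n+1}^{i-1}) = \lambda \, P_{\mathrm{ML}}(x_i \mid x_{i-n+1}^{i-1}) + (1 - \lambda) \, P(x_i \mid x_{i-n+2}^{i-1})

That is, the lower-order fallback drops the earliest context word, so its context starts at x_{i-n+2}.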
@SuryaVadivazhagu 6 months ago
Thank you for the fantastic video! Great slides and an up-to-date explanation.
@mohamadjavadmomeninezhad 6 months ago
Waiting for the next sessions...
@ArwaOmayrah 6 months ago
Hi, just a quick question. I noticed that the new playlists, e.g. 2024, don't cover older approaches in NLP. For someone who doesn't have any background in NLP, would you advise beginning with the 2017 playlist, or should I start with the 2024 playlist?
@sudhanvasavyasachi2525 1 month ago
Did you get an answer for this one? If yes, please do share it, as I'm in the same dilemma.
@milin_234 7 months ago
Nice lecture, Prof. Thanks for making it available for us.
@kiriTo77419 7 months ago
It's great content. Will you continue uploading subsequent lectures? Thanks!
@gappogappo 7 months ago
Thank you for the interesting lecture. 16:15 In Attention Is All You Need, I was under the impression that the number of vectors in K and Q had to be the same in order to do the dot product. Or is it permissible for K and Q to be different because we are talking about decoders?
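On the dot-product question: only the per-vector dimension d_k of Q and K has to match; the number of query vectors and the number of key vectors can differ, which is exactly what happens in encoder-decoder cross-attention. A minimal NumPy sketch (shapes and names here are illustrative assumptions, not taken from the lecture):

import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Q: (n_q, d_k), K: (n_k, d_k), V: (n_k, d_v); n_q and n_k may differ,
    # only the inner dimension d_k must match for Q @ K.T to be defined.
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                 # (n_q, n_k)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over the keys
    return weights @ V                              # (n_q, d_v)

# 3 queries attending over 5 keys/values, as in cross-attention.
rng = np.random.default_rng(0)
Q, K, V = rng.normal(size=(3, 8)), rng.normal(size=(5, 8)), rng.normal(size=(5, 16))
print(scaled_dot_product_attention(Q, K, V).shape)  # (3, 16)

Each of the 3 queries gets its own softmax over the 5 keys, so the output has one vector per query.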
@عبدالرحمنابراهيم-د2ن 7 months ago
Thank you for giving us these lectures; you help me a lot.
@AzOzYnEt 7 months ago
Can we have the link for the Facebook OPT lessons? (Here: 1:02:15)
@ilearnthings123 6 months ago
github.com/facebookresearch/metaseq/blob/main/projects/OPT/chronicles/OPT175B_Logbook.pdf
@ujan754 7 months ago
Thanks a lot for sharing these! Will you consider making the camera screen a bit larger? It's really hard to see what's being written on the blackboard.
@fxu4166 7 months ago
Thank you. It's really useful
@nottoday2131 7 months ago
What a legend
@willcheng8257 8 months ago
Thank you so much
@DsnnaveenKumar 8 months ago
Are these assessments free to take, and do we get a certificate?
@zTech300 8 months ago
A lot of helpful information here. Thanks for sharing, Graham; I really couldn't resist watching the whole video. If you could consider making another lecture about jailbreaking/injections, it would be nice.