
NLP: Understanding the N-gram language models 

Machine Learning TV
37K subscribers
117K views

Published: 22 Oct 2024

Comments: 56
@茱莉-x2o (1 year ago)
I took a course on this before but didn't understand it; this 10-minute tutorial is very intuitive, and everything comes together and starts making sense to me.
@franciscovinueza5320 (5 years ago)
I'm in love. The host is so gorgeous.
@atineshs (4 years ago)
07:13 Please, I want to get fixed
@dhnguyen68 (3 years ago)
@Hussain Mirahmadi A probability of zero doesn't mean it won't ever happen!
@allseasons765 (5 years ago)
To help train my inner accent classifier, please supervise my classification of your accent: are you originally from Bulgaria? If not, a nearby Balkan country, I'd say.
@Parteque (4 years ago)
Just Russia, man.
@anassa6737 (1 year ago)
I hope my university gets a teacher like this :(
@ravish4723 (4 years ago)
Nice explanation with a working example. But the part about normalization was not as clearly illustrated.
@prasanjitrath281 (3 years ago)
At 5:13, isn't the bigram language model example actually a unigram example? P(wi | wi-1) depends on just the one previous word. To double-check: since it depends only on the previous word, it's like a 1st-order Markov chain. So is it not actually a unigram?
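(A short sketch that may help untangle the terminology; the toy corpus below is invented for illustration and is not the example from the video. The "n" in n-gram counts all the words in the window, including the word being predicted, while the Markov order counts only the conditioning words. So P(wi | wi-1) looks at pairs of words: it is a bigram model and, at the same time, a 1st-order Markov chain. A unigram model would be plain P(wi), with no conditioning at all.)

from collections import Counter, defaultdict

# Toy corpus, invented for illustration only.
tokens = "<s> this is the house </s>".split()

# Unigram model: P(w) -- no conditioning on any previous word.
unigram_counts = Counter(tokens)
p_unigram = {w: c / len(tokens) for w, c in unigram_counts.items()}

# Bigram model: P(w_i | w_{i-1}) -- conditions on exactly one
# previous word, i.e. a 1st-order Markov chain over words.
bigram_counts = defaultdict(Counter)
for prev, cur in zip(tokens, tokens[1:]):
    bigram_counts[prev][cur] += 1

p_bigram = {
    prev: {w: c / sum(nxt.values()) for w, c in nxt.items()}
    for prev, nxt in bigram_counts.items()
}

print(p_unigram["the"])       # unigram: P(the) = 1/6 in this toy corpus
print(p_bigram["is"]["the"])  # bigram:  P(the | is) = 1.0 in this toy corpus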
@itsvollx9684 (3 years ago)
great concept for dyslexics
@mihaiandrei3144 (4 years ago)
Just subscribed to the channel because of the "fix you" spelling mistake :)) Funny. Emotions aside though, everyone is allowed to make mistakes. You are doing a great job anyway.
@danielbaena4691 (4 years ago)
Thank you so much for this video! It helped me a lot!
@darksideofthemoon3185 (5 years ago)
Thank you so much. This helped a lot!
@PrabhakarKrishnamurthyprof (5 years ago)
Thank you and it is highly helpful.
@DielsonSales (5 years ago)
Nice explanation, but there's an ad interrupting the reasoning every few minutes.
@RealMcDudu (4 years ago)
Adblock, my friend, will save your life.
@DielsonSales (4 years ago)
@@RealMcDudu I know, I simply choose to let the providers manage it. I still prefer RU-vid with ads to Google shutting it down because it wasn't lucrative.
@orangbahy (2 years ago)
Thanks for the explanation!
@Areeva2407 (4 years ago)
Very good. In the bigram model, what is the relation between n & K?
@sourojitsen5817 (5 years ago)
7:14-7:17 funny
@zingg7203 (4 years ago)
Too much Coldplay
@salilbagga (5 years ago)
This is amazing, but I found that this is part of some course you are taking. Can I know what course it is?
@MachineLearningTV (5 years ago)
Dear, as the name of our channel shows, we find good Machine Learning videos and share them with students. This video is from one of the courses on Coursera. If you go there, you may find the whole course. Thanks.
@muhammadmubashirullah7152 (4 years ago)
Language Processing. It's linked at the end, and the name of the university is also shown in the logo.
@deeppant2866 (2 years ago)
good video, very helpful. thx!
@mihirmehta2242 (3 years ago)
I guess there is a mistake at 6:04: the probability should be P(the|is) = 0.5 rather than 1.
@davidspang9808 (9 months ago)
Yes, you're partly right. The probabilities of the last two terms have been mixed up, which ultimately cancels out the mistake: p(the|is) = 1/2, while p(house|the) = 1.
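(A quick way to sanity-check numbers like these is to count and divide. The two toy sentences below are only a guess at the kind of corpus used in the video, chosen so the counts work out to p(the|is) = 1/2 and p(house|the) = 1; they are not the actual slide. The normalization is just dividing each bigram count by the count of the preceding word, so that the probabilities conditioned on that word sum to 1.)

from collections import Counter, defaultdict

# Toy sentences, invented so the counts give p(the|is) = 1/2
# and p(house|the) = 1; not the corpus from the video.
corpus = [
    "<s> this is the house </s>".split(),
    "<s> that is a cat </s>".split(),
]

bigram_counts = defaultdict(Counter)   # count(w_{i-1}, w_i)
context_counts = Counter()             # count(w_{i-1})

for sent in corpus:
    for prev, cur in zip(sent, sent[1:]):
        bigram_counts[prev][cur] += 1
        context_counts[prev] += 1

def p(word, given):
    # Maximum-likelihood estimate: count(given, word) / count(given)
    return bigram_counts[given][word] / context_counts[given]

print(p("the", "is"))     # 0.5
print(p("house", "the"))  # 1.0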
@nathanaelngami9147 (4 years ago)
falling in love with that woman
@veracru7471 (4 years ago)
x2
@dreamscapeai7 (4 years ago)
lol dude study for your exams
@SomethingSmellsMichy (3 years ago)
Simp
@manthanrai1096 (2 years ago)
Very well explained 👏
@daesoolee1083 (5 years ago)
Nice explanation! Easy to follow :)
@luislptigres (5 years ago)
What happened to andre? :(
@itechanalyticsolutions (4 years ago)
great work
@nehalbafna8915 (3 years ago)
it feels like you are reading from somewhere and not feeling the concept.
@thefantasticman (1 year ago)
Hard to focus on the PPT. Can anyone explain to me why?
@gabrielalexander1001 (4 years ago)
really helpful
@AhmedMostafa-gr9ff (4 years ago)
Are you from Russia???
@usurper1091 (10 months ago)
7:10
@richardn4483 (5 years ago)
Good
@sereneThePity (2 years ago)
first, i will show how to fix you
@zingg7203 (4 years ago)
Nice accent
@benjamindeppen2694 (5 years ago)
I’m so confused. This does not help
@tekki.dev. (5 years ago)
You need the basics first
@roymoran1151 (3 years ago)
Could you suggest some of the basics @Bilal Raja?
@fazalmuhammad5467 (3 years ago)
@@tekki.dev. Please do tell me about the basics....
@bhaveshsingh0124 (2 years ago)
How to fix you hahah
@unsaturated8482 (4 years ago)
You're so tense and tired; it's either the English or the dress. Great vid anyway.
@AnonymousAccount514 (3 years ago)
She’s hot
@racekille (5 years ago)
I love her accent, can u be my wife? :D
@darksideofthemoon3185 (5 years ago)
lol! hell no she can't!