
The Math Behind Bayesian Classifiers Clearly Explained! 

Normalized Nerd
100K subscribers
88K views

Published: Sep 5, 2024

Comments: 87
@hayleyH997 4 months ago
How did he manage to explain something that a 1-hr lecture couldn't! Thanks mate
@BrianAmedee 4 years ago
'Clearly Explained' - and it actually was. Thanks man
@NormalizedNerd 4 years ago
:D :D
@pradyumnabada5118 1 year ago
Dude.. I lost count of the videos I watched to understand this, but finally, after seeing your video, the struggle ended. Thank you so much!
@bluestar2253 3 years ago
One of the best explanations I've ever seen!
@NormalizedNerd 3 years ago
Thanks mate! Keep supporting...
@jaster_mereel7657 3 years ago
This was a very clear explanation indeed. Thank you!
@NormalizedNerd 3 years ago
You're very welcome!
@sye9522 4 months ago
HUGE thanks for perfectly delivering the whole concept in one video bro!!
@lakshuperiakaruppan6777 28 days ago
Good work with the visuals!!
@RayRay-yt5pe 18 days ago
You did good my friend. I'm glad I came across this video
@radoyapanic998 2 years ago
In the last part of the video you said we can fit a known distribution to a continuous set of data. However, you then wrote that the probabilities can be calculated by taking the product of the pdf evaluated at different values of the feature and label. The pdf does not provide probabilities, though; it has to be integrated to give the probability of an event. This part of the video seems imprecise. The video in general was great, however. Thanks.
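A quick sketch of why plugging pdf values into the classifier still works, even though a density is not a probability: the per-class scores below are only ever compared against each other, so their absolute scale never matters (the Gaussian parameters and prior here are made up for illustration).

```python
import math

def gauss_pdf(x, mu, sigma):
    # Gaussian density: a density value, not a probability (it can exceed 1)
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

params = {0: (0.0, 1.0), 1: (3.0, 1.0)}   # hypothetical class -> (mean, std)
prior = {0: 0.5, 1: 0.5}                  # hypothetical class priors

x = 2.0
scores = {c: prior[c] * gauss_pdf(x, mu, s) for c, (mu, s) in params.items()}
prediction = max(scores, key=scores.get)  # argmax is valid even though scores aren't probabilities
```

The scores are proportional to the true posteriors (the shared normalizing constant is dropped), which is why the argmax is unaffected.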
@noname-anonymous-v7c 7 months ago
9:37 you made the conclusion based on P(X=[0,2] | Y); I think the correct way is to calculate P(Y | X=[0,2]). If P(Y=1) is very small, the answer can be Y=0.
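This point can be checked numerically. With made-up likelihoods and a heavily skewed prior, the class that maximizes P(X|Y) differs from the one that maximizes P(Y|X):

```python
# Hypothetical numbers: likelihoods favour Y=1, but a tiny prior P(Y=1) flips the posterior
likelihood = {0: 0.10, 1: 0.30}   # assumed P(X=[0,2] | Y=y)
prior      = {0: 0.99, 1: 0.01}   # assumed P(Y=y)

unnorm = {y: prior[y] * likelihood[y] for y in (0, 1)}
evidence = sum(unnorm.values())
posterior = {y: unnorm[y] / evidence for y in (0, 1)}  # Bayes' rule

best_by_likelihood = max(likelihood, key=likelihood.get)
best_by_posterior  = max(posterior,  key=posterior.get)
```

With equal priors the two criteria agree, which is presumably what the video implicitly assumes.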
@miusukamadoto6805 2 years ago
Thank you very much for the video. Clearly explained indeed; the only part I couldn't get completely was the discretization.
@jefersondavidgalloaristiza3410 9 months ago
Very nice explanation and perfect illustrations!!
@guangruli4486 3 years ago
Very clearly explained, thank you!
@EduAidClassroom 2 years ago
LOVED IT!!! Awesome Explanation! Can't thank you enough...
@sopegue 2 years ago
It was clearly explained, as mentioned in the title. Thanks a bunch !!!
@sayonsom 1 year ago
Great explanation :)
@dannysammy8972 2 years ago
Yes, this was actually well explained. Thank you :)
@PritishMishra 3 years ago
If I search for any ML algorithm, I first check your channel to see if you have made a video on it... You are my first preference for ML/DL algorithm explanations. Just a request: please make videos on deep learning algorithms too, like CNN, RNN & LSTM "from scratch". It will really help people who, like me, want to become practitioners in AI.
@NormalizedNerd 3 years ago
Thank you so much ❤ Writing CNNs and RNNs from scratch is pretty hectic... maybe some day I'll try.
@PritishMishra 3 years ago
@@NormalizedNerd Waiting... you are our only hope who can teach us the mathematics of ML with cool animations. That's why I requested it! Thanks.
@daniilsukhovv 3 years ago
Bro, best explanation I could find
@NormalizedNerd 3 years ago
Thanks bro :)
@arielalvarez88 3 years ago
Really good work, congrats
@NormalizedNerd 3 years ago
Thanks man!
@imadeit6587 3 years ago
I appreciate your work
@NormalizedNerd 3 years ago
Thanks a lot!
@parisaghanad8042 2 years ago
That was great! I'm really glad that I found your channel. Thanks a lot 👍👍
@DANstudiosable 4 years ago
Well explained, a quick revision for Naive Bayes. I forgot why it was called Naive until I watched this video 😂😂
@NormalizedNerd 4 years ago
Thanks! Haha.
@aurorasart9458 3 years ago
Thank you very much for your work! Nice explanation!
@NormalizedNerd 3 years ago
You are welcome!
@high_fly_bird 1 year ago
The explanation is so cool! But it would be even cooler if you added some examples with continuous features and fitting a distribution; this part wasn't so clear...
@sobana653 1 year ago
Nicely explained!
@mehditavakoli2492 1 year ago
Thank you!
@muhammadzubairbaloch3224 4 years ago
Sir, please post more lectures. I have been following your lectures; please make some advanced NLP and CV lectures, or AI lectures. Thanks
@NormalizedNerd 4 years ago
I will try my best to upload more frequently.
@swethanandyala 2 years ago
Very nice explanation, thank you so much
@atulyadav9712 2 years ago
Great explanation
@dpaul3447 1 year ago
Thank you so much man!!
@aymericalixe1310 3 years ago
Maybe I'm wrong, but I think the hypothesis is not that X1 and X2 are independent but that X1 and X2 are conditionally independent. It was very clear otherwise, thank you!
@NormalizedNerd 3 years ago
In naive Bayes every feature is treated as an independent feature; that's why it's called naive.
@chitranghosal879 1 year ago
I think the hypothesis is that you assume each feature to be (w.r.t. the other features) 1) globally independent (in the global sample space), and 2) conditionally independent w.r.t. the occurrence of each class label (in the subset sample space where the particular class event has occurred). If these assumptions are not met, then it does not seem possible to build the mathematics, because as far as I can see, if events A and B are independent, that does not naturally imply conditional independence between events (A|C) and (B|C).
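A tiny worked example of that last sentence: take A and B as independent fair bits and C = A XOR B. Then A and B are marginally independent, yet become perfectly dependent once C is given (exact fractions, so the equality checks are exact).

```python
from itertools import product
from fractions import Fraction

# A and B are independent fair coin flips; C = A XOR B.
half = Fraction(1, 2)
joint = {}
for a, b in product((0, 1), repeat=2):
    joint[(a, b, a ^ b)] = half * half  # each (a, b) outcome has probability 1/4

def p(pred):
    # total probability of all outcomes satisfying a predicate on (a, b, c)
    return sum(v for k, v in joint.items() if pred(*k))

# Marginal independence: P(A=1, B=1) == P(A=1) * P(B=1)
marg_indep = p(lambda a, b, c: a == 1 and b == 1) == \
             p(lambda a, b, c: a == 1) * p(lambda a, b, c: b == 1)

# Conditional on C=0: P(A=1, B=1 | C=0) vs P(A=1 | C=0) * P(B=1 | C=0)
pc0 = p(lambda a, b, c: c == 0)
lhs = p(lambda a, b, c: a == 1 and b == 1 and c == 0) / pc0
rhs = (p(lambda a, b, c: a == 1 and c == 0) / pc0) * \
      (p(lambda a, b, c: b == 1 and c == 0) / pc0)
cond_indep = (lhs == rhs)
```

So marginal independence holds while conditional independence fails, confirming that the two assumptions really are separate.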
@lucasqwert1 1 year ago
In the last part, at minute 11: what is the function f used to fit a known distribution? Thank you for answering!
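The video doesn't pin down f, but a common choice (and the one Gaussian naive Bayes uses) is a Gaussian fitted by maximum likelihood, i.e. using the sample mean and variance of that feature within each class. The feature values below are made up:

```python
import math

# Hypothetical feature values observed for one class
data = [1.2, 0.8, 1.0, 1.4, 0.6]

mu = sum(data) / len(data)                           # MLE mean
var = sum((d - mu) ** 2 for d in data) / len(data)   # MLE (biased) variance
sigma = math.sqrt(var)

def f(x):
    # the fitted "known distribution": a Gaussian pdf with the estimated parameters
    return math.exp(-((x - mu) ** 2) / (2 * var)) / (sigma * math.sqrt(2 * math.pi))
```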
@leolei9352 2 years ago
Very clear explanation!
@user-wr4yl7tx3w 1 year ago
This is really well explained.
@hasben0 1 year ago
Well done👊👊
@sayantansadhu6380 4 years ago
It was like a revision for class 12 probability 😁😁
@NormalizedNerd 4 years ago
Yeah, simple yet effective concept.
@nikolai228 6 months ago
Amazing video. Thanks.
@user-or7ji5hv8y 3 years ago
Great explanation.
@NormalizedNerd 3 years ago
Glad it was helpful!
@AnasHawasli 6 months ago
Great video man, here is a sub
@SarahGhiyasi 1 year ago
Thank u, it was great.
@fmt2586 2 years ago
Hey, thanks man, very clear explanation.😀😀
@MrDaniel560 1 year ago
HELPFUL!!!!
@prar_shah 1 month ago
Love this
@joaomatheusnascimentogonca7633 2 months ago
10:51 How does this work? Wouldn't the probability that Xi = xi be zero, given we're using a continuous distribution? Because of the "=" sign
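Right, P(Xi = xi) is exactly zero for a continuous variable. What gets used instead is the density, which can be read as the limit of P(xi ≤ Xi ≤ xi+ε)/ε; since the same ε multiplies every class's score, it cancels when classes are compared. A sketch with made-up Gaussian parameters:

```python
import math

def gauss_pdf(x, mu, sigma):
    # Gaussian density at x
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

x, eps = 1.0, 1e-6

# Interval probability P(x <= X <= x+eps) ~= pdf(x) * eps for small eps
p0 = gauss_pdf(x, 0.0, 1.0) * eps   # class 0, hypothetical parameters
p1 = gauss_pdf(x, 3.0, 1.0) * eps   # class 1, hypothetical parameters

# eps cancels in the comparison, so densities alone decide the argmax
same_winner = (p0 > p1) == (gauss_pdf(x, 0.0, 1.0) > gauss_pdf(x, 3.0, 1.0))
```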
@dzmitryk9658 2 years ago
Awesome! Thank you.
@adityaprasad3356 1 year ago
Very helpful🥺🥺
@telusukondifirstuu9221 2 years ago
I love this explanation 😍🥰😘 Thanks a lot ❤
@signature445 3 years ago
Sir, is it that the Bayesian classifier deals with conditional probability, while the naïve Bayes classifier deals with joint probability? Thanks in advance.....
@NormalizedNerd 3 years ago
Yeah!
@sumedha1051 1 year ago
Love this!
@nickgannon7466 2 years ago
Well done
@quanghuynh1570 1 year ago
You saved me
@aditya.singh9 3 years ago
Truly amazing
@NormalizedNerd 3 years ago
Thanks!
@Fuktron13 3 years ago
I wish you were my professor
@plumSlayer 1 year ago
You areee amazing. I love your Indian Bengali accent (just a guess hehe, make me a voice analyzer if I am right XD)
@mahirjain8898 9 months ago
So goood
@zouhir2010 3 years ago
Thumbs up, thanks
@kunalsoni7681 1 year ago
Nice ⭐⭐⭐⭐⭐
@Ilham-lj3me 1 year ago
And how about Gaussian NB?
@xritzx 4 years ago
3b1b's bro is here
@NormalizedNerd 4 years ago
Haha :3
@harshitdtu7479 4 months ago
10:37
@pushandeb187 11 months ago
Liked that
@abdulkarim.jamal.kanaan 3 years ago
Hello people from the future! :D
@davidmurphy563 10 months ago
Ok, I've given up on the video after 45 secs. You said "stated clearly"; if you hadn't, I'd have kept watching.

You point to an array of features called X. What are they? Are they features of the array itself (its size / rank / dimension?), are they features of the thing the array is describing (measurements in a house?), or a list of possible attributes (the ingredients on a pizza?) Then you introduce a label. So what, is this like a Python dictionary?

Plus, I've no idea what sort of issue we're supposed to be tackling. Is it probability? Is it rationality with limited knowledge? I only guess that because I've heard of Bayes before. Instead you launch into calculations when I have not the first idea what you're calculating. Why would I listen to that?

Tell you what, I'll give it another 30 secs. If there's no illustrative example / clear explanation of what the hell you're covering, I'm gone.
@davidmurphy563 10 months ago
Nope, 30 secs later and it's absolute horseshit.
@anon_148 2 years ago
independant moment
@mahedihassanrafin7493 11 months ago
Just quit confusing people
@thebjjtroll6778 2 years ago
Amazing teaching skills
@vojinivkovic9533 2 years ago
Great explanation
great explanation