
Shannon Entropy and Information Gain 

Serrano.Academy
154K subscribers
204K views · Published: Aug 23, 2024

Comments: 330

@sphengle 5 years ago
This was exactly the baby step I needed to get me on my way with entropy. Far too many people try to explain it by going straight to the equation. There's no intuition in that. Brilliant explanation. I finally understand it.

@jankinsics 4 years ago
Sean Walsh feels the same way.

@freemanguess8634 6 years ago
With great knowledge comes low entropy

@SerranoAcademy 6 years ago
Hahaaa, love it!!!

@fantomraja9137 4 years ago
lol

@hyperduality2838 4 years ago
@@SerranoAcademy Repetition (redundancy) is dual to variation -- music. Certainty is dual to uncertainty -- the Heisenberg certainty/uncertainty principle. Syntropy (prediction) is dual to increasing entropy -- the 4th law of thermodynamics. Randomness (entropy) is dual to order (predictability) -- "Always two there are" -- Yoda.

@MrofficialC 2 years ago
And low entropy is easier to rig

@lani0 2 years ago
You win

@user-or7ji5hv8y 3 years ago
How does one make something so complicated into something so intuitive that others can finally see the picture? Your explanation itself is an amazing feat.

@carnivalwrestler 5 years ago
Luis, you are such an incredibly gifted teacher and so meticulous in your explanations. Thank you for your hard work.

@AlexMcClung97 6 years ago
Excellent explanation, very clear and concise! I have always pondered the significance of the log in the cross-entropy loss function. The explanation (particularly: "products are small and volatile, sums are good") completely clears this up.

@effemmkay 4 years ago
I have been scared of delving into entropy in detail for so long because the first time I studied it, it wasn't a good experience. All I want to say is THANK YOU!!!!!! I should have been supplementing the Udacity ND lesson videos with these since the beginning.

@drakZes 5 years ago
Great work. Compared to my textbook, you explained it 100 times better. Thank you.

@123liveo 5 years ago
Second time I found this video, and I loved it both times. Much better description than the prof at the uni I am at!!!

@poxyu_was_here 6 years ago
Easy and great explanation! Thank you very much, Luis.

@Asli_Dexter 6 years ago
I wish I had this lecture during my college examinations... still, it's nice to finally understand the intuition behind the formulas I already knew.

@pixboi 5 years ago
Teaching should be like this, from practice to theory - not the other way around!

@dyutinrobin 4 months ago
Thank you so much. This was the only video on YouTube that clarified all my doubts regarding the topic of entropy.

@ketlebelninja 5 years ago
This was one of the best explanations on entropy. Thanks

@eprabhat 6 years ago
Luis, you have a great way of explaining. At times, I like your videos more than even those of some highly rated professors.

@elmoreglidingclub3030 3 years ago
Excellent! Great explanation. Enjoyable video (except YT's endless, annoying ads). Thank you for composing and posting.

@NoOne-uz4vs 4 years ago
I'm studying decision trees (a machine learning algorithm), which use entropy to efficiently build the tree. I finally understand the details. Thank you!!

@msctube45 4 years ago
I needed this video to get me up to speed on entropy. Great job Luis!

@mau_lopez 6 years ago
What a great explanation! I wish I had a teacher like you Luis, everything would be way easier! Thanks a lot

@patricklemaire225 6 years ago
Great video! Now I understand what Claude Shannon discovered and how useful and essential maths are in computer science.

@therealsachin 6 years ago
The best explanation of Shannon entropy that I have ever heard. Thanks!

@sdsa007 1 year ago
Wow! Awesome. So many books, encyclopedias, and biographies of Shannon just to understand what you clearly explained! Thank You!

@sasthra3159 2 years ago
Great clarity. I had never gotten this idea of Shannon entropy before. Thank you. Great work!

@RyanJensenEE 1 year ago
Good video! Minor correction of calculations: at 5:50, the probability of getting the same configuration is 0.25. This is because there are only 4 possible configurations of the balls (there is only one blue ball and only four slots, so only 4 places the blue ball can be). This can also be calculated by selecting the red balls first and multiplying: 0.75 * 0.66667 * 0.5 = 0.25. Similarly, at 6:58, the probability is 1/6 because there are 6 possible configurations. We can calculate the probability by multiplying (2/4) * (1/3) = 2/12 = 1/6 ≈ 0.166667.
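
A quick numeric check of this correction, as a minimal Python sketch (the ball counts are taken from the comment above, not from the video itself):

```python
from math import comb

# 3 red + 1 blue in 4 slots: C(4,1) = 4 equally likely arrangements,
# so one specific arrangement has probability 1/4.
print(1 / comb(4, 1))        # 0.25

# 2 red + 2 blue in 4 slots: C(4,2) = 6 arrangements, probability 1/6.
print(1 / comb(4, 2))        # 0.1666...

# Same results via sequential draws without replacement:
print(0.75 * (2 / 3) * 0.5)  # 0.25 (red, red, red; the blue is forced)
print((2 / 4) * (1 / 3))     # 0.1666... (red, red; the two blues are forced)
```
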
@Bvic3 5 years ago
At 13:44 it's not 0.000488 but 0.00006103515! There is a computation error. The entropy is correct: 1.75.

@SerranoAcademy 5 years ago
Thank you for the correction! Yes, you're right.
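
For readers who want to check the numbers: 2^-14 = 0.00006103515625, while 0.000488 ≈ 2^-11. A minimal Python sketch, assuming letter probabilities of (1/2, 1/4, 1/8, 1/8) over an 8-letter sequence (this distribution is my reconstruction, chosen because it yields the entropy of 1.75 stated above):

```python
from math import log2

p = [1/2, 1/4, 1/8, 1/8]          # assumed distribution with H = 1.75 bits
H = -sum(q * log2(q) for q in p)
print(H)                          # 1.75

# A typical 8-letter sequence carries 8 * 1.75 = 14 bits,
# so its probability is 2^-14, matching the corrected value.
print(2 ** (-8 * H))              # 6.103515625e-05
```
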
@SenhorMsandiFelipe 2 years ago
Thanks. Very clear, sir. I have been struggling to wrap my head around this and you just made it easy. Thank you.

@jackallread 7 months ago
Thanks for the relationship between knowledge and entropy, that was very helpful. Your explanation of statistics is also good! Though I am only halfway through the video at this point, I will finish it! Thanks

@MatheusSilva-dragon 5 years ago
Wow, thank you, man. I needed that information! There are many ways to teach the same stuff! That number-of-questions idea is great! It's good to have more than one way to measure something!

@generationgap416 3 years ago
Confession: I was a math kiddy; I knew how to use it, but I often missed the deeper meaning and intuition. Your videos are turning me into a math hacker.

@Vuvuzella16 4 years ago
This video is helping to keep me floating in my data science course; thank you so much for your time!

@logosfabula 6 years ago
Luis, you really are a great communicator. Looking forward to your other explanations.

@shekelboi 5 years ago
Thanks a lot Luis, I just had an exam about this on Wednesday and your video helped me a lot to understand the whole concept.

@mulangonando2942 1 year ago
I love the explanation of the negative sign in the entropy equation that many people wonder about.

@dianafarhat9479 5 months ago
Can you make a part 2 with the full proof, not just the intuition behind the formula? Your explanation's amazing and I would love to see a part 2.

@mehmetzekeriyayangn3782 5 years ago
You are the best. Such a great explanation. Better than lots of textbooks.

@pkittali 6 years ago
Lovely explanation... Superb

@karinasakurai9867 4 years ago
Brilliant lecture! I learned so much from this explanation. Thanks from Brazil :)

@Skandar0007 5 years ago
That moment when you realize you don't need to search for another video because you got it the first time. What I'm trying to say is: Thank You!

@paulstevenconyngham7880 6 years ago
This is a really great explanation, thanks so much for sharing mate!

@subhashkonda5000 6 years ago
It's always hard to understand the equations, but you made it so simple :-)

@hanaelkhalifa2630 3 years ago
Thank you for the excellent explanation of the entropy concept first, then reaching the final equation step by step. It is a really good and simple approach.

@jordyb4862 9 months ago
I find sum(p*log(1/p)) more intuitive. The inverse of p (i.e., 1/p) is the ratio of total samples to this sample. If you ask perfect questions, you'll ask log(1/p) questions. Entropy is then the sum of these values, each multiplied by its probability, which is how much it contributes to the total entropy.
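
That reading translates directly into code. A minimal sketch (hypothetical Python, function name mine) of entropy as probability-weighted question counts:

```python
from math import log2

def entropy(probs):
    # Each outcome needs log2(1/p) perfect yes/no questions to pin down;
    # weight each question count by how often that outcome occurs.
    return sum(p * log2(1 / p) for p in probs if p > 0)

print(entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75
print(entropy([0.25] * 4))                 # 2.0 (uniform: log2(4) questions)
```
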
@SixStringTheory6 6 years ago
Wow... I wish more people could teach like you, this is so insightful.

@user-xo2vg9om4v 4 years ago
It's very helpful for introducing the concept of entropy to my students. Thank you for your clear presentation of entropy.

@victorialeigh2726 3 years ago
Hi Luis: stupendous, spectacular, excellent!

@rajudey1673 3 years ago
Really, you have given us outstanding information.

@Kani132 3 years ago
Very nice video. Insightful, intuitive, and very well explained. Thank you!

@bismeetsingh352 4 years ago
That was highly intuitive. Thank you, sir, I appreciate the effort behind this.

@haimmadmon3531 4 years ago
Very good explanation - hope to hear more of your videos.

@AJK544 4 years ago
Your explanation is perfect. Even though I am not good at listening to English, I can understand everything :)

@kleberloayza7839 5 years ago
Hi Luis, nice to meet you. I am reading the Deep Learning book by Ian Goodfellow, and I needed to watch your video to understand chapter 3.13, information theory. Thanks very much.

@cariboux2 3 years ago
Luis, thank you so much for this brilliant elucidation of information theory & entropy. Merely as an avocation, I have been toying around with a pet evolutionary theory about belief systems and societies. In order to test it - if that is even possible - I felt I needed to develop some sort of computer program as a model. Since I have very little programming experience and only mediocre math skills, I have been teaching myself both (with a lot of help from the web). It was purely by accident that I stumbled upon Claude Shannon and information theory, and I immediately became fascinated with the topic, and have a hunch that it may somehow be relevant to my own research. Regardless, I am now interested in it for its own sake. I had an ephemeral understanding of how all the facets (probability, logs, choices, etc.) were related mathematically, but it wasn't until after watching your video that I believe I fully grok the concept. At one point early on, I found myself shouting, "If he brings up yes/no questions, I know I understand this!" And then you did. It was such a wonderful moment for someone who finds math so challenging, and it is greatly appreciated! I shall check out your other videos later. You're a very good teacher!

@Faustus_de_Reiz 1 year ago
For your work, I would look into some of the work by Loet Leydesdorf.

@cariboux2 1 year ago
@@Faustus_de_Reiz Thank you! I shall.

@TheZilizopendwa 3 years ago
Excellent presentation of an otherwise complex concept.

@christinebraun9610 4 years ago
Great explanation. But I think what's still missing is an explanation of why we use log base 2... didn't quite get that.

@olivercopleston 4 years ago
In the last minute of the video, he explains that using log base 2 corresponds to the levels of a decision tree, which is the number of questions you'd have to ask to determine a value.
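
A small sketch (assumed Python, not from the video) making that correspondence concrete:

```python
from math import log2

# Uniform case: n equally likely outcomes need a yes/no tree of depth log2(n).
for n in [2, 4, 8, 16]:
    print(n, log2(n))  # 1, 2, 3, 4 questions

# Non-uniform case: ask about the likely outcome first.
# For p = (1/2, 1/4, 1/4) the expected number of questions is
# 0.5*1 + 0.25*2 + 0.25*2 = 1.5, which equals the entropy in bits.
p = [0.5, 0.25, 0.25]
print(-sum(q * log2(q) for q in p))  # 1.5
```
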
@amperro 3 years ago
I watched it straight through. Very good.

@user-wi1rj4iw9y 3 years ago
Thank you very much, Luis!

@rolfbecker4512 3 years ago
Thank you very much for this beautiful and clear explanation!

@erdalkaraca2213 5 years ago
So, after watching the video, the entropy of giving you a thumbs up and subscribing to your channel was 0 - i.e., great explanation!

@moshad2002 5 years ago
Amazing explanation!! Very clear and articulate. Thank you!

@scherwinn 5 years ago
Very clever explanation of mighty ENTROPY.

@hyperduality2838 4 years ago
Syntropy is dual to increasing entropy -- The 4th law of thermodynamics! Thesis is dual to anti-thesis -- The time independent Hegelian dialectic. Schrodinger's cat: Alive (thesis, being) is dual to not alive (anti-thesis, non being) -- Hegel's cat. Syntropy is the process of optimizing your predictions to track targets or teleological physics. Teleological physics (syntropy) is dual to non teleological physics (entropy, information).

@yikenicolezhang 3 years ago
Thank you so much for explaining this concept!

@francismcguire6884 5 years ago
Best instructor there is! Thanks

@aymenalawadi7858 4 years ago
If Shannon were alive, he would enjoy seeing such a perfect explanation of his theory. Many thanks.

@MrMaipeople 5 years ago
Great and superb. Thank you so much for the clearest explanation.

@RobertLugg 6 years ago
I have learned so much from your teaching. Thank you.

@RenanCostaYT 4 years ago
Great explanation, greetings from Brazil!

@clarakorfmacher7394 4 years ago
Great video! I really liked the intuitive approach. My professor's was waaaay messier.

@jonathanfrancis 3 years ago
Wow. Amazing video.

@MostafaMASLOUHI 3 years ago
Thank you very much. Very nice explanation.

@hanaizdihar4368 3 years ago
What a great explanation! And so I subscribed 😊

@scherwinn 6 years ago
Mr. Luis Serrano III, great job on neural networks and Claude Shannon entropy.

@kingshukbanerjee748 5 years ago
Very lucid explanation - excellent, intuitive build-up to Shannon's theorem from scratch.

@meshackamimo1945 5 years ago
Hi. Thanks a million times for simplifying a very complicated topic. Kindly find time and post a simplified tutorial on MCMC (Markov chain Monte Carlo)... I am overwhelmed by your unique communication skills. God bless you.

@generationgap416 3 years ago
Help us smash Markov chain Monte Carlo

@user-or7ji5hv8y 4 years ago
Wow, another great and insightful presentation. Really helps to build intuition.

@tilugulilwa 4 years ago
Superb step-by-step explanation

@carlitos5336 3 years ago
Excellent explanation! Thanks for sharing it.

@Dennis12869 5 years ago
Best explanation I have found so far

@kasraamanat5453 2 years ago
Best, as always ❤️ Thank you Luis ❤️

@Omsip123 8 months ago
Very well explained, thank you

@aryamahima3 3 years ago
Thank you so much for such an easy explanation... respect from India...

@patriciof.calatayud9861 3 years ago
I think the Huffman compression used at the end of the video gets near the entropy value but is not exactly the same.
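
That's right in general: Huffman's average code length matches the entropy exactly only when every probability is a power of 2; otherwise it lands within 1 bit above it. A minimal sketch (hypothetical Python, structured my way) comparing the two:

```python
import heapq
from math import log2

def huffman_lengths(probs):
    # Heap items: (probability, tiebreaker, {symbol: code length so far}).
    heap = [(p, i, {i: 0}) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    tie = len(probs)
    while len(heap) > 1:
        p1, _, a = heapq.heappop(heap)
        p2, _, b = heapq.heappop(heap)
        # Merging two subtrees pushes every symbol in them one level deeper.
        merged = {s: depth + 1 for s, depth in {**a, **b}.items()}
        heapq.heappush(heap, (p1 + p2, tie, merged))
        tie += 1
    return heap[0][2]

def compare(probs):
    lengths = huffman_lengths(probs)
    avg = sum(probs[s] * l for s, l in lengths.items())
    H = -sum(p * log2(p) for p in probs)
    print(f"avg code length = {avg:.4f}, entropy = {H:.4f}")

compare([0.5, 0.25, 0.125, 0.125])  # equal: all probs are powers of 2
compare([0.4, 0.3, 0.2, 0.1])       # avg 1.9 > entropy ~1.8465
```
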
@FabioLenine 6 years ago
Easy to understand because of the excellent explanation. Congratulations on the video, and thank you very much for sharing.

@shakeelurrahman1846 1 year ago
Thanks a lot for such a beautiful explanation!

@symnshah 6 years ago
Nice explanation. Thank you so much for this video.

@xThomas1995 4 years ago
Thank you for the very good video. Easiest to understand so far.

@hcgaron 6 years ago
Wonderful video. Thanks!

@emrahyener402 2 years ago
Thanks for this perfect explanation 👏👏👏👍

@KayYesYouTuber 4 years ago
Superb explanation. I like your teaching style. Thank you very much :-)

@krupalshah1487 5 years ago
You explained it really very well. Thank you!

@markbordelon1601 6 years ago
Perfect explanation of this.

@nijunicholas631 4 years ago
Thanks... got the intuition behind entropy.

@MH_HD 5 years ago
This is the best explanation I have come across in a long time. Can you please answer: how can we use entropy to find the uncertainty of a naive Bayesian classifier with, let's say, 4 feature variables and a binomial class variable?

@JabaDr 5 years ago
Great video!! Thank you. It would be great to add some explanation of information gain (as used, for example, in feature selection).
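
For anyone looking for that follow-up, information gain is just the drop in entropy after splitting on a feature. A minimal sketch (hypothetical Python with toy data, not from the video):

```python
from collections import Counter
from math import log2

def entropy(labels):
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(labels, feature):
    # Entropy before the split, minus the size-weighted entropy of each group.
    groups = {}
    for value, label in zip(feature, labels):
        groups.setdefault(value, []).append(label)
    n = len(labels)
    remainder = sum(len(g) / n * entropy(g) for g in groups.values())
    return entropy(labels) - remainder

# Toy feature selection: pick the feature with the highest gain.
labels = ["yes", "yes", "no", "no"]
print(information_gain(labels, ["hot", "hot", "cold", "cold"]))  # 1.0 (perfect)
print(information_gain(labels, ["x", "y", "x", "y"]))            # 0.0 (useless)
```
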
@nassimbahri 5 years ago
For the first time in my life I understand the real meaning of entropy.

@archerchian7761 3 years ago
Very clear, thank you!

@saadjabrani7781 5 years ago
Easy and thorough explanation, thank you!

@LAChinthaka 4 years ago
Thanks for the great explanation.

@miguelfernandosilvacastron3279 4 years ago
Thank you. Nice, concise explanation.

@sirishachanumolu3723 5 years ago
Best explanation of entropy. Thanks.

@sosoboy77 4 years ago
Best video this week

@yhat314 6 years ago
Lovely job Luis! Very, very good!