
Information entropy | Journey into information theory | Computer Science | Khan Academy 

Khan Academy Labs
54K subscribers
319K views

Finally we arrive at our quantitative measure of entropy
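The quantitative measure in question is Shannon entropy. For a source emitting symbols with probabilities p_i, the average number of yes/no questions (bits) needed per symbol is, in LaTeX notation:

    H = \sum_i p_i \log_2 \frac{1}{p_i} = -\sum_i p_i \log_2 p_i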
Watch the next lesson: www.khanacadem...
Missed the previous lesson? www.khanacadem...
Computer Science on Khan Academy: Learn select topics from computer science - algorithms (how we solve common problems in computer science and measure the efficiency of our solutions), cryptography (how we protect secret information), and information theory (how we encode and compress information).
About Khan Academy: Khan Academy is a nonprofit with a mission to provide a free, world-class education for anyone, anywhere. We believe learners of all ages should have unlimited access to free educational content they can master at their own pace. We use intelligent software, deep data analytics and intuitive user interfaces to help students and teachers around the world. Our resources cover preschool through early college education, including math, biology, chemistry, physics, economics, finance, history, grammar and more. We offer free personalized SAT test prep in partnership with the test developer, the College Board. Khan Academy has been translated into dozens of languages, and 100 million people use our platform worldwide every year. For more information, visit www.khanacademy.org, join us on Facebook or follow us on Twitter at @khanacademy. And remember, you can learn anything.
For free. For everyone. Forever. #YouCanLearnAnything
Subscribe to Khan Academy’s Computer Science channel: / channel
Subscribe to Khan Academy: www.youtube.co...

Published: 11 Sep 2024

Comments: 190
@vandanachandola322 · 4 years ago
I remember my teacher in high school defined entropy as "the degree of randomness". I decided it was an abstract concept I just didn't get. Now, learning about information entropy in my master's class, I found this video and I'm so glad I did!! Thanks, it's very well explained :)
@itsRAWRtime007 · 9 years ago
Good video. I like the way it shows the intuition behind the concept, that is, the reason the concept actually exists, rather than plainly defining it and then showing its properties.
@youngsublee1102 · 4 years ago
Couldn't agree more.
@XavierGlenKuei · 6 years ago
At 1:24, I would argue the 3rd question (i.e., the question on the right of the 2nd level) should be "Is it C?" (or "Is it D?") rather than "Is it B?". Since the machine answered "No" to the 1st question ("Is it A or B?"), both A and B are already ruled out, leaving only C (or D) as the possible outcome; there is no role for "B" anymore.
@marcuschiu8615 · 5 years ago
Yeah, I agree with you. Damn, so many mistakes in this video (1:24 and 3:50); it makes me question its reliability... good video though.
@dien2971 · 4 years ago
I thought I had understood it wrong, lol. Thank you!
@muhaymenulislam1942 · 2 years ago
But here the probability of D is 25%, which is more than 12.5%, so the second question should be "Is it D?".
@hrivera4201 · 2 years ago
previous lesson: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-WyAtOqfCiBw.html next lesson: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-TxkA5UX4kis.html
@phycosmos · 1 month ago
thanks
@pedrogorilla483 · 4 years ago
I have asked several professors in different universities and countries why we adopted a binary system to process information, and they all answered: because you can modulate it with electricity, the state being on or off. This never satisfied me. Today I finally understand the deeper meaning and the brilliance of binary states in computing and their interfacing with our reality.
@Dhanush-zj7mf · 4 years ago
1:24 You are asking the same question twice. You already asked "Is it A or B?" at the root; if the answer is no, it must be either C or D, yet in the sub-branch you ask again whether it is B. It should be either "Is it C?" or "Is it D?".
@MohdFirdaus-fk6no · 3 years ago
Yes, you are correct.
@mathaha2922 · 4 years ago
This is one of the most informative -- and I use that term advisedly -- videos I have ever seen. Thank you!
@kempisabel9945 · 4 years ago
This video blew my mind. Thank you! I love these intelligent yet fun videos!
@daihung3824 · 3 years ago
I have one question: say p(A) = 0.45, p(B) = 0.35, p(C) = 0.15, p(D) = 0.05. Then 1/p(D) = 20, and log base 2 of 20 ≈ 4.3; however, the number of bounces should remain 3, shouldn't it? Would anyone mind explaining this difference? Thanks a lot!
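For readers who want to check numbers like the ones in this thread, here is a minimal Python sketch of the entropy formula from the video (standard library only; the first set of probabilities is the one from the comment above):

    import math

    def entropy(probs):
        # H = sum over symbols of p * log2(1/p), in bits
        return sum(p * math.log2(1 / p) for p in probs if p > 0)

    print(entropy([0.45, 0.35, 0.15, 0.05]))   # ~1.68 bits
    print(entropy([0.25, 0.25, 0.25, 0.25]))   # 2.0 bits (machine 1)
    print(entropy([0.5, 0.125, 0.125, 0.25]))  # 1.75 bits (machine 2)

As for the question itself: a real question tree can only spend a whole number of questions on a symbol (3 for D here, not log2(20) ≈ 4.32), so the entropy is a lower bound on the average number of questions, not an exact per-symbol count.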
@salrite · 6 years ago
What a beautiful explanation!!!
@antoinecantin1780 · 2 years ago
What a formidable way of visualizing and introducing information entropy. Your contributions are deeply appreciated.
@mikibellomillo · 1 month ago
Note on the number of bounces: entropy is maximum when all outcomes are equally likely; once you introduce predictability, the entropy must go down. Thanks for sharing this video! God bless you!🎉
@someshsharma6683 · 6 years ago
Awesome explanation with a very intuitive example. Thanks a lot...
@karrde666666 · 3 years ago
Why can't textbooks or lectures be this easy?
@Hopemkhize-d2i · 2 months ago
Tell me about it 😢
@csaracho2009 · 24 days ago
I have an answer for that: the zen pupil asks the master, "Is the flag moving with the wind?" The master replies: "Neither the flag nor the wind moves; it is your mind that moves."
@miketor2011 · 3 years ago
Great video, but is it just me or is there an error at 3:49? The correct calculation for the number of bounces should be 0.5*1 + 0.125*3 + 0.125*3 + 0.25*2 = 1.75, yet the video shows 0.5*1 + 0.125*3 + 0.125*3 + 0.25*4 = 2.25. Any thoughts?
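A quick numeric check of both versions (the questions-per-symbol lists below are read off the tree shown in the video):

    probs    = [0.5, 0.125, 0.125, 0.25]
    tree     = [1, 3, 3, 2]  # questions the tree actually asks
    as_shown = [1, 3, 3, 4]  # with the on-screen 0.25 x 4 typo
    print(sum(p * q for p, q in zip(probs, tree)))      # 1.75
    print(sum(p * q for p, q in zip(probs, as_shown)))  # 2.25

So the stated result of 1.75 is correct; only the printed factor (4 instead of 2) is a typo.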
@musicmaker33428 · 3 years ago
I was just thinking this. Thank you for pointing it out. I thought maybe I had misunderstood something fundamental.
@youngsublee1102 · 4 years ago
Wonderful idea, the "bounce" that expresses the amount of information. It's so exciting.
@MaryMary-ep4hd · 4 years ago
Ingenious interpretation! I applaud!
@twoplustwo5 · 4 months ago
Kudos for linking number of bounces -> binary tree -> log. Overall a very nice explanation; that's about the third explanation of information entropy I've liked.
@edwardjurkowitz1663 · 3 years ago
Excellent video. I think one point mistakenly refers to "information" when the author means 'entropy.' Machine 2 requires fewer questions; it produces more information and less entropy. Machine 1 produces maximum entropy and minimum information. Information is 'negative entropy.'
@YYchen713 · 2 years ago
This is such a great way to explain information entropy! Classic!
@suryacharan5184 · 4 years ago
What a video!!... This is how education should be.
@YuriValentines · 2 years ago
This video explained entropy better than any teacher I've had in my entire life. It makes me so angry to think of all the time I wasted in endless lectures, listening to people with no communication skills.
@kartikbansal6439 · 4 years ago
Loved the piano bit towards the conclusion!
@FrancescoDeToni · 8 years ago
Isn't there a mistake at 3:50? Shouldn't it be 0.25 x 2 instead of 0.25 x 4?
@philtrem · 8 years ago
+Francesco De Toni Yup!
@sighage · 5 years ago
Yes, it's 2.25 I guess.
@rah2023 · 5 years ago
It's indeed a mistake.
@nikhilsrajan · 3 years ago
@sighage No, it's 1.75; there was just a typo. You get 1.75 with 0.25 x 2.
@boredomgotmehere · 1 year ago
Makes it all super clear and easy to follow. Love this.
@boredomgotmehere · 6 months ago
Just a tiny error at 3:50 - the final calculation should be 0.25*2.
@OtakuRealist · 1 month ago
Thank you so much. This explains entropy so well.
@fangyuanlin8966 · 2 months ago
3:51 Typo when computing the entropy for machine 2.
@TheLegendOfCockpunch · 5 years ago
The 'M' and 'W' are switched and upside down, while the 'Z' is just a sideways 'N'... my vote is intentional. 6:32
@raultellegen5512 · 8 years ago
Amazing video. Seldom have I seen a better explanation of anything. Thanks!
@Ewerlopes · 9 years ago
Perfect explanation! :)
@waylonbarrett3456 · 1 year ago
I found a few errors. Am I the only one seeing this?
@mostafaomar5441 · 4 years ago
Thank you. This explains the intuition behind entropy very clearly.
@russianescapist5262 · 2 years ago
I loved the surreal music and the real-life objects moving in a grey, 60s-like atmosphere. :)
@BambiOnIce19 · 2 years ago
Perfectly well explained. The best video on information entropy I've seen so far.
@argha-qi5hf · 2 years ago
I can't imagine how someone could ever come up with such abstract ideas.
@tythedev9582 · 4 years ago
Yessss, I finally got the concept after this video.
@daviddeleon292 · 5 years ago
Huh??? Why am I only now finding out that information entropy is a concept? MIND BLOWN!!!
@user-vr1so7tc7x · 4 years ago
The concept had been presented to me in some online course, but until this video I didn't really understand it. Thank you!
@malevip · 3 years ago
Another way to look at entropy: a measure of how the probability is spread across the outcomes of a probability distribution.
@sanadarkia2724 · 5 years ago
Can't we just ask one question: is it ABC, or D? Edit: never mind, I just figured out that one bit removes uncertainty of 1/2.
@potatocoder5090 · 1 year ago
Brilliant explanation. So simple yet so profound. Thanks!
@swazza9999 · 4 years ago
This should have more likes!
@science.20246 · 6 months ago
Powerful and clear explanation.
@jonathan.gasser · 4 years ago
Wow, what a presentation!
@gaofan2856 · 3 years ago
The most beautiful explanation of entropy.
@vlaaady · 3 years ago
The most intuitive explanation.
@nomann5244 · 1 year ago
You are truly a genius.
@Puneethmypadi · 3 years ago
Now I understand decision trees properly.
@bouzouidjasidahmed1203 · 2 years ago
Very comprehensible, thank you!! It's very helpful.
@shepbryan4315 · 4 years ago
Why is the number of bounces the log of the number of outcomes?
@jayrar6645 · 4 years ago
So just to clarify: is the reason the decision tree for machine B differs from machine A's that you ask fewer questions overall? And how do you ensure the decision tree is structured so that it asks the minimum number of questions?
@tingwen524 · 3 years ago
Great video! I totally understood entropy!
@Chrls5 · 3 years ago
Nice!
@YTBxd227 · 5 years ago
Still confused why #outcomes = 1/p_i.
@CZRaS · 3 years ago
Because you need to "build" a binary tree to simulate bounces. E.g. if you have probability p = 1/2 (50%), the number of outcomes = 1/(1/2) = 2. If you have p = 1/8 (12.5%), you get 8 outcomes. From that you take log2, which is basically the level at which the value sits in the binary tree.
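The same point in a few lines of Python, assuming (as in the video) that every probability is a power of 1/2:

    import math

    for p in [0.5, 0.25, 0.125]:
        # level of the binary question tree at which an outcome of probability p sits
        print(p, "->", math.log2(1 / p))  # 1.0, 2.0, 3.0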
@osobliwynick · 1 year ago
Great explanation.
@mohammadrezamoohebat9407 · 9 years ago
It was perfect. Thx.
@최로봇 · 3 years ago
If it makes us ask fewer questions, doesn't that mean it provides more information?
@leoyuk-tingcheung3587 · 8 years ago
Could anyone help explain why less uncertainty means less information (machine 2)? Isn't it the other way round? Many thanks.
@TheOmanzano · 8 years ago
There is less uncertainty in machine 2 because on "average" there will be fewer questions: after many trials, on average 1.75 questions are needed to get the right result, meaning there is less variety, randomness and chaos in machine 2, due to the fact that "A" will occur a lot more often than the other letters.
@navketanbatra1522 · 8 years ago
+TheOmanzano Yeah, so since the number of questions we need to ask to guess the symbol is, on average, smaller for machine 2, shouldn't this imply that machine 2 is giving us 'more' information 'per answer to a question asked'? I'm really confused by the physical interpretation of the information gained.
@navketanbatra1522 · 8 years ago
Right! Got it! So it's not that we're getting 'more information per answer'; we get the same amount of information for each question asked on either machine. The fact that we have to ask fewer questions is because there is 'less uncertainty' in the outcome: we already have some 'idea' or 'prediction' of the outcome, implying less information is gained when the outcome is observed. *phew*
@hirakmondal6174 · 6 years ago
Think of it as a Hollywood film where a police inspector interrogates a criminal who must speak the truth every time. After 175 questions the inspector has learned everything the first criminal knows, whereas when he interrogates another criminal in an adjacent cell, he finds that after 175 questions that one can still answer 25 more. Now you tell me: who has more information?
You are welcome!! 8)
@Exhora · 6 years ago
HIRAK MONDAL That was a great example! Thank you so much!!!
@sanjayrakshit8797 · 5 years ago
Heckin' Shannon
@yudong8820 · 3 years ago
Really good one, thanks!
@shelendrasharma9680 · 6 years ago
Best explanation, salute....
@btsandtxtloverstraykidzfan3486 · 2 years ago
What are some good books on this topic?
@hirakmondal6174 · 6 years ago
Why is the number of outcomes 1/p?
@anirbanmukherjee4577 · 5 years ago
The probability of an outcome = 1/(number of equally likely outcomes), so the number of outcomes = 1/p.
@maierdanefan6998 · 4 years ago
Thank you!
@juanpablovaca-lago5659 · 2 years ago
Is there a direct analogy between the second and third laws of thermodynamics and information entropy?
@zkhandwala · 4 years ago
Not to knock this, but I do want to voice an issue I have with it and every other video I've found on the topic: they always use probabilities that are an integral power of 1/2, which greatly simplifies the explanation but doesn't generalize well to the majority of real-world scenarios, which this simplified exposition doesn't adequately cover. I worry, then, that people come away thinking they understand the topic better than they actually do. Of course, I'm open to the perspective of others here...
@daihung3824 · 3 years ago
I agree with your statement. I had a go at changing the probabilities: say p(A) = 0.45, p(B) = 0.35, p(C) = 0.15, p(D) = 0.05. Then 1/p(D) = 20 and log base 2 of 20 ≈ 4.3; however, the number of bounces for D should still remain 3, shouldn't it?
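For what it's worth, the general (non-power-of-1/2) case is handled by building the question tree greedily from the two least likely symbols upward, i.e. Huffman coding. A minimal sketch, assuming only the Python standard library:

    import heapq, itertools, math

    def huffman_lengths(probs):
        # Returns the number of questions (code length) per symbol.
        counter = itertools.count()  # tiebreaker so equal probabilities never compare lists
        heap = [(p, next(counter), [i]) for i, p in enumerate(probs)]
        heapq.heapify(heap)
        lengths = [0] * len(probs)
        while len(heap) > 1:
            p1, _, s1 = heapq.heappop(heap)
            p2, _, s2 = heapq.heappop(heap)
            for i in s1 + s2:  # every symbol under the merged node gains one bounce
                lengths[i] += 1
            heapq.heappush(heap, (p1 + p2, next(counter), s1 + s2))
        return lengths

    probs = [0.45, 0.35, 0.15, 0.05]
    lengths = huffman_lengths(probs)                    # [1, 2, 3, 3]
    print(sum(p * l for p, l in zip(probs, lengths)))   # 1.75 questions on average
    print(sum(p * math.log2(1 / p) for p in probs))     # entropy ~1.68 bits

The average number of questions always lands between H and H + 1, which is the sense in which the halving argument generalizes beyond powers of 1/2.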
@ahmedelsharkawy1474 · 5 years ago
Just awesome.
@n0MC · 7 years ago
This is wonderful. Thank you.
@lookstheory2.0 · 2 years ago
3:51 There is an error: you have a Pd*4 term instead of Pd*2.
@chandragupta2828 · 5 years ago
Awesome video!
@kawaikaede2269 · 2 years ago
cool
@xxxxxx-wq2rd · 3 years ago
Is it valid to say less entropy = less effort required?
@jorgeleirana-alcocer5642 · 2 years ago
The equation at 3:48 as written results in 2.25, not 1.75: (0.5*1) + (0.125*3) + (0.125*3) + (0.25*4) = 2.25. I think it should have been (0.5*1) + (0.125*3) + (0.125*3) + (0.25*2).
@alexhsia9510 · 5 years ago
What do they mean by number of outcomes? Can someone give me an example using the ABCD setup they used?
@temenoujkafuller4757 · 2 years ago
Yes, I asked myself this question and watched it twice. (5:45) Count the number of branches at the bottom: the number of final outcomes = 2^(number of bounces). Therefore, since the logarithm is the inverse of the exponential, the number of bounces = the number of questions = log_2(number of outcomes).
@AhmedKMoustafa2 · 6 years ago
Great explanation bro :)
@tag_of_frank · 4 years ago
Why are entropy and information given the same symbol H? And why does the information formula given in video 5 of the playlist include an "n" for the number of symbols transmitted, while this one doesn't?
@pablobiedma · 5 years ago
So if I recall correctly, the machine with the highest entropy is the least informative one. Then, if a machine generates symbols and we apply the formula to each symbol, which symbol provides the most information: the one with the fewest bits? How does that make sense? Isn't it the one with the most bits, calculated by p log(1/p)?
@bonanzhao · 4 years ago
The starting example is calculated wrongly, PEOPLE.
@gustavomartins007 · 2 months ago
Very good
@hingaglaiawong7815 · 2 years ago
At 3:15 I think there's a typo? The last term should be 0.25*2 instead of 0.25*4, I guess.
@FGNiniSun · 3 years ago
Hello, why does the number of outcomes at a level equal 1/probability?
@ChusKon1 · 3 years ago
Beautiful
@dissdad8744 · 7 years ago
Good explanation! If I wanted to calculate the entropy with log2, which calculator can do this? Is there an online calculator for this? What would be the best approach?
@ElectricChaplain · 7 years ago
Hans Franz Too late now, but log2(b) = ln(b) / ln(2), or more generally log2(b) = log_a(b) / log_a(2).
@suliu2933 · 6 years ago
Great video! I can follow it, but I have trouble understanding the problem statement: why is the most efficient strategy to pose a question that divides the possibilities in half?
@vandanachandola322 · 4 years ago
Too late, but maybe because we're trying to ask the minimum number of questions (and therefore going with the higher probability first)?
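A small numeric illustration of why halving wins on machine 1 (the per-symbol question counts below just encode the two strategies):

    probs      = [0.25, 0.25, 0.25, 0.25]
    halving    = [2, 2, 2, 2]  # "Is it A or B?" first, then one more question
    one_by_one = [1, 2, 3, 3]  # "Is it A?", then "Is it B?", then "Is it C?"
    print(sum(p * q for p, q in zip(probs, halving)))     # 2.0
    print(sum(p * q for p, q in zip(probs, one_by_one)))  # 2.25

Guessing symbols one at a time only pays off when some symbols are much more likely than the rest, which is exactly what machine 2's tree exploits.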
@assylblog · 5 years ago
Cool beat
@ArthurShelby-y8s · 10 months ago
👏🏻
@_crispins · 6 years ago
nice!
@Apolytus · 4 months ago
At 3:50 you have mistakenly written 0.25*4 instead of 0.25*2.
@ignatutka6202 · 2 years ago
How come machine two's entropy is more than one, if entropy's maximum is one?
@monicarenas · 3 years ago
At 3:51, I guess there is a mistake: for p_D, the value should be 2 instead of 4, shouldn't it?
@joshuaronisjr · 6 years ago
I don't understand this. Why is it that every time you move away from uniform probabilities the entropy must go down? Let's say the probabilities are: P(A) = 0.27, P(B) = 0.26, P(C) = 0.24, P(D) = 0.23. If you use the same encoding he used, you end up getting 0.27(1) + 0.26(2) + 0.24(3) + 0.23(3) = 2.2 bits on average. That is more than 2 bits on average. How did encoding the information by probability help?
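Checking that distribution against the entropy formula (a quick Python sketch):

    import math

    probs = [0.27, 0.26, 0.24, 0.23]
    print(sum(p * math.log2(1 / p) for p in probs))  # ~1.997 bits

The distribution is nearly uniform, so its entropy is almost exactly 2 bits, and the plain two-questions-for-every-symbol tree is already essentially optimal; a skewed tree (2.2 questions on average here) only helps when the probabilities are genuinely lopsided, as in machine 2.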
@osamabad3597 · 6 years ago
My question to you is: why would you ask about each letter individually, instead of dividing them into groups as shown here?
@henrikhbaghramyan9119 · 2 years ago
Why do we need to ask 100 times more questions if the number of symbols is 100? There are 4 symbols in the example. Why does this scale by exactly 100?
@youssefdirani · 4 years ago
4:45 Markoff or Markov?
@samhe331 · 3 months ago
I think the math at 3:52 is wrong... it should be 0.25 x 2 instead of 0.25 x 4, but the result, 1.75, is right.
@hiteshjambhale301 · 1 year ago
Hey, there is one mistake at timestamp 1:26: the question should be "Is it C?" instead of "Is it B?".
4 years ago
The Vietnamese subtitles at 4:34 are wrong: machine 2 produces less information than machine 1.
@MNKPrototype · 6 years ago
Did anyone else notice the DEATH NOTE music at 4:40?
@GiuseppeRomagnuolo · 4 years ago
I was wondering what that music was, I really like it. Do you have any link? Following your comment I found this: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-hKfKYpba0dE.html. Is that it? If so, can you point me to the right minute? Tnx
@phuocnguyenlethanh3104 · 8 months ago
The number of bounces is not equivalent to the number of questions asked.
@betbola5209 · 8 years ago
How do you calculate the entropy of a text? And what can we do with that?
@zainulabydeen2809 · 4 years ago
Can anyone explain how the answer becomes 3/2 in the solved example? Any help would be appreciated.
@betoib1504 · 6 years ago
¡Órale!