
Lecture 1: Introduction to Information Theory 

Jakob Foerster
8K subscribers · 341K views

Lecture 1 of the Course on Information Theory, Pattern Recognition, and Neural Networks.
Produced by: David MacKay (University of Cambridge)
Author: David MacKay, University of Cambridge
A series of sixteen lectures covering the core of the book "Information Theory, Inference, and Learning Algorithms" (Cambridge University Press, 2003, www.inference.eng.cam.ac.uk/ma...), which can be bought at Amazon (www.amazon.co.uk/exec/obidos/A...) and is available free online (www.inference.eng.cam.ac.uk/ma...).
A subset of these lectures used to constitute a Part III Physics course at the University of Cambridge. The high-resolution videos and all other course material can be downloaded from the Cambridge course website (www.inference.eng.cam.ac.uk/ma...).
Snapshots of the lecture can be found here:
www.inference.eng.cam.ac.uk/it...
These lectures are also available at
videolectures.net/course_infor...
(synchronized with snapshots and slides)

Science

Published: 5 Aug 2024

Comments: 139
@mohamedrabie4663 · 1 year ago
RIP Prof. David, you were, and still are, a great inspiration to us.
@eddiehazel1259 · 7 months ago
ah man sad to hear 😔
@AvindraGoolcharan · 4 years ago
This may be the longest blackboard I've ever seen
@prajwolgyawali6770 · 4 years ago
Fundamental problem: reliable communication over an unreliable channel
Binary symmetric channel: 9:35
Disk drive problem: 11:30
Redundancy: 23:00
Repetition code: 24:35
Decoding is inference; inverse probability: 31:33
Forward probability: 40:30
Hamming code: 47:10
Capacity of a channel: 58:10
Shannon's noisy-channel coding theorem: 58:45
The weighing problem: 1:00:50
@wampwamp1458 · 2 years ago
thank you :)
@fjs1111 · 2 years ago
Unreliable channel = Unreliable information
@iwtwb8 · 8 years ago
I was saddened to read that David MacKay passed away earlier this year.
@JTMoustache · 8 years ago
what a great teacher.. he will live on :)
@wunanzeng7051 · 7 years ago
He was a very great teacher! Very Articulate!
@yousify · 7 years ago
I was shocked when I read your comment. I'm following his lecture and his book on this course, it is a sad news.
@jimmylovesyouall · 7 years ago
what a great teacher.. he will live on
@bayesianlee6447 · 6 years ago
RIP to a great teacher of human beings. His passion and endeavour for technology will remain for those who come after.
@IrfanAli-jl7vb · 5 years ago
Thank you, Thank you, Thank you for sharing these excellent video lectures. Dr Mackay is amazing at teaching complicated topics. These lectures are great supplement to his excellent book on information theory which has so many excellent plots and graphs that enables one to visualize information theory. Information theory comes alive in pictures. Thank you for sharing these.
@dragonfly3139 · 7 years ago
Thank you for these great lectures your memories will live on ... RIP
@the_anuragsrivastava · 3 years ago
One of the best lectures on "information theory and coding" I have ever seen... love from India 🇮🇳
@oscarbergqvist4992 · 5 years ago
Thank you for a great lecture, looking forward to follow the rest and study the book!
@lesliefontenelle7224 · 8 years ago
I am not involved in information technology but this lecturer is making a difficult subject like information theory look so easy. You really must see this...
@shellingf · 2 years ago
kinda boring though
@a_user_from_earth · 7 months ago
what an amazing lecture and great teacher. May you rest in peace. You will not be forgotten.
@linlinzhao9085 · 5 years ago
Dr. Mackay is a great explainer. Anyone interested in machine learning and Bayesian statistics can also read his doctoral thesis.
@fireflystar5333 · 1 year ago
This is a great lecture. The design of case problem is really helpful here. Thanks for the lecture.
@george5120 · 5 years ago
So nice to watch a video like this that does not have music.
@jedrekwrzosek6918 · 1 year ago
I looove the lectures! Thank you for the upload!
@monazy11 · 7 years ago
It was amazing. Finally I learned Shannon's noisy-channel theorem: there exists an encoding and decoding system that can reach the capacity of the channel. So an error-correcting and error-detecting course is about learning these encoding and decoding systems. Wow, amazing. Lots of thanks to the teacher.
@leduran · 2 years ago
These lectures are great. Thanks for sharing.
@JerryFrenchJr · 7 years ago
How am I just now discovering this lecture series??! This is awesome!
@JakobFoerster · 7 years ago
better late than never! glad you are finding it useful
@anantk2675 · 5 years ago
i got it now bro, i am latter than ya ; )
@siweiliu9925 · 2 years ago
@@anantk2675 I'm later than you, hhhh
@trueDeen911 · 11 months ago
@@siweiliu9925 i am later than you
@abhishekpal5871 · 8 years ago
this really helps. I really wanted to learn information theory. this video series is really easy to understand and awesome.
@RippleAnt · 9 years ago
Ah! thanks, was looking for a good series... this is just the one. A great cliffhanger at the end to be precise... ^_^
@qeithwreid7745 · 4 years ago
This is fantastic I love it.
@TheNiro87 · 2 years ago
This is great, thank you! The lecture is far more entertaining than just reading the book.
@HeMan-tm8wl · 2 months ago
Which book is it?
@baganatube · 6 years ago
I don't mind the slowness. With some decoding, my brain is receiving a cleaner signal with that extra redundancy.
@sadimanesadimane6746 · 4 years ago
@baganatube Also, does he have to write EVERYTHING?
@reyazali2768 · 6 years ago
awesome example of teaching style
@hyperduality2838 · 4 years ago
Repetition (redundancy) is dual to variation -- music. Certainty is dual to uncertainty -- the Heisenberg certainty/uncertainty principle. Syntropy (prediction) is dual to increasing entropy -- the 4th law of thermodynamics. Randomness (entropy) is dual to order (predictability) -- "Always two there are" -- Yoda. Teleological physics (syntropy) is dual to non teleological physics. Duality: two sides of the same coin.
@ozgeozcelik8921 · 9 years ago
awesome! thanks for sharing
@cupteaUG · 9 years ago
Any idea about the final puzzle? My answer is 3. :-)
@ciceroaraujo5183 · 5 years ago
Thank you professor
@deeplearningpartnership · 4 years ago
Thanks for these.
@kikirizki4318 · 4 years ago
Thank you very much, Prof. David MacKay; your lecture helps me understand information theory. Btw I live in a country where toilets are not so common.
@the_anuragsrivastava · 3 years ago
You are from where??
@deepkushagra · 4 years ago
At 46:50, what is rate? I guess it is (1 / no. of repetitions), but what does it mean in layman's terms?
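(A possible answer, sketched in Python: "rate" in the lecture's sense is source bits per transmitted channel bit, so the n-fold repetition code has rate 1/n. The sketch below also computes the residual error probability of majority-vote decoding, to show the trade-off; the function names are my own, not from the lecture.)

```python
from math import comb

def repetition_rate(n):
    """Rate of the n-fold repetition code: one source bit costs n channel bits."""
    return 1 / n

def majority_error(n, f):
    """P(majority vote decodes wrongly over a BSC with flip probability f):
    more than half of the n copies get flipped."""
    return sum(comb(n, k) * f**k * (1 - f)**(n - k)
               for k in range(n // 2 + 1, n + 1))

# Lower rate buys lower error probability (f = 0.1 as in the lecture):
for n in (1, 3, 5):
    print(n, repetition_rate(n), majority_error(n, f=0.1))
```

For R3 this gives error probability 3f^2(1-f) + f^3 = 0.028 at f = 0.1, at the cost of rate 1/3.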
@spring74light · 9 years ago
Cheering, in these days of advanced educational gizmology, to see Professor MacKay making extensive and effective use of a stick of chalk and a polychromatically absorbent surfaced board. [Tyneside, England.]
@motikumar1442 · 6 years ago
Helpful lecture
@ridwanwase7444 · 1 year ago
The link for getting the free book is not working; can anybody tell me where I can get that free book? Thanks in advance.
@ncckdr · 9 years ago
very great theory
@keshavmittal1077 · 2 years ago
Hey, are there any prerequisites for it?
@gadepalliabhilash7575 · 19 hours ago
Could anyone explain how the flip probability is 10^-13 and the 1% failure rate is 10^-15? Video reference at 22:10.
@AtharvSingh-vj1kp · 7 months ago
I'm a bit curious about when these Lectures were recorded. Was it 2003??
@AlexandriaRohn · 5 years ago
00:30 Information Theory invented by Claude Shannon to solve communication problems. Fundamental problem: reliable communication over an unreliable channel. e.g. [Voice -> (Air) -> Ear], [Antenna -> (Vacuum) -> Mars Rover], [Self -> (magnetized film) -> Later Self]
04:00 Received signal is approximately equal to the transmitted signal because of added noise.
05:45 What solutions are there for having the received and transmitted signals be the same? Either physical solutions or system solutions.
07:30 Source message -> [Encoder] -> Coded transmission -> [Channel (noise introduced)] -> Received message -> [Decoder] -> Best guess at original source
08:45 Encoder is some system that adds redundancy. Decoder makes use of this known system to try to infer both the source message and the noise n.
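The encode -> channel -> decode pipeline in these notes can be sketched as a small simulation (a sketch only, with helper names of my own: the R3 repetition code from the lecture, a binary symmetric channel with flip probability f, and majority-vote decoding):

```python
import random

def bsc(bits, f, rng):
    """Binary symmetric channel: flip each bit independently with probability f."""
    return [b ^ (rng.random() < f) for b in bits]

def encode_r3(bits):
    """R3 repetition code: repeat each source bit three times."""
    return [b for b in bits for _ in range(3)]

def decode_r3(received):
    """Majority vote over each block of three received bits."""
    return [int(sum(received[i:i + 3]) >= 2)
            for i in range(0, len(received), 3)]

rng = random.Random(0)
source = [rng.randint(0, 1) for _ in range(10_000)]
decoded = decode_r3(bsc(encode_r3(source), f=0.1, rng=rng))
errors = sum(s != d for s, d in zip(source, decoded))
print(errors / len(source))  # close to the theoretical 3f^2 - 2f^3 = 0.028
```

With f = 0.1 the raw channel corrupts about 10% of bits, while R3 brings the decoded error rate down to roughly 2.8%, at the cost of tripling the transmission.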
@driyagon · 3 years ago
can someone explain how to solve the homework problems?
@palfers1 · 5 years ago
Genius camera work
@rafaelespericueta734 · 5 years ago
Indeed so. It's so frustrating and irritating when the camera focuses on the lecturer when you really want to look at the slide with plots and equations.
@vedantjhawar7553 · 3 years ago
Hi, I just wanted to ask what was meant at 24:20 when he says that "1 is the same as 5, and if there was a 4, there would be a 0."
@MN-sc9qs · 3 years ago
1 and 5 are both odd, so assigned 1; 4 is even, so assigned 0.
@vedantjhawar7553 · 3 years ago
@@MN-sc9qs Thanks.
@aritraroygosthipaty3662 · 5 years ago
38:19 I am unsure of the 1/2 in the denominator. According to my calculations, P(r=011) = (1-f)f^2 + f(1-f)^2, by the sum rule. The numerator should have a 1/2 due to the P(s=1) term, so my final answer is P(s=1|r=011) = (1-f)/2. Could anybody help me with my concerns?
@dlisetteb · 3 years ago
When you add up the probability of r given s=1 and given s=0, you must include the probability of each of those events. It results as:
P(r=011) = P(r=011, s=0) + P(r=011, s=1)
= P(r=011 | s=0) * P(s=0) + P(r=011 | s=1) * P(s=1)
= (1-f)f^2 * 1/2 + f(1-f)^2 * 1/2
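For anyone wanting to check this numerically, here is a small sketch (the function name is my own) that evaluates the posterior by Bayes' rule; the 1/2 priors appear in both numerator and denominator, and with f = 0.1 it gives P(s=1 | r=011) = 0.9, not (1-f)/2:

```python
def posterior_s1(r, f, prior_s1=0.5):
    """Posterior P(s=1 | r) for a repetition code over a binary
    symmetric channel with flip probability f."""
    def likelihood(s):
        # each received bit independently equals s with probability 1-f
        p = 1.0
        for bit in r:
            p *= (1 - f) if bit == s else f
        return p

    num = likelihood(1) * prior_s1
    den = num + likelihood(0) * (1 - prior_s1)
    return num / den

print(posterior_s1((0, 1, 1), f=0.1))  # approximately 0.9
```

Note the posterior works out to f(1-f)^2 / (f(1-f)^2 + f^2(1-f)) = 1-f here, since the 1/2 priors cancel.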
@mustafabagasrawala7790 · 6 years ago
I'm fairly new to this. What is a "flip" ?
@aritraroygosthipaty3662 · 5 years ago
a flip is to change the bit from a 1 to 0 or a 0 to a 1.
@arthurk7270 · 7 years ago
I'm a bit confused. If we're trying to estimate the mean of the Binomial distribution, wouldn't we use the sigma/sqroot(n) formula for the +- bound? In other words, the mean +- the standard deviation of the estimator: 1000 +- 30/sqroot(10000) = 1000 +- 0.3? My statistics is a bit rusty.
@JakobFoerster · 7 years ago
Arthur K thanks for the comment. Which part of the lecture are you referring to?
@arthurk7270 · 7 years ago
Hi. I was referring to the discussion at 14:00.
@JakobFoerster · 7 years ago
Arthur K I believe you are confusing the standard deviation of the sample mean rate with the standard deviation of the mean total count. The standard deviation of the mean rate does indeed drop as 1 / (N)^0.5, while the standard deviation of the total count increases by (N)^0.5. Since we care about the total number of bits flipped, it's about the total count rather than the rate. You can see that multiplying the standard deviation of the rate, 1 / (N)^0.5, with the total number, N, results in a standard deviation of the total ~(N)^0.5. Please let me know if that clarifies.
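A quick numerical check of this distinction (standard binomial formulas; variable names are my own), for the lecture's N = 10,000 transmitted bits with flip probability f = 0.1:

```python
import math

N, f = 10_000, 0.1  # bits transmitted, flip probability

mean_flips = N * f                       # expected number of flipped bits: 1000
std_flips = math.sqrt(N * f * (1 - f))   # std of the total count: 30, grows like sqrt(N)
std_rate = math.sqrt(f * (1 - f) / N)    # std of the sample rate: 0.003, shrinks like 1/sqrt(N)

print(mean_flips, std_flips, std_rate)
```

So the "1000 ± 30" in the lecture is the total count; the ± 0.3 figure would be the std of the rate scaled by a factor of 100, mixing the two views.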
@Omar-th5vv · 2 years ago
I'm new to information theory. At 4:24, why is the received signal "approximately", and not exactly equal to, the transmitted signal + noise? What else is there other than the transmitted signal and the noise? Thanks for sharing such helpful lectures.
@caribbeansimmer7894 · 2 years ago
It gets extremely difficult trying to model everything that affects the signal, but it's relatively easy to add noise to it; not just noise, but under an assumption that the noise is additive (hence the plus sign), white, and Gaussian (normally distributed). As you would know from statistics, a normal distribution has nice properties for estimation. This stuff gets really complex, but I guess you get the idea. Note, we can make other assumptions about the characteristics of the noise.
@nishantarya98 · 1 year ago
Received = Transmitted + Noise makes a *lot* of simplifying assumptions! In the real channel, the noise might be multiplied, not added. The transmitted signal might go through weird transformations, like you might receive log(Transmitted), or it might be modeled as a filter which changes different parts of your Transmitted signal in different ways!
@Handelsbilanzdefizit · 8 years ago
I would use SUDOKUS for communication. Only few numbers have to be transmitted correctly, and the other numbers/information can be restored by the decoder :-D
@Hastur876 · 5 years ago
The problem is that only the sudoku numbers that you transmit count as information: the rest of the numbers are constrained by having to follow the rules of Sudoku, and can't be any numbers you want, thus they can't be information. So you're still having to get 100% data transmission.
@terrythibodeau9265 · 2 years ago
I am curious to know how Shannon would have interpreted the internet as part of his theory... noise, perhaps.
@carloslopez7204 · 4 years ago
What are the requirements to understand this lecture?
@AndreyAverkiev · 6 months ago
As he says, the only piece of mathematics needed is the binomial distribution: 16:08
@dr.alaaal-ibadi8644 · 3 years ago
I like this channel. I'm already teaching this topic to students in Iraq, in Arabic.
@artmaknev3738 · 10 months ago
After listening to this lecture, my IQ went up 20 points!
@christopheguitton7523 · 10 months ago
Now I understand why repeating the same piece of information three times in a row to my children wasn't necessarily effective :-)
@Rockyzach88 · 2 years ago
"People need 20 GB drives nowadays" Laughs in Call of Duty
@papatyavanroode2329 · 2 years ago
3/8, 3/8, 2/8 for the cake-and-siblings problem; that's the most equal way to share. In the 3rd cake they are closest to equal; 4/8, 2/8, 2/8 is far from equal.
@Hussain1Salman · 7 years ago
Thanks for the lectures. I am just curious about what happened to lecture 2. It says it was deleted.
@JakobFoerster · 7 years ago
Hi Hussain, thanks for catching this. I have reached out to youtube support to find out. Hopefully will be resolved soon.
@JakobFoerster · 7 years ago
Hi Hussain, Apparently there was a bug in the youtube system and they deleted it by accident. The video is back online now.
@sahhaf1234 · 1 year ago
Towards 56:00 he uses the terms "bit error" and "block error" but doesn't define them properly.
@oyindaowoeye467 · 10 years ago
I wonder if he was making a joke when he said "Mr. Binomial" or if he actually meant to say "Bernoulli".
@vik24oct1991 · 6 years ago
He is being sarcastic, I think; he finds it silly that distributions are named after the people who discovered them, when it would have been more logical to name them after what they represent (which "binomial" does).
@PierrePark · 4 years ago
@@vik24oct1991 He doesn't find it silly, he is just being silly himself: since so many things are named after someone, he's joking by extending that to "binomial".
@derekcrone4679 · 10 years ago
is dis how u mak a plane
@brandnatkinson5981 · 10 years ago
K
@GOODBOY-vt1cf · 2 years ago
9:50
@GOODBOY-vt1cf · 2 years ago
15:52
@GOODBOY-vt1cf · 2 years ago
34:43
@dharmendrakamble6282 · 4 months ago
🎉
@turkiym2 · 8 years ago
Was Mr. Binomial a joke that fell on deaf ears, or was it a genuine confusion with Bernoulli?
@MarkChimes · 8 years ago
+omniflection I laughed when I heard it. I take it was just a very dry joke.
@yltfy · 8 years ago
+Mark Chimes Hahah, me too. This is when I clicked pause and checked the comments...
8 years ago
+Mark Chimes Well hello Mr. Chimes :)
@forheuristiclifeksh7836 · 4 months ago
2:00
@minglee5164 · 4 years ago
I have read the extraordinary book.
@pauldacus4590 · 7 years ago
You know he's a jokster cuz at 1:00:13 the slide says his textbook weighs "~35 lbs".
@rarulis · 7 years ago
XD, that's the price: 35 British pounds.
@izzyr9590 · 5 years ago
I'm learning this at school... yet I'm here watching a lecture on RU-vid... I don't know why... I should pay attention in my class, I guess.
@javatoday5002 · 5 years ago
you're definitely exercising inference
@circlesinthenight3141 · 7 years ago
rip david
@dharmendrakamble6282 · 2 years ago
Information theory
@afarro · 4 years ago
I was able to achieve f=0.1 for this video with x1.5 speed ...
@stevealexander6425 · 7 years ago
Good lecture except for the highly distracting camera work.
@pandasworld4168 · 6 years ago
hehe if you re like me, then you have that exam in one week
@GSSIMON1 · 8 years ago
The fact is the received signal is not identical to the sent signal due to corruption and distortion in the process. So how much of the original signal is received? What would be the measurement, and in what unit? ...That's why I do drugs and don't give a damn!
@javatoday5002 · 5 years ago
measurement would be in bits and I am not joking
@MDAZHAR100 · 5 years ago
Binomial is more related to probability topics. Bernoulli is about hydraulics.
@JakobFoerster · 5 years ago
en.wikipedia.org/wiki/Bernoulli_distribution
@user-rl7wg8tp8w · 1 year ago
channel immigration
@tonewreck1 · 3 years ago
Professor MacKay is certainly extremely competent in the subject, but this really is the most unintuitive way of introducing information theory. We are given the answer right from the start and work our way backwards to see that it is an effective system, instead of trying to understand the problem and find the adequate solution. We are never trying to understand the nature of the problem, but instead made to test the effectiveness of the solution. Typical of classical academic philosophy: let's make knowledge as boring and abstruse as possible so the riff-raff is kept out of our little club!
@vedantjhawar7553 · 3 years ago
Do you know of any sources to check out that might teach the content in the method you are talking about? Thank you.
@tonewreck1 · 3 years ago
@@vedantjhawar7553 try this ...ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-PtmzfpV6CDE.html
@vedantjhawar7553 · 3 years ago
@@tonewreck1 Thank you. I took a look at it and felt that it described the act of measuring information very thoroughly. Are there any videos you recommend for learning about system solutions for transmission errors?
@tonewreck1 · 3 years ago
@@vedantjhawar7553 I am glad you found it helpful. I suggest to see the other videos in this series. Once you get a feel for what information theory is really about, you can go back to MacKay for error correction and all the nitty gritty, he does the details very effectively.
@renduvo · 2 years ago
@tonewreck1 For some reason I'm unable to see the reply in which you've suggested the source that @Vedant Jhawar has requested. Could you please share the source again?
@PseudoAccurate · 8 years ago
This is painfully slow. I'm sure the information is fantastic but you have to watch him write everything he says on the chalkboard. When I got to him telling the class to discuss how much 10% of 10,000 is I couldn't take it anymore. Can anyone suggest a similar lecture that moves more quickly?
@Meeran · 8 years ago
use the youtube speed feature, set it to 2x, done
@PseudoAccurate · 8 years ago
Lol, nice, good idea.
@unorthodoxresident7532 · 8 years ago
A lecturer's goal ultimately is to share information, and for the audience to absorb as much as possible. There are different ways people can do that, and for each it's different: some people are better at listening, others better absorb a visual representation (either looking at the chalkboard or writing notes themselves) :)
@PseudoAccurate · 8 years ago
That's completely understandable; that's what it was like when I went to school. It's just that most lecturers now give you the notes themselves, so they don't have to take the time to write much and can spend the lecture explaining the material.
@MlokKarel · 8 years ago
Now, the question wasn't aimed at the 10% of 10k but rather at the ± part, i.e. the std. dev., IMO. Did you get that correctly as well?
@seweetgirlnay · 3 years ago
Where in the fock am I?
@jabbatheplutocrat1074 · 6 years ago
There is no information here, and when will it be realized? Never!!!
@csaracho2009 · 4 years ago
In the first example, is p equal to f? p should be 1-f = 0.90.
@abdullahmertozdemir9437 · 11 months ago
p is equal to f, and q is equal to 1 - f. Since we assumed f to be 0.1, we have 0.1 * (1 - 0.1) = 0.09, giving a variance of 900 after multiplying 10,000 by 0.09.
@GOODBOY-vt1cf · 2 years ago
13:24
@GOODBOY-vt1cf · 2 years ago
32:13
@alyssag8099 · 4 years ago
8:22