
A pretty reason why Gaussian + Gaussian = Gaussian 

3Blue1Brown
6M subscribers
767K views

A visual trick to compute the sum of two normally distributed variables.
3b1b mailing list: 3blue1brown.substack.com/
Help fund future projects: / 3blue1brown
Special thanks to these supporters: www.3blue1brown.com/lessons/g...
For the technically curious who want to go deeper, here's a proof of the central limit theorem using moment generating functions:
www.cs.toronto.edu/~yuvalf/CL...
And here's a nice discussion of methods using entropy:
mathoverflow.net/questions/18...
Relevant previous videos
Central limit theorem
• But what is the Centra...
Why π is there, and the Herschel-Maxwell derivation
• Why π is in the normal...
Convolutions and adding random variables
• Convolutions | Why X+Y...
Time stamps
0:00 - Recap on where we are
2:10 - What direct calculation would look like
3:38 - The visual trick
8:27 - How this fits into the Central Limit Theorem
12:30 - Mailing list
Thanks to these viewers for their contributions to translations
German: lprecord, qoheniac
Spanish: Pablo Asenjo Navas-Parejo
Vietnamese: Duy Tran
------------------
These animations are largely made using a custom Python library, manim. See the FAQ comments here:
www.3blue1brown.com/faq#manim
github.com/3b1b/manim
github.com/ManimCommunity/manim/
You can find code for specific videos and projects here:
github.com/3b1b/videos/
Music by Vincent Rubinetti.
www.vincentrubinetti.com/
Download the music on Bandcamp:
vincerubinetti.bandcamp.com/a...
Stream the music on Spotify:
open.spotify.com/album/1dVyjw...
------------------
3blue1brown is a channel about animating math, in all senses of the word animate. And you know the drill with YouTube: if you want to stay posted on new videos, subscribe: 3b1b.co/subscribe
Various social media stuffs:
Website: www.3blue1brown.com
Twitter: / 3blue1brown
Reddit: / 3blue1brown
Instagram: / 3blue1brown
Patreon: / 3blue1brown
Facebook: / 3blue1brown

Published: May 28, 2024

Comments: 552
@3blue1brown · 10 months ago
I made a video covering a proof of the central limit theorem, that is, answering why there is a "central limit" at all. It's currently posted for early viewing on Patreon: www.patreon.com/posts/draft-video-on-i-87894319 I think the video has room for improvement, and decided to put it on a shelf for a bit while working on other projects before turning back to it. In the meantime, though, if you are curious about why all finite variance distributions will tend towards some universal shape, it offers an answer.

Also, you may be interested to know that a Gaussian is not the only distribution with the property described in this video, where convolving it with itself gives a (rescaled) version of the original distribution. The relevant search term here is "stable distributions", though all others will have infinite variance, so don't fit the criteria of the CLT. Often when the CLT doesn't apply, it's because the independence assumption doesn't hold, but another way it can break is if you're starting with one of these infinite variance cases.
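The self-convolution property mentioned here is easy to check numerically. A minimal Python sketch (grid size and sample points are arbitrary choices of mine): convolving the unit Gaussian with itself on a grid lands on the Gaussian with standard deviation sqrt(2).

```python
import math

def gaussian(x, sigma):
    return math.exp(-x * x / (2 * sigma * sigma)) / (sigma * math.sqrt(2 * math.pi))

dx = 0.01
xs = [i * dx for i in range(-1000, 1001)]  # grid covering [-10, 10]

def conv_at(s):
    # (f * f)(s) = integral of f(x) f(s - x) dx, approximated by a Riemann sum
    return sum(gaussian(x, 1.0) * gaussian(s - x, 1.0) for x in xs) * dx

max_err = max(abs(conv_at(s) - gaussian(s, math.sqrt(2))) for s in (0.0, 0.5, 1.0, 2.0))
print(max_err)  # tiny: the self-convolution is the sigma = sqrt(2) Gaussian
```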
@csehszlovakze · 10 months ago
please make a post there about the complete software stack you're using to make your videos!
@official-obama · 10 months ago
9:08 the second part has "transformatoin" at the top
@mskiptr · 10 months ago
It's kinda hidden, but for people who prefer RSS to mailing lists, it's at /feed
@voidify3 · 10 months ago
Are the functions at 0:10 stable distributions? When you started talking about rotational symmetry I was expecting you to bring up a visual graph of one of those functions convolved with itself and explain why it doesn't have the special property, but instead 5:55 only shows trivial examples and my curiosity about this question remained unanswered. Is it because the functions from 0:10 are stable distributions? If not, why weren't they shown when they would have been much more interesting demonstrations of the Gaussian's specialness than trivial examples?
@mydroid2791 · 10 months ago
Grant, could you please make a video on when the discrete can be approximated by the continuous? For example, in this series you showed that discrete random variables added together approach a continuous normal distribution, and you did discrete and continuous convolutions. But what is the error formula one would get by assuming, say, that d6 dice are continuous-valued, getting your continuous convolution answer, and then taking discrete samples of that answer to match the actual discrete nature of d6 dice? I find it much easier to integrate a 'nice' function than it is to simplify a discrete Σ sum.
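One can at least measure that error empirically. A Python sketch of my own (n = 10 dice is an arbitrary choice): build the exact pmf of a sum of d6 by discrete convolution, then compare it with the continuity-corrected normal approximation.

```python
import math

def normal_cdf(x, mu, sigma):
    return 0.5 * (1 + math.erf((x - mu) / (sigma * math.sqrt(2))))

def d6_sum_pmf(n):
    # exact pmf of the sum of n fair d6, by repeated discrete convolution
    pmf = {k: 1 / 6 for k in range(1, 7)}
    for _ in range(n - 1):
        new = {}
        for s, p in pmf.items():
            for k in range(1, 7):
                new[s + k] = new.get(s + k, 0.0) + p / 6
        pmf = new
    return pmf

n = 10
pmf = d6_sum_pmf(n)
mu, sigma = 3.5 * n, math.sqrt(n * 35 / 12)
# continuity-corrected normal estimate of P[S = s], versus the exact value
err = max(abs(p - (normal_cdf(s + 0.5, mu, sigma) - normal_cdf(s - 0.5, mu, sigma)))
          for s, p in pmf.items())
print(err)
```

The worst-case error is already a small fraction of the peak probability at n = 10, and it shrinks as n grows.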
@justinahole336 · 10 months ago
I have to laugh. "Why the normal distribution?" was one of the questions that motivated me to get my M.S. Stat a couple of decades ago. I'm loving this series - it adds so much clarity to what I recall learning.
@AlphaPhoenixChannel · 10 months ago
That made for a great lunch 😁. In your last video you described the Gaussian as an "attractive point in the space of all functions" and I LOVED that phrasing - really made it make sense. However I don't do enough real math to have realized that could be the foundation of a proof. That's pretty cool.
@KingDuken · 10 months ago
Agreed! :) I'm at work eating my lunch and people around me sometimes ask, "Oh are you in school?" and I'm like, "Nope, just an engineer like you that likes learning the math that was never taught!"
@nisargbhavsar25 · 10 months ago
The legend is here! 🙏🏽🛐
@MeanSoybean · 10 months ago
i also had lunch to this video
@jasonremy1627 · 10 months ago
After all the cliffhangers, it's nice to get this series all wrapped up so neatly.
@QuantumHistorian · 10 months ago
Wrapped up? He didn't prove the central limit theorem at all, which is supposedly what this was all about. This video itself barely adds anything to the previous ones.

Moment generating functions are really not all that complicated - it's high school stuff really. And it gives a much clearer intuition for why a Gaussian is the limit in the central limit theorem: it's the unique probability distribution that has a mean and a standard deviation but no higher moments. In other words it's the simplest* distribution: the one that can be described by the least information. Anything else like skew or asymmetry is "averaged out". Sadly, Grant is so obsessed with representing things visually that he brushes over alternatives that are at times far clearer and more powerful ways of understanding this.

* [Technically the simplest would be a point distribution where a single outcome has probability 1 and everything else probability 0, but that hardly counts as a distribution. And anyway, it's just a special type of Gaussian with width 0.]

EDIT: I got mixed up; replace "moment" with "cumulant" above to correct it. The intuition is the same.
@Redingold · 10 months ago
@@QuantumHistorian This series is an excellent demonstration of the idea of limits, not just in that the videos are all about the central limit theorem, but also in that he's tending towards the proof of the central limit theorem without ever actually reaching it.
@lelouch1722 · 10 months ago
@@QuantumHistorian This series of videos is clearly not meant to give a fully technical answer but rather an intuitive view of why it's true. I also agree that the visual "trick" here does not seem to simplify the work much, given that the integral is already easy to compute using the trigonometric change of variables that arises naturally, but maybe I'm biased by my own experience.
@KingDuken · 10 months ago
You can probably argue that the purpose of the cliffhanger is to encourage the viewer to ponder upon a new solution. That's very much the format of his videos. 3Blue1Brown will never tell the viewer the answer but rather allow open-ended interpretation.
@QuantumHistorian · 10 months ago
@@KingDuken That's not even remotely true. He starts with hints, but he almost always gives the full solution at the end. Look at the recent video on chords, or older ones on the chessboard puzzle or the Basel problem.
@capitaopacoca8454 · 10 months ago
I was studying statistics right now and saw this drop
@arvind-venkat · 10 months ago
For real. Happened twice now. With binomial and this.
@baidurjyasarkar8854 · 10 months ago
They are watching..... There will come a time when they will order us....
@THEMATT222 · 10 months ago
I guess Grant calculated the time of day with the highest probability that the world population would study statistics and then release the video at that time, lol
@gaggy7448 · 10 months ago
Lucky you
@benbockelman6125 · 10 months ago
Same
@johnchessant3012 · 10 months ago
I love that this series actually started with the Borwein integrals video. Like, here's a very curious sequence of integrals and here's an interesting concept to explain it, and then five videos later we've dug so deep into convolutions that we got an intuitive explanation for one of the most important theorems in all of math. It's all interrelated!
@RyanODonnellTeaching · 10 months ago
I like this related explanation: Let X and Y be independent normal random variables, and write S = X+Y for their sum. You can think of S as the dot product of the 2-d vectors (X,Y) and (1,1). As Grant said, the key aspect of normal random variables is that if you draw a pair of them, the result is rotationally symmetric. Now dot product is *also* rotationally symmetric (the dot product between two vectors only depends on their lengths and angle). So the distribution on S would be the same if we rotated (1,1) to any other vector with length sqrt2; in particular, to (sqrt2,0). But (X,Y) dotted with (sqrt2,0) is just sqrt2 X, so we see that S is distributed as (sqrt2 times) a normal random variable.
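That conclusion, S/sqrt(2) being standard normal again, is quick to confirm by Monte Carlo. A small sketch (sample size and seed are arbitrary):

```python
import math, random

random.seed(0)
n = 200_000
# draw S = X + Y for independent standard normals and rescale by sqrt(2)
s = [(random.gauss(0, 1) + random.gauss(0, 1)) / math.sqrt(2) for _ in range(n)]

mean = sum(s) / n
var = sum((x - mean) ** 2 for x in s) / n          # should be close to 1
frac_below_1 = sum(x < 1.0 for x in s) / n         # Phi(1) is about 0.8413
print(round(var, 3), round(frac_below_1, 3))
```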
@novakonstant · 10 months ago
Grant, this has been an absolute masterclass and I genuinely believe it has been your best work so far. Your visualisations have been top notch and it has brought concept space applied to mathematics to a level not seen before, all publicly accessible through YouTube. You are making mathematics a better field for the entire world. Thanks for your hard work!
@avip2u · 10 months ago
One level of brilliance is simply to be brilliant. Another level is to be able to explain and teach. Yet another level of brilliance is to be able to clearly visualize & present the advanced concepts. Wow. No words.
@AzureLazuline · 10 months ago
thank you for always making sure to show the "roadmap" before diving into the details! Knowing the broad outline beforehand really makes things easier to follow, and it's something that a lot of other explanatory videos/articles don't bother to do.
@xyzct · 10 months ago
Please oh please do a video on the Kalman filter, given how indescribably important it is to our modern existence. The result that the convolution of two Gaussians is a Gaussian is at the heart of the Kalman filter's magic.
@geekswithfeet9137 · 10 months ago
Yes…. So much yes to this, would intersect so many core bits of interest perfectly
@fatitankeris6327 · 10 months ago
Now there are so many great explanations on this channel that together they really complete one's understanding of it.
@cyancoyote7366 · 10 months ago
Thank you for bringing us amazing math content Grant! The world needs it! Enjoying my afternoon coffee while watching this one! :)
@diffusegd · 10 months ago
I want to talk about a strange area of probability, where random variables no longer commute: random matrices.

You can define the expectation of a random matrix to be the expectation of its trace, which essentially captures the distribution of its eigenvalues. It turns out there's a new kind of central limit theorem, known as the "free central limit theorem". This theorem says that if you have "freely independent" random matrices, then the mean's eigenvalue distribution tends towards not a normal distribution, but a semicircular distribution.

In this probability theory (known as free probability theory), a free convolution exists, which essentially gives the distribution of eigenvalues of X+Y. It turns out the semicircle distribution convolved with itself is another semicircle, much like a normal distribution in classical probability.
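The semicircle limit is easy to see numerically. A sketch of mine (assuming NumPy is available; matrix size and seed are arbitrary): the even moments of a large Wigner matrix's eigenvalue distribution approach the Catalan numbers, which are exactly the moments of the semicircle law on [-2, 2].

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000
g = rng.standard_normal((n, n))
w = (g + g.T) / np.sqrt(2 * n)        # symmetric Wigner matrix, entry variance ~1/n

eigs = np.linalg.eigvalsh(w)
# semicircle law on [-2, 2] has even moments equal to Catalan numbers: m2 = 1, m4 = 2
m2 = float(np.mean(eigs ** 2))
m4 = float(np.mean(eigs ** 4))
print(round(m2, 2), round(m4, 2))
```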
@SluisaStoffelen-os5oc · 9 months ago
Is this what we called ''Wigner semicircle law''?
@zakwhite5159 · 10 months ago
What incredible content. I think like once a year I revisit the same list of statistical oriented content. Between Grant, Richard Mcelreath and Josh Starmer. You really get your bases covered on great stats content.
@estrheagen4160 · 10 months ago
My abstract brain would have loved showing that Gaussians are their own convolution via the Fourier transform, since a convolution in coordinate space is multiplication in momentum space (spot the physicist), and since the FT of a Gaussian is a Gaussian, and the product of two Gaussians is a Gaussian, then the convolution of two Gaussians must also be a Gaussian. But this is an incredibly satisfying explanation. I'm not left wanting, and after being in the field for nearly a decade, I'm glad to see a frequent concept intuited so cleanly, without the need for arcane notation. ❤
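The "FT of a Gaussian is a Gaussian" step can be checked directly. A quadrature sketch of mine (grid spacing and cutoff are arbitrary): the characteristic function of N(0,1), computed numerically from its density, matches exp(-t^2/2).

```python
import math, cmath

def gaussian(x, sigma=1.0):
    return math.exp(-x * x / (2 * sigma * sigma)) / (sigma * math.sqrt(2 * math.pi))

def char_fn(t, dx=0.01, lim=10.0):
    # E[e^{itX}] for X ~ N(0,1), by direct quadrature of the density
    n = int(lim / dx)
    return sum(cmath.exp(1j * t * k * dx) * gaussian(k * dx) * dx
               for k in range(-n, n + 1))

# the Fourier transform of the standard Gaussian is exp(-t^2/2), itself a Gaussian
max_err = max(abs(char_fn(t) - math.exp(-t * t / 2)) for t in (0.0, 0.5, 1.0, 2.0))
print(max_err)
```

Multiplying two of these transforms gives exp(-t^2), the transform of an N(0,2) Gaussian, which is the convolution result in one line.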
@SquallEstel · 10 months ago
Thank you very much for your hard work, the result is so pleasing. I’ve discovered your channel with the neural network series and I’ve been enjoying your videos ever since. You rekindled in me the taste for mathematics. Greetings and best regards from France
@Indecisiveness-1553 · 10 months ago
Congratulations on finally wrapping up this pseudo-series. They’re some of my favorite videos you’ve done!
@whitewalker608 · 10 months ago
Last time, just after I completed IFFT, you dropped a video on continuous convolution. Yesterday, I finished studying Bivariate Normal distribution and you dropped this. Perfect timing for me!
@XxRiseagainstfanxX · 10 months ago
Binomials with the same p are stable under convolution, and Poisson distributions as well; the normal distribution is not unique in that regard. Even Cauchy distributions are stable without having any moments or satisfying the CLT. If I had to pick an intuitive reason why the normal distribution shows up in the CLT, I enjoy the fact that the normal's cumulants are all zero from the third on, and that a standardized iid sum's cumulants hence all tend to those of the standard normal distribution whenever they exist. Also, not all standardized sums converge in distribution to a normal distribution; the limit can be a Gumbel distribution, for example.
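The cumulant picture can be made concrete with exact arithmetic. A sketch of mine using a made-up skewed pmf: since cumulants add under convolution, the skewness of an n-fold sum is exactly gamma1/sqrt(n) and the excess kurtosis exactly gamma2/n, so both standardized higher cumulants drain away toward the normal's zeros.

```python
def convolve(p, q):
    out = {}
    for a, pa in p.items():
        for b, qb in q.items():
            out[a + b] = out.get(a + b, 0.0) + pa * qb
    return out

def std_cumulants(p):
    # standardized 3rd and 4th cumulants: skewness and excess kurtosis
    m = sum(k * v for k, v in p.items())
    var = sum((k - m) ** 2 * v for k, v in p.items())
    mu3 = sum((k - m) ** 3 * v for k, v in p.items())
    mu4 = sum((k - m) ** 4 * v for k, v in p.items())
    return mu3 / var ** 1.5, mu4 / var ** 2 - 3

base = {0: 0.6, 1: 0.3, 3: 0.1}    # a deliberately skewed toy distribution
dist, rows = dict(base), []
for n in (1, 2, 4, 8, 16):
    g1, g2 = std_cumulants(dist)
    rows.append((n, g1, g2))
    print(n, round(g1, 3), round(g2, 3))
    dist = convolve(dist, dist)     # doubling: pmf of the 2n-fold sum
```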
@stick-Iink · 10 months ago
Honestly one of the best series on YouTube.
@r4fa3l59 · 10 months ago
WONDERFUL THANKS FOR INSPIRING AN ENTIRE GENERATION TO GET AND UNDERSTAND THE TRUE BEAUTY OF MATHEMATICS
@user-gv3xt5we1j · 10 months ago
It's been 7 years since I took calculus but this is a great way to revisit those concepts. Thank you!
@genuine8879 · 2 months ago
Wonderful video! The feeling I got (in high school) when I proved something by symmetry always made my day! Usually these are the most elegant approaches and the simplest in intuition. Much respect ❤
@jeffmannlein9772 · 10 months ago
Always enjoy your videos. It's nice to watch them and make some connections I might've missed from my time in school.
@SilasHaslam · 10 months ago
I was wondering about this topic for a while because I didn’t quite get this concept intuitively. And then 3blue1brown dropped this !!
@RobinHillyard · 10 months ago
Thanks so much for this--it makes it really clear. And the 3-dimensional model is really a lot more like a bell! (although I know that actual bells have a somewhat different shape). I've been using the concept of combining Gaussian (and uniform) distributions for a while now in my (Scala) library called Number. It keeps track of the error bounds on variables. If keeping track of relative bounds, it's easy: for multiplication, just add the relative bounds together; for functions like e^x or x^p, only very slightly more complex. But, for addition, we need to convert to absolute bounds and use the convolution that you've been describing.
@abhinandanangra · 10 months ago
This is what I needed; I was working on my project on the central limit theorem in various scenarios.
@Mavhawk64 · 10 months ago
After having received my Bachelor's in Math this past December, I now just realized why we get that sqrt(2) when finding the convolution. The geometric visualization is extremely easy to understand! (I'm sure I derived this back in first year, but I must have forgotten lol)
@Truth4thetrue · 10 months ago
Man, I just love your videos. Even though I'm way past the time of having the genuine will and ability to learn abstract mathematics (living in a war-torn hell doesn't really help), they still give me a sad and lovely nostalgia for the things I love. I'm just really glad I learned about your channel and watched it grow without losing any of the great things that made it simply extraordinary.
@ahmedkamelkamelo7433 · 10 months ago
@3Blue1Brown could you please do a series of videos on time series analysis? I think we need a visual and intuitive explanation for a lot of things there! Thank you 😊
@nkkk6801 · 10 months ago
My friend... I can't thank you enough for the "Essence of linear algebra" videos.
@chinchao · 10 months ago
You make me finally understand why CLT works, thanks ❤
@vigilantcosmicpenguin8721 · 10 months ago
This is a very elegant explanation of what makes the normal curve so special, but it still seems a little [puts on sunglasses] ...convoluted.
@MrMctastics · 10 months ago
This question popped back into my head yesterday so good timing
@div.6763 · 10 months ago
Wow it's already in the playlist.. thank you. I wanted to study this for so long
@EPMTUNES · 10 months ago
This video is a joyous moment in maths communications, as all your videos are.
@joaodirk · 10 months ago
I would love to see you extend this series on Gaussian distributions and the CLT to when there is correlation and/or dependency.
@thanosauce9128 · 10 months ago
My god, he drops a video relevant to the topic I'm taking literally right after I finish it.
@lorenzoplaserrano8734 · 10 months ago
You are partly the reason I am in love with statistics. Thank you. ❤
@sentinelaenow4576 · 10 months ago
Humanity will always be grateful for your superbly amazing, impactful, and meaningful work. I'm confident your viewers are the best candidates to improve our entire world. It's inspiring to see how your efforts can enhance our understanding of the world and empower people to engage with sophisticated ideas. With your powerful content, you hold the impressive potential to inspire and educate countless individuals, fostering a deeper appreciation for math and its importance in our lives. Such efforts unquestionably play a crucial role in advancing our society as a whole. Thanks a million, Sir 3Blue1Brown. You are genuinely enhancing our world with the most insightful visual content currently available. Please continue for good.
@asseenontv247 · 10 months ago
Awesome video as always! I don't think I've seen you do it yet, but I would love to see you tackle explaining how and why the RSA encryption algorithm works.
@mathemelo · 7 months ago
Simply amazing. That is a very simple yet very rich explanation for the central role played by the normal distribution, and the visuals are amazing as usual! The much more technical way I've always envisioned this is to say that the normal distribution is in some sense "the fixed point of the Fourier transform", and to see the Central Limit Theorem as some kind of "convergence to the fixed point" result through the Fourier transforms. I wonder if the rotational symmetry, which is the key property you use here, can be linked to this "fixed point of Fourier" thing?
@torkelholm6577 · 10 months ago
So great seeing this video finally come out just as I finished statistics
@SaplingDatree · 10 months ago
I've been watching for a while now, Idk why I haven't subscribed till now, but I love your videos. I've always found it fascinating that there is an awesome maths channel with a logo that has relatively the same shape as one of my eyes :) (the brown spot is even in the right place too)
@deltaeins1580 · 10 months ago
A mailing list! Awesome. I loved Tom Scott doing it and now you too? Amazing!
@salchipapa5843 · 10 months ago
I've forgotten pretty much everything I learned in college, but one thing I kind of sort of remember is that one way to convolve two functions is to take their Laplace transform and then multiply them. Convolution in the time domain is multiplication in the frequency domain, basically.
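That "convolution in time is multiplication in frequency" fact holds for the discrete Fourier transform too, and a toy sketch (my own arrays; a naive DFT, not a library FFT) makes it concrete: zero-pad, transform, multiply pointwise, transform back, and you recover the direct convolution.

```python
import cmath

def dft(x):
    n = len(x)
    return [sum(x[j] * cmath.exp(-2j * cmath.pi * j * k / n) for j in range(n))
            for k in range(n)]

def idft(X):
    n = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * j * k / n) for k in range(n)).real / n
            for j in range(n)]

a = [1.0, 2.0, 0.5, -1.0]
b = [0.5, 0.25, 0.25, 1.0, -0.5]

size = len(a) + len(b) - 1
ap = a + [0.0] * (size - len(a))   # zero-pad so circular convolution equals linear
bp = b + [0.0] * (size - len(b))

via_dft = idft([x * y for x, y in zip(dft(ap), dft(bp))])
direct = [sum(a[i] * b[k - i] for i in range(len(a)) if 0 <= k - i < len(b))
          for k in range(size)]

max_gap = max(abs(x - y) for x, y in zip(direct, via_dft))
print(max_gap)   # numerically zero
```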
@klam77 · 10 months ago
Very very cool. Never learnt convolutions that way!
@thegreatsibro9569 · 10 months ago
After taking AP Stats in my high school senior year, I'm glad this series tied up some loose ends of that course. Thanks for all the amazing insight!

By the way, I was wondering if you could possibly do a video based on a problem I solved and want to confirm my answers on. It goes like this: you have a line segment of any arbitrary length (it doesn't matter). If you cut it in two random places, what is the probability that the three new segments form a triangle without any excess length left over?

Again, I believe I know the answer, but I still feel the need to have my results confirmed. I'm also curious if there is any extra insight that can be provided based on problems such as this one. Thanks for making this series, and I can't wait to hear what more spicy knowledge you have in store for us!
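This is the classic broken-stick problem, whose answer is 1/4, and a quick Monte Carlo sketch (my own trial count and seed) agrees:

```python
import random

random.seed(0)
trials = 200_000
hits = 0
for _ in range(trials):
    u, v = sorted((random.random(), random.random()))   # two random cut points
    pieces = (u, v - u, 1 - v)
    # a triangle forms iff every piece is shorter than the sum of the other two,
    # i.e. iff the longest piece is shorter than 1/2
    if max(pieces) < 0.5:
        hits += 1
estimate = hits / trials
print(estimate)   # close to 0.25
```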
@lanog40 · 10 months ago
FYI there is a small typo at 9:10 in the challenge problem: "The transformatoin of the line..." Thank you for visualizing this connection!
@Vikrampratapmaurya · 10 months ago
This channel is one of the most popular channels in the field of advanced maths ❤❤
@mathanimation7563 · 10 months ago
When you upload a video I feel happy, because I learn a new concept.
@strehlow · 7 months ago
I'd love to see a video on deconvolution, and its applications. One noteworthy one is basic processing of an image from a telescope. The aperture (typically circular) applies a convolution of a rectangle function to the incoming light. Convolving the resulting image with the inverse of the rect function will remove the distortions caused by the aperture. One strategy on smaller telescopes (especially using film instead of digital sensors) to avoid this is to put a filter on the aperture whose opacity follows a Gaussian, clearest in the center and darkest at the edge. This minimizes the distortions of the image coming through the telescope and avoids the need to process it afterward.
@mynamesholster · 10 months ago
Hnnnng new 3Blue1Brown video
@Reda-Ou · 10 months ago
The entropy explanation is really interesting and makes a lot of sense. As far as I can tell, what it is saying is that noticing that convolving many different distributions leads to a Gaussian distribution is the same as noticing that repeatedly sampling the microstate of a system, which is the same as sampling N independent atomic distributions (or approximately independent... or not, depending on your system) of an equilibrium (maximal entropy) system, for large N will always correspond to the same value of a macrostate variable.
@steffanjansenvanvuuren3257 · 9 months ago
"But what is the Fourier Transform? A visual introduction" In that video you showed that the "Centre of weight"(hypotenuse max peak) reaches its peak on the right side, x(real) axis whenever the input sinewave frequency is the same as the rotating frequency. But that only happens if the input sinewave is in phase with the rotation frequency and the rotation starts exactly at x=0 and y=1 on the complex plain. ONLY then does the vector/hypotenuse max peak line up perfectly with the x axis. In reality we have to continuously plot the vector/hypotenuse on a separate graph to get the information we want because on the complex plain the vector/hypotenuse max peak can point in any direction or fall in any quadrant depending on the phase difference between the rotation and the input sinewave signal.
@dakshnarula8036 · 1 month ago
Hey Grant, I've been a big follower of your videos. Could you please make a detailed series covering all the topics in combinatorics, statistics and probability?
@kirilchi · 10 months ago
Was waiting for continuation of series ❤
@pal181 · 10 months ago
Can't wait for the step 1 explanation. Because this is what I expected from the title.
@HoxTop · 10 months ago
The step 1 explanation is the thing that I have been waiting for from this series... The series has been making a point about distributions approaching a normal distribution, and then the finale (or I think this is supposed to be the finale) skips the whole reasoning as to why they approach it in the first place. I hope he will be making a video about it.
@pal181 · 10 months ago
@@HoxTop same
@567kkd · 10 months ago
My statistics and calculus professor loves your videos.
@christopherli7463 · 10 months ago
The animation at 7:17 about rotating your radius r to be perpendicularly aligned with the background x-y Cartesian grid is super. Once again the animation provides a very immediate, visual, and physically informed intuition: if you rotate one way to align with the grid, you'll preserve the area and simplify your computation. Just a small detail, but these animations are great, thank you very much!
@christopherli7463 · 10 months ago
Like it's almost like the feeling in linear algebra when you change to a natural (eigen) basis to decouple your vectors/directions and then the computation just proceeds orthogonally along their individual axes, not interfering with each other and making the computation much more literally straightforward. So like rotation for a better coordinate system. This was a cool video thanks!
@Me-0063 · 10 months ago
Love the videos! They have “re-sparked” my interest in math
@alwayshere6956 · 10 months ago
Golly, I'd love a little on entropy and its application here. Almost important, even.
@ezxalidosman · 10 months ago
I've really loved math since the day I started watching your videos, not gonna lie!
@satyakiguha415 · 10 months ago
No one believed that math could be soooooooo beautiful before ur channel was created
@Systox25 · 10 months ago
I can't wait to see what you're up to on the new channel. Take care!
@mahadlodhi · 10 months ago
Great vid as always
@fuwadhasan7553 · 9 months ago
The majority of your videos go over my head 😅 as I'm not a good student. But I come here and watch every video because of your presentation. Thank you 😊
@morgan0 · 10 months ago
After these videos on convolution, it would be cool to see you do a series on the convolution of filters, and a video on the complex plane math used to design filters would be cool as well. I'm in that spot where I know the z-plane math works, but I don't have a full intuition for why.
@shaiguitar · 10 months ago
Love your videos is an understatement. Speaking of distributions, any chance 3b1b fans can get a video on optimal transport??
@ShivamSharma-ob8ix · 10 months ago
Grant was and is my source of inspiration to master mathematics and to become linguistically accurate! One of my heroes ❤.
@Qermaq · 10 months ago
4:57 "For a mildly technical reason you need to divide by sqrt2" - dude that was the first thing I really understood so far! :D But this is more advanced than any math I've studied so I'm ok. I'm glad this seemed obvious to me.
@LiborTinka · 10 months ago
Reminds me of old days of programming digital image processing, where we used a speed-up trick of repeatedly applying box function to approximate gaussian filter. It was really fast and no floating point math was required.
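That trick works because of the CLT itself: each box pass convolves the kernel with another uniform, so the effective kernel races toward a Gaussian. A Python sketch of mine (signal length, radius, and pass count are arbitrary):

```python
import math

def box_blur(signal, radius):
    # moving average over a window of width 2*radius + 1, zero padding at the ends
    w = 2 * radius + 1
    out = []
    for i in range(len(signal)):
        lo, hi = max(0, i - radius), min(len(signal), i + radius + 1)
        out.append(sum(signal[lo:hi]) / w)
    return out

# blur an impulse repeatedly: the effective kernel tends to a Gaussian (CLT)
n = 201
signal = [0.0] * n
signal[n // 2] = 1.0
for _ in range(4):                  # 4 passes of a radius-3 box
    signal = box_blur(signal, 3)

# one box over w points has variance (w*w - 1) / 12, and variances add
var = 4 * (7 * 7 - 1) / 12
err = max(abs(signal[i] - math.exp(-(i - n // 2) ** 2 / (2 * var))
              / math.sqrt(2 * math.pi * var)) for i in range(n))
print(err)   # small relative to the peak height of roughly 0.1
```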
@adwaitpandey2526 · 10 months ago
Was waiting for this video for a long time
@jeanw4287 · 10 months ago
astounding quality as always
@garancecordonnier230 · 3 months ago
Thank you for this (once again) amazing video ! I think seeing this property as "the mean of 2 identical gaussians is this same gaussian" gives another intuitive reason for the CLT : there are a lot of links between convergence and fixed points ! Not all fixed points are attractive, but nevertheless finding a fixed point of some process might give you the intuition that this process tends to transform everything into its fixed point (Not really used to talking about math in english so i hope this is understandable)
@monku1521 · 10 months ago
Thank you for the shoutout at the end! -Daksha
@coreyyanofsky · 10 months ago
At 5:44, being super clear and specific: the properties that imply a 2D Gaussian are (i) a function of x and y only through r, and (ii) independence, expressed as the functional equation g(r) = f(x)f(y). You mention independence earlier and it's on the screen in the upper right, but I think it's worth emphasizing that it's essential to the derivation.
@michalchik · 10 months ago
Here's a little idea that I figured out while thinking about catalysts in my high school chemistry class. There is a mysterious fact that's taught just for rote memorization in chemistry: catalysts lower the activation energy, but they don't shift equilibria. This is broadly explained as: if catalysts could shift equilibria, then it would be possible to add and remove catalysts from a reaction chamber, shift the equilibrium back and forth, and essentially build a perpetual motion machine from which you could generate power. This fact was mysterious to me until I realized that the distribution of energies in molecules bouncing around a reaction chamber approaches the normal distribution. The amount of each reactant and product is only determined by the relative differences in energy and the temperature, not the ease of transition. This would not be true for any other distribution I can think of.
@musicarroll · 10 months ago
More generally, linear transformations of Gaussian-distributed random vectors are also Gaussian random vectors. This is one of the main reasons why Kalman filtering works. BTW, convolution is also a bilinear transformation on L^p spaces.
@mohamedbenrokhrokh2293 · 10 months ago
Sweety!! The guy with 25% brown 75% blue uploaded
@simplyshocked9908 · 4 months ago
Very nice! Have you thought about making a video on the concentration of measures phenomenon in higher dimensions?
@FloydMaxwell 10 months ago
Clicked on the video knowing it would be over my head. Was not disappointed.
@otakultur5624 10 months ago
I haven't watched it yet, but thank you for this video; none of my university teachers ever explained this when I was studying probability!
@raymondfrye5017 10 months ago
Because they never apply it.
@JulianCrypto 10 months ago
Impressive work
@_kopcsi_ 10 months ago
I still think that the explanation with the Fourier transform, the characteristic function (moment generating function), and entropy makes much more sense than the explanation shown in these 4-5 videos. The self-Fourier nature of the Gaussian distribution is the key factor, and it is naturally related to its special property under convolution.
@NicoSmets 10 months ago
Gorgeous videos.
@lucasf.v.n.4197 10 months ago
I love u sir, ur animations are awesome ❤
@berryesseen 10 months ago
Another important question is how fast the distribution of the normalized sum of iid random variables converges to that of the Gaussian. One way to quantify this is to look at max_A |P[S in A] − P[W in A]|, where S is the sum and W is the Gaussian. This maximum scales as constant/sqrt(N), which is known as the Berry-Esseen theorem; the constant depends on the third moment and the variance. If you need an intuition for why the scaling is 1/sqrt(N), the answer is the gaps between the cumulants of S and W. Their first 2 cumulants are the same by design (mean and variance). Cumulants of W from the 3rd on are all zero, while cumulants of S from the 3rd on go as c/sqrt(N), d/N, … If you relate this gap to the inverse Fourier transform, you get probability gaps, and that c/sqrt(N) gap in the third cumulant leads to the scaling in the Berry-Esseen theorem. The order of scaling (1/sqrt(N)) is also quite universal: you don't necessarily need iid. For example, it works for independent (non-identical) sums and Markov-1 sums, the dimension can be more than 1, and you can even pass the sum through a smooth function.
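The 1/sqrt(N) scaling can be seen concretely for sums of fair coin flips, where the CDF of the standardized Binomial(N, 1/2) is exactly computable. A minimal sketch; the choice of N = 16 and N = 64 is illustrative:

```python
import math

def normal_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def kolmogorov_distance(n):
    """sup_x |F_n(x) - Phi(x)| for the standardized Binomial(n, 1/2)."""
    d = 0.0
    cdf = 0.0
    for k in range(n + 1):
        pmf = math.comb(n, k) / 2 ** n
        z = (k - n / 2) / math.sqrt(n / 4)
        phi = normal_cdf(z)
        # compare Phi at each jump with both the left and right CDF limits
        d = max(d, abs(cdf - phi), abs(cdf + pmf - phi))
        cdf += pmf
    return d

d16, d64 = kolmogorov_distance(16), kolmogorov_distance(64)
ratio = d16 / d64
# quadrupling N should roughly halve the gap (the 1/sqrt(N) scaling)
assert d64 < d16
assert 1.7 < ratio < 2.3
print(f"D(16)={d16:.4f}, D(64)={d64:.4f}, ratio ~ {ratio:.2f}")
```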
@TonyWangYQ 10 months ago
Please also make a video on logistic regression - specifically how the sigmoid function implies probability. I think this would be an interesting topic! Thanks!!
@chiyosa7041 9 months ago
I love it sooooooo much!!! Can you please also do a video on Principal Component Analysis/Regression?
@Sugarman96 10 months ago
I don't know if it gets mentioned, but I think one of the beautiful aspects of the Gaussian is that it's an eigenfunction of the Fourier transform, meaning the Fourier transform of a Gaussian function is just a Gaussian function. That's another way to look at why the sum of two Gaussians is a Gaussian, because the convolution turns into a product, so of course the product of two Gaussians is going to remain a Gaussian.
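The convolution identity itself is easy to verify numerically without any Fourier machinery; a minimal sketch, with arbitrary illustrative means and standard deviations:

```python
import math

def gaussian(x, mu, sigma):
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

# Two Gaussian densities with illustrative parameters
mu1, s1 = 1.0, 0.8
mu2, s2 = -0.5, 0.6

# (f * g)(z) via a Riemann sum: integral of f(t) g(z - t) dt over [-10, 10]
dx = 0.02
ts = [i * dx for i in range(-500, 501)]
z = 0.5
conv = sum(gaussian(t, mu1, s1) * gaussian(z - t, mu2, s2) for t in ts) * dx

# Theory: the convolution is Gaussian with mean mu1 + mu2 and
# variance s1^2 + s2^2.
expected = gaussian(z, mu1 + mu2, math.sqrt(s1 ** 2 + s2 ** 2))
assert abs(conv - expected) < 1e-6
print("convolution matches N(mu1 + mu2, s1^2 + s2^2) at z = 0.5")
```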
@MrTyty527 10 months ago
wayyyyy too beautiful i cannot process
@iankrasnow5383 10 months ago
Would the same apply to a Laplace transform?
@joeyhardin5903 10 months ago
@iankrasnow5383 yes
@ghkthILAY 15 days ago
I think there must be a connection there with the Fourier transform and the convolution theorem. Under the FT, Gaussians are mapped to Gaussians; combining that with the convolution theorem, it seems like you would get the same result: the sum of two normally distributed variables is normally distributed.
@omargaber3122 10 months ago
Am I the only one who does not find pleasure in statistical functions, and prefers topics that talk about deterministic functions and definite equations?
@rafaelschipiura9865 10 months ago
This isn't Statistics, it's Probability. There are no random processes at all in the video, everything Grant talked about in this entire series is entirely deterministic.
@rayhanlahdji 10 months ago
Freshman me would thank you a lot. "Why normal?" was the biggest unanswered question throughout my stats undergrad.
@Hailfire08 10 months ago
I'm a physics person, so maybe that's why, but I really like the Fourier-space argument: in Fourier space, a convolution becomes a product; near the origin, any normalised PDF with finite variance looks like (1 − s²x²/2)e^imx, where m is the mean and s is the standard deviation; multiply a bunch of things of this form together and, by the definition of the exponential, you get out a Gaussian. And then a Gaussian in Fourier space is a Gaussian in real space. It also neatly tells you why distributions with infinite variance don't work, since they can't be written in that form.
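The "definition of the exponential" step here is the limit (1 − x/N)^N → e^(−x). A minimal sketch for a standardized summand (mean 0, variance 1), whose characteristic function near the origin behaves like 1 − t²/2:

```python
import math

def product_cf(t, n):
    """CF of the normalized sum of n mean-0, variance-1 summands,
    using the second-order expansion phi(u) ~ 1 - u^2/2 at u = t/sqrt(n)."""
    return (1 - t ** 2 / (2 * n)) ** n

t = 1.5
for n in (10, 100, 10_000):
    print(n, product_cf(t, n))

# The product converges to the Gaussian characteristic function exp(-t^2/2)
assert abs(product_cf(t, 10_000) - math.exp(-t ** 2 / 2)) < 1e-3
print("phi(t/sqrt(N))^N -> exp(-t^2/2)")
```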
@mylonoceda 10 months ago
When I learned that the area under the Gaussian curve and Γ(1/2) are the same and equal to sqrt(π) I was blown away. It was like seeing an interesting cameo in my favourite movie.
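This coincidence is easy to check numerically; a minimal sketch using only the standard library:

```python
import math

# Gamma(1/2) and the Gaussian integral are both sqrt(pi).
print(math.gamma(0.5))
print(math.sqrt(math.pi))

# Numerically integrate exp(-x^2) over [-10, 10] with a Riemann sum
# (the tails beyond |x| = 10 are negligible).
dx = 0.001
integral = sum(math.exp(-(i * dx) ** 2) for i in range(-10_000, 10_001)) * dx

assert abs(math.gamma(0.5) - math.sqrt(math.pi)) < 1e-12
assert abs(integral - math.sqrt(math.pi)) < 1e-3
print("Gamma(1/2) = integral of exp(-x^2) dx = sqrt(pi)")
```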
@sherifffruitfly 10 months ago
Now this is what I call a series finale!
@jacoblojewski8729 10 months ago
The convolution of two Gaussians makes me think of some sort of metric (or pseudo-metric) space of integrable probability densities with finite variance, modded out by linear transformations of the dependent/independent variables, with a contraction mapping theorem on it. Then the CLT would be a sort of "global" contraction mapping theorem. I wonder if that's provable or even makes sense; gonna go tinker around!
@ofallcodeandmore5502 10 months ago
Your videos are the best