
Why We Divide by N-1 in the Sample Variance (Standard Deviation) Formula | Bessel's Correction

DataMListic
9K subscribers
8K views

In this video we discuss why and when we divide by n-1 instead of n in the sample variance and sample standard deviation formulas, known in the statistics literature as Bessel's correction. This method corrects the bias in the estimation of the population variance, but only partially corrects the bias in the estimation of the population standard deviation (which is why I didn't include the standard deviation in this video).
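As a quick sanity check of the claim above (a sketch of my own, not code from the video; the mean of 175 and std of 6 mirror the example discussed in the comments), a short simulation shows that dividing by n underestimates the population variance by a factor of (n-1)/n, while dividing by n-1 removes the bias:

```python
import random

random.seed(0)
n = 4              # sample size
trials = 200_000   # number of simulated samples
sigma2 = 6.0 ** 2  # true population variance = 36

biased_sum = 0.0
unbiased_sum = 0.0
for _ in range(trials):
    xs = [random.gauss(175.0, 6.0) for _ in range(n)]
    m = sum(xs) / n
    ss = sum((x - m) ** 2 for x in xs)
    biased_sum += ss / n          # divide by n (biased)
    unbiased_sum += ss / (n - 1)  # divide by n-1 (Bessel's correction)

print(biased_sum / trials)    # ≈ (n-1)/n * sigma2 = 27
print(unbiased_sum / trials)  # ≈ sigma2 = 36
```

Averaged over many samples, the divide-by-n estimator lands near 27 rather than 36, which is exactly the (n-1)/n shrinkage the video derives.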
References
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
Expected value of sample variance proof: proofwiki.org/wiki/Bias_of_Sa...
Related Videos
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
Why neural networks are universal functions approximators: • Why Neural Networks Ca...
Bagging vs Boosting: • Bagging vs Boosting - ...
Why we need activations in neural nets: • Why We Need Activation...
Bias variance Trade-off: • Why Models Overfit and...
Neural networks on tabular data: • Why Deep Neural Networ...
Contents
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
00:00 - Intro
00:19 - Population vs Sample Statistics
01:22 - Population vs Sample Biased Variance Example
02:13 - Expected Value of the Biased Variance
03:34 - Bias Source Intuition
04:38 - Degrees of Freedom
05:55 - Outro
Follow Me
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
🐦 Twitter: @datamlistic
📸 Instagram: @datamlistic
📱 TikTok: @datamlistic
Channel Support
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
The best way to support the channel is to share the content. ;)
If you'd like to also support the channel financially, donating the price of a coffee is always warmly welcomed! (completely optional and voluntary)
► Patreon: / datamlistic
► Bitcoin (BTC): 3C6Pkzyb5CjAUYrJxmpCaaNPVRgRVxxyTq
► Ethereum (ETH): 0x9Ac4eB94386C3e02b96599C05B7a8C71773c9281
► Cardano (ADA): addr1v95rfxlslfzkvd8sr3exkh7st4qmgj4ywf5zcaxgqgdyunsj5juw5
► Tether (USDT): 0xeC261d9b2EE4B6997a6a424067af165BAA4afE1a
#variance #standarddeviation #bias #statistics

Published: 23 Jul 2024

Comments: 36
@datamlistic
@datamlistic 1 year ago
I had a small cold while recording the video, hope my voice didn't sound too weird...
@JoaqoRiquelme
@JoaqoRiquelme 9 months ago
You were great! Thanks for the effort. I learned a lot.
@datamlistic
@datamlistic 8 months ago
@@JoaqoRiquelme Thanks a lot! I am happy to hear you found the video helpful! :)
@pushkargarg4946
@pushkargarg4946 1 month ago
I have been going through other videos that use the same number-line example, and no one even talks about the DOF reason. Great vid, man, thanks.
@datamlistic
@datamlistic 1 month ago
Many thanks! Glad you think so about this explanation! :)
@sayangoon2180
@sayangoon2180 6 months ago
Crisp and clear. Thank you!
@datamlistic
@datamlistic 6 months ago
Thanks! Glad you liked it! :)
@AlbinJames
@AlbinJames 1 year ago
Thanks for this excellent presentation! It's good to see material on such subtle aspects of statistics, and your friendly way of building an intuition around it :) This was way more accessible than the Wikipedia entry and encourages one to continue studying it.
@datamlistic
@datamlistic 1 year ago
Thank you for your kind words! I am happy to hear that you found this video useful. :)
@iacobsorina6924
@iacobsorina6924 1 year ago
Very clean explanation! Thank you!😊
@datamlistic
@datamlistic 1 year ago
Glad it was helpful! :)
@alnune03
@alnune03 9 months ago
Thank you! Excellent explanation!
@datamlistic
@datamlistic 9 months ago
Thanks! Happy you enjoyed it! :)
@gmaxmath
@gmaxmath 5 months ago
Beautiful explanation. I will share this with my students.
@datamlistic
@datamlistic 5 months ago
Thanks! Happy to hear that you liked it! :)
@szngyun8891
@szngyun8891 4 months ago
Thanks for the proof, sir.
@klevisimeri607
@klevisimeri607 7 months ago
Very nice!
@datamlistic
@datamlistic 7 months ago
Thanks!
@bharathishankar4870
@bharathishankar4870 5 months ago
Very Well Explained
@datamlistic
@datamlistic 5 months ago
Thanks! Glad you liked it! :)
@benlee3545
@benlee3545 2 months ago
Hi DataMListic, not sure if you can share your dataset so that I can see how this population mean = 175 is calculated. I did try a few samples and yes, they are all below 175, but how is this 175 derived? Also, the USL and LSL, if provided, would be extremely helpful. Thank you in advance.
@datamlistic
@datamlistic 2 months ago
Thanks for the question! I simply defined a Gaussian with a mean of 175 and a std of 6 and took samples from it, nothing more, nothing less.
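For anyone wanting to reproduce that setup, a minimal sketch (the seed and population size are assumptions of mine; the reply above only specifies a mean of 175 and a std of 6):

```python
import random

random.seed(42)  # seed is an assumption; the author doesn't specify one
# Draw a large "population" from a Gaussian with mean 175 and std 6.
population = [random.gauss(175.0, 6.0) for _ in range(100_000)]
mu = sum(population) / len(population)
print(mu)  # close to 175
```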
@benlee3545
@benlee3545 2 months ago
@@datamlistic Sir, noted and thank you very much for your reply.
@tutorchristabel
@tutorchristabel 1 year ago
Well explained
@datamlistic
@datamlistic 1 year ago
Thank you!
@user-ot2vx8pd5u
@user-ot2vx8pd5u 10 months ago
Very well explained 😢
@benlee3545
@benlee3545 2 months ago
Hi DataMListic, I tried a sample of 160, 155, 180, 190 and I get a variance of 204, which is far larger than 36. By the way, when we do sampling, we never know the population mean since the population is so big. So how do I know when to use N-1, the Bessel correction?
@datamlistic
@datamlistic 2 months ago
Hi Ben Lee, thanks for the question. How did you calculate the variance, and what formula did you use? It seems to be really off. To answer your second question: the population is more of a theoretical concept in statistics, and you almost never have access to it. I suggest always using the Bessel correction to compute the variance, unless it's clearly stated that the samples you've got are the entire population.
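For reference, here is how the two formulas play out on the sample from the comment above (160, 155, 180, 190). The 204 figure matches the biased divide-by-n formula, and a single sample of four values can legitimately land far from the true variance of 36; that is sampling variability, not a calculation error:

```python
xs = [160, 155, 180, 190]
n = len(xs)
m = sum(xs) / n                     # sample mean = 171.25
ss = sum((x - m) ** 2 for x in xs)  # sum of squared deviations = 818.75
print(ss / n)        # biased variance: 204.6875
print(ss / (n - 1))  # Bessel-corrected variance: ~272.92
```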
@benlee3545
@benlee3545 2 months ago
@@datamlistic Sir, thank you very much.
@datamlistic
@datamlistic 2 months ago
Glad I could help! :)
@benlee3545
@benlee3545 2 months ago
@@datamlistic Yes! That is a big help.
@dennisestenson7820
@dennisestenson7820 3 months ago
5:00
@datamlistic
@datamlistic 3 months ago
?