
The Sigmoid Function Clearly Explained 

Power H
1.2K subscribers
101K views

Published: 21 Aug 2024

Comments: 112
@atheneus 3 years ago
I'm shocked to see that this channel is so small. Please keep uploading, you're a goldmine.
@Administrator80085 4 years ago
Truly CLEARLY explained! Good job. I look forward to seeing the rest of the series.
@DubZenStep 2 years ago
This is the clearest explanation ever made. The world needs more people like you, my friend. Thank you so much, you are a game changer.
@pierocruz6191 3 years ago
Wow, I didn't think I would get the idea from a YouTube video. Thanks!!
@Jake-ms9dr 2 years ago
I decided to learn a little bit about the sigmoid so I could understand neural nets slightly better, and I'm glad I did. Very good video.
@nononnomonohjghdgdshrsrhsjgd 3 months ago
Finally someone who understands it. I can't believe how people lack grounding in plotting functions but are already studying machine learning. Thank you very much.
@alessandrobelottidev 3 years ago
THIS IS THE MOST AMAZING VIDEO ABOUT THE SIGMOID. It really helped me understand machine learning better @16
@rabistonechoy4656 2 years ago
Ah.. because I didn't learn math properly, I couldn't understand why this works as a binary activation. Thank you for explaining.
@guanlunzeng9332 3 years ago
Thanks for the video! A very easy and clear way to understand the sigmoid function.
@upperground9543 2 years ago
This is such a great explanation. So much important information is packed into this one video. Thank you very much for saving my time! Keep up the great work. The programming community needs more people like you.
@sachinsharma-ur2dq 3 years ago
Thanks for a clear and to-the-point explanation...
@heidik1757 3 years ago
Best way of explaining it, in my eyes. Cheers!
@Kugelschrei 3 years ago
Thank you so much, this really helped me grasp the concept. I always wondered how big inputs (or many inputs) result in 0-1 ranges when leaving the node, since they are all added and multiplied beforehand. I knew the activation function takes care of it, but had no idea how exactly.
@du42bz 2 years ago
Epic pfp
@himanshugautam7318 3 years ago
Amazing explanation. Looks like I will never forget this concept, as it was explained so well.
@zwdxff5493 2 years ago
The best explanation ever
@powerh5888 3 years ago
Follow me on Medium: dr-younes-henni.medium.com/
@hijack80 4 years ago
Clear as day. Thank you for the amazing explanation.
@ekinkaganozkan 3 years ago
This is amazing! Perfectly explained!
@omarespino964 1 year ago
EXCELLENT explanation. Thank you!
@arthurpendragoon574 2 years ago
Thanks, your video is easy to understand. I hope you keep working on more things like this.
@willbutonline784 4 months ago
This is fantastically visualized and explained! :D
@legendfpv 2 years ago
Best video on the topic. Thanks
@GEB-Loop 3 years ago
Great explanation. Thank you!
@amingilani 1 year ago
You should create more videos. Beautifully explained.
@spawnerbest2718 2 years ago
Really clear and straightforward explanation, thanks a lot!
@dereksavage8728 3 years ago
Why would anyone dislike? Beautiful presentation and explanation.
@hkrishnan91 3 years ago
Thank you very much. Clear explanations. Cheers!
@belfloretkoriciza5279 1 year ago
Keep it up, guys, we really appreciate your effort.
@Louie_sangalang 2 years ago
Awesome! Clearly and concisely explained! 👍🏼👍🏼
@waelmikaeel4244 3 years ago
This video is just perfect. Thanks a lot, buddy.
@powerh5888 3 years ago
Thanks for watching :)
@mohshas9116 3 years ago
Good job, I hope you continue.
@joshuabattlehammer6058 3 years ago
This video is amazing, great job!
@aashishaxis5108 4 years ago
Job well done... thank you for the spot-on explanation.
@ricardogabrielcardozocontr3653 2 years ago
Excellent explanation! Congratulations, you are a great educator!
@BeMyMotive 3 years ago
Thanks for the detailed explanation!
@SimpleLivingHigherThinking 1 month ago
Thank you so much, now the intuition behind the logistic regression formula for classification problems is much clearer 😀
@panditjimbawale8850 3 years ago
Very nice explanation!
@rodrigo10239 3 years ago
Great explanation, thanks a lot!
@powerh5888 3 years ago
Thanks for watching :)
@amandeepsaha 1 year ago
Wow, nicely explained.
@tanjimashraf803 1 year ago
Simply yet effectively explained. Please consider making more videos and continuing. I visited your channel; it is a great initiative.
@HistoricalPlayer 3 months ago
Best visual explanation!
@rajqsl5525 5 months ago
Very helpful!! Keep up the good work. Thanks
@stevenwilson5556 3 years ago
Very cool, thank you.
@StarlitWitchy 4 months ago
Thank you so much for this detailed and informative video!!
@valentinberrios5927 3 years ago
Awesome video! Thanks so much for this.
@amazingvipul8392 2 years ago
Awesome explanation, thank you 👍
@Joel-fs5zh 2 years ago
Thank you; clear, concise, and I liked the graphics.
@vadivelan4228 2 years ago
Wow.. Amazing presentation.
@georgezevallos 1 year ago
What a wonderful video!
@ganeshsubramanian6217 2 years ago
Very well explained, thank you!
@youssefbouraha7026 2 years ago
Amazing channel, I hope you can upload more videos.
@nyotowijaya7949 2 years ago
This is a masterpiece.
@ishantguleria870 2 years ago
Amazing explanation
@muslusudurag8338 3 years ago
Excellent
@karthikm1558 1 year ago
Waiting for new videos, bro. Excellent explanation 😁🤠
@largewallofbeans9812 1 year ago
The inverse of e^x is not e^-x. That would be the reciprocal. The inverse is ln(x).
@tagoreji2143 7 months ago
Both notations are the same, right? 🤔
@StarlitWitchy 4 months ago
Well it is the multiplicative inverse
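For reference, a short worked note on the distinction (standard notation, not taken from the video): e^-x is the reciprocal of e^x, while the inverse function of e^x is ln(x); the sigmoid can be written with or without the reciprocal form.

\[
\frac{1}{e^{x}} = e^{-x} \quad \text{(reciprocal)}, \qquad
f(x) = e^{x} \;\Rightarrow\; f^{-1}(x) = \ln x \quad \text{(inverse function)},
\]
\[
\sigma(x) \;=\; \frac{1}{1 + e^{-x}} \;=\; \frac{e^{x}}{1 + e^{x}}.
\]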
@samirpashayev5946 2 years ago
That was concise and simple. Thank you very much!
@Aklys 3 years ago
This was great. I had heaps of trouble finding something that explains why the function is written the way it is. Thanks. Any idea why we specifically use Euler's number vs. any other base? Or is that just because it's commonly utilised as the "natural" exponential function?
@powerh5888 3 years ago
yes, exponential uses e by default.
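A brief follow-up on that question (a standard identity, not something stated in the video): any base can be rewritten in terms of e, so the choice of base only rescales the input, and e is the convention because it gives the simplest derivative.

\[
a^{x} = e^{x \ln a}
\quad\Longrightarrow\quad
\frac{1}{1 + a^{-x}} = \frac{1}{1 + e^{-(\ln a)\,x}} = \sigma\!\big((\ln a)\,x\big),
\qquad
\sigma'(x) = \sigma(x)\,\big(1 - \sigma(x)\big).
\]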
@samkplaylists2331 2 years ago
Wow, this is a very great explanation.
@internetuser2399 2 years ago
This video was an incredible help, ty for making it. Does anyone know why there's a limit at x=0? I thought limits were only for when x neared impossible values, but x=0 doesn't seem to break anything? My knowledge of calculus is still pretty low. Thanks.
@cognosagedev 2 years ago
We take the limit at x=0 just to find the threshold value of the sigmoid function, so that we can draw the border (threshold) point; on the basis of that value we can predict whether our independent variable lies in the first category (in this case, cat) or in the second category (dog). Hope you understand.
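A minimal sketch of that threshold idea in Python (NumPy assumed; the cat/dog labels just mirror the classification example above, and the inputs are made up):

import numpy as np

def sigmoid(x):
    # Maps any real input into the interval (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

print(sigmoid(0.0))  # 0.5 -- the border/threshold value at x = 0
for z in [-3.0, -0.1, 0.2, 4.0]:  # made-up model outputs
    label = "cat" if sigmoid(z) >= 0.5 else "dog"
    print(z, round(sigmoid(z), 4), label)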
@kanishk9490 2 years ago
Loved the explanation, mate!
@cocoarecords 3 years ago
Wow, this is perfect, man.
@GGGGGGGGGG96 7 months ago
Unfortunately, mathematicians cannot explain mathematics to non-mathematicians 😅. If I were to explain the sigmoid function clearly, I would start with the problem itself and what needs to be solved before starting with any numbers and formulas.
@youneszahr9097 3 years ago
Thank you!
@MFM88832 3 years ago
Also important to point out that the sigmoid function is a special case of the logistic function with the parameters L=1, k=1 and x0=0.
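Spelling that out (the standard parametrisation of the logistic function, not shown in the video): with maximum value L, steepness k and midpoint x0, setting all three to those defaults recovers the sigmoid.

\[
f(x) = \frac{L}{1 + e^{-k(x - x_0)}},
\qquad
L = 1,\; k = 1,\; x_0 = 0
\;\Longrightarrow\;
f(x) = \frac{1}{1 + e^{-x}} = \sigma(x).
\]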
@nguyennam9003 4 years ago
Understood, thanks.
@powerh5888 4 years ago
My pleasure 😇
@thinkmath4270 5 days ago
Thank you, great explanation.
@md.mohiulislam6516 2 years ago
Very good explanation
@entrepreneurways9465 2 years ago
He has a nice voice!
@vipnirala 1 year ago
Great explanation. Thanks
@pawestenzel3128 2 years ago
So useful
@Webenefit_Youtube 2 months ago
Thank you for the clear explanation.
@nagame859 3 years ago
👍👍👍
@felipecavalcante8419 2 years ago
Very enlightening, thanks for it.
@arfatahmedansari8916 3 years ago
Even after watching it carefully I did not understand it properly. Can someone tell me the prerequisites to understand this?
@lordblanck7923 3 years ago
Zero, except the last part where he talked about the classification problem. There is nothing else to learn; he explained everything in the video.
@arfatahmedansari8916 3 years ago
@@lordblanck7923 okay
@atheneus 3 years ago
@@arfatahmedansari8916 yes, you'll need to learn basic to intermediate calculus for this
@legendfpv 2 years ago
@@lordblanck7923 that's an absolute lie
@alfonsoramirezelorriaga1153
Clearly explained indeed.
@tagoreji2143 7 months ago
Thank you. Good explanation.
@waqarahmad-gm2rc 1 year ago
So, in other words, the sigmoid function should be used only WHILE training, and then testing should just use a comparison operator to see if the value is more or less than 0.5?
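One way to see what that would mean in code (a sketch with made-up logit values, NumPy assumed; not something shown in the video): because the sigmoid is monotonic, thresholding its output at 0.5 is equivalent to thresholding the raw score at 0, so at test time the comparison alone gives the same predictions.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

z = np.array([-2.3, -0.4, 0.1, 5.0])   # raw model outputs (logits), made-up values
pred_via_sigmoid = sigmoid(z) > 0.5    # decision based on the probability
pred_via_score   = z > 0.0             # the shortcut comparison at test time
assert (pred_via_sigmoid == pred_via_score).all()
# During training the sigmoid is still needed, since the loss (e.g. cross-entropy)
# is computed on the probability, not on the hard 0/1 decision.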
@user-nz7jh8gf3b 1 year ago
Hmm, the inverse of y=e^x is y=ln(x), yeah? You were just putting a negative over the x, not taking the inverse. But super nice video on the sigmoid function!
@largewallofbeans9812 1 year ago
I think he meant reciprocal.
@arunthashapiruthviraj2783 9 months ago
Clear explanation
@hanielulises841 2 years ago
It's so funny when you realize that videos on superficial AI concepts get a lot of views, while videos on specific topics get just a few.
@giovannipugliese3287 4 months ago
Very good
@kalimismilequest 3 months ago
Thank you
@jun0z_700 6 months ago
Very useful, thank you!
@jt1738x 3 years ago
Ahhh, dogs and cats. AI, you've struck again.
@sunitatripathi2891 2 months ago
Thank you 😊
@basuml7657 2 years ago
Wow, amazing!!
@josephpravinp6662 2 years ago
This was great. One doubt: how does lim x → −∞ make e^(−x) → ∞?
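A one-line way to see it (standard limit reasoning, not a quote from the video): as x goes to negative infinity, the exponent −x goes to positive infinity.

\[
x \to -\infty
\;\Longrightarrow\; -x \to +\infty
\;\Longrightarrow\; e^{-x} \to \infty
\;\Longrightarrow\; \sigma(x) = \frac{1}{1 + e^{-x}} \to 0.
\]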
@ahaemsinghania8420 10 months ago
When I coded a sigmoid function and input -10, the value was greater than 1.
@tagoreji2143 7 months ago
Then your code is wrong
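For comparison, a minimal correct implementation (Python standard library only; the printed values are the expected ones). One possible cause of a result greater than 1, offered only as a guess, is an operator-precedence slip such as 1/1 + exp(-x), which evaluates to 1 + e^10 for x = -10.

import math

def sigmoid(x):
    # Always returns a value strictly between 0 and 1
    return 1.0 / (1.0 + math.exp(-x))

def buggy_sigmoid(x):
    # Precedence bug: this computes (1/1) + e^(-x), not 1/(1 + e^(-x))
    return 1 / 1 + math.exp(-x)

print(sigmoid(-10))        # ~4.54e-05, well below 1
print(buggy_sigmoid(-10))  # ~22027.47 -- one way to get a value above 1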
@lhadz7290 9 months ago
Thank You
@saifqawasmeh9664 8 months ago
Thanks!
@tusharplug 1 year ago
👏
@makannafarian1448 3 years ago
Excellent Body :)
@deedetres703 1 year ago
Diminishing marginal returns indeed - let's change a few benchmarks in 2023!!
@z2a1f44 1 year ago
Bro's pronunciation of "negative" seems to be racist lmao😆😆😆😆😆
@samarthsharma6993 6 months ago
😂 Bruh, you mentioned that, and now I cannot ignore it
@flaviograf4572 2 years ago
Excellent