Machine Learning TV
This channel is all about machine learning (ML). It collects resources that help ML enthusiasts and computer science students gain a better understanding of the concepts of this successful branch of Artificial Intelligence.
Bootstrap and Monte Carlo Methods
17:15
A year ago
Understanding The Shapley Value
16:17
2 years ago
Kalman Filter - Part 2
5:01
3 years ago
Kalman Filter - Part 1
8:35
3 years ago
Understanding Word Embeddings
13:22
4 years ago
DBSCAN: Part 2
6:58
5 years ago
DBSCAN: Part 1
8:21
5 years ago
Comments
@alidogramaci7468 20 days ago
Thanks!
@MachineLearningTV 18 days ago
Thank you very much.
@deepbayes6808 23 days ago
At 32:40, shouldn't this be log of sum_k instead of sum_k of log?
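A hedged note on this point, assuming the timestamp refers to the usual mixture-model likelihood (the slide itself is not reproduced here): the observed-data log-likelihood is indeed a log of a sum over components, while a sum of logs only appears in the complete-data likelihood used inside EM.

```latex
% Assuming a K-component mixture p(x) = \sum_k \pi_k \, p(x \mid \theta_k).
% Observed-data log-likelihood: a log of a sum over components.
\log p(X \mid \theta) = \sum_{n=1}^{N} \log \sum_{k=1}^{K} \pi_k \, p(x_n \mid \theta_k)
% Complete-data log-likelihood (latent assignments z_{nk} known): a sum of logs.
\log p(X, Z \mid \theta) = \sum_{n=1}^{N} \sum_{k=1}^{K} z_{nk}
    \bigl( \log \pi_k + \log p(x_n \mid \theta_k) \bigr)
```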
@Hello-gf2og A month ago
*second time 3:50 😬
@homeycheese1 A month ago
will coordinate descent always converge using LASSO even if the ratio of number of features to number of observations/samples is large?
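A hedged side note on this question: for the standard Lasso objective, coordinate descent converges to a global minimum even when the number of features far exceeds the number of samples, because the objective is convex and the L1 penalty is separable across coordinates (though the minimizer need not be unique, and more iterations may be required). A minimal sketch with scikit-learn, whose Lasso solver uses coordinate descent; the data shapes and alpha value are illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import Lasso

# Illustrative p >> n setting (all numbers are assumptions): 50 samples, 500 features.
rng = np.random.default_rng(0)
n, p = 50, 500
X = rng.normal(size=(n, p))
beta_true = np.zeros(p)
beta_true[:5] = 2.0                      # only 5 informative features
y = X @ beta_true + 0.1 * rng.normal(size=n)

# scikit-learn's Lasso is fit by coordinate descent; the convex, separable
# L1 penalty is what lets per-coordinate updates reach a global optimum.
model = Lasso(alpha=0.1, max_iter=10_000)
model.fit(X, y)
print("non-zero coefficients:", int(np.sum(model.coef_ != 0)))
```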
@muhammadaneeqasif572 A month ago
Amazing, great to see some good content again. Thank you, YT algorithm. Keep it up!
@stewpatterson1369 A month ago
Best video I've seen on this. Great visuals and explanation.
@pnachtwey 2 months ago
This works OK on nice functions like g(x,y) = x^2 + y^2, but real data often looks more like the Grand Canyon, where the path is very narrow and very winding.
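To make the narrow-winding-valley point concrete, here is a minimal sketch (not from the video) using the Rosenbrock function, a standard example of such a valley; the starting point, step size, and iteration budget are arbitrary assumptions.

```python
import numpy as np

# Rosenbrock function: a classic narrow, curved valley with its minimum at (1, 1).
def rosenbrock(x, y):
    return (1 - x) ** 2 + 100 * (y - x ** 2) ** 2

def rosenbrock_grad(x, y):
    dx = -2 * (1 - x) - 400 * x * (y - x ** 2)
    dy = 200 * (y - x ** 2)
    return np.array([dx, dy])

# Plain gradient descent creeps along the valley floor very slowly.
point = np.array([-1.0, 1.0])   # assumed starting point
lr = 1e-3                       # assumed step size
for _ in range(20_000):
    point = point - lr * rosenbrock_grad(*point)

print("after 20,000 steps:", point, "f =", rosenbrock(*point))
```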
@sELFhATINGiNDIAN 2 months ago
No
@kacpersarnowski7969 2 months ago
Great video, you are the best :)
@frielruambil6275 2 months ago
Thanks very much. I was looking for videos like this to answer my assignment questions, and you answered all of them at once within 3 minutes. I salute you; please keep making more videos to help students pass their exams and assignments.
@NeverHadMakingsOfAVarsityAthle 3 months ago
Hey! Thanks for the fantastic content :) I'm trying to understand the additivity axiom a bit better. Is this axiom the main reason why Shapley values for machine learning forecasts can simply be added up for one feature over many different predictions? Let's say we have predictions for two different days in a time series, and each time we calculate the Shapley value for the price feature. Does the additivity axiom then imply that I can add up the Shapley values for price for these two predictions (assuming they are independent) to make a statement about the importance of price over multiple predictions?
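A hedged illustration of the additivity/linearity point, not taken from the video: for a toy two-feature linear model, exact Shapley values can be computed by averaging marginal contributions over all feature orderings, and per-prediction attributions can then be summed (or averaged) into an aggregate importance. All weights, baselines, and inputs below are made-up assumptions.

```python
import numpy as np
from itertools import permutations
from math import factorial

# Toy two-feature linear model; all numbers are illustrative assumptions.
w = np.array([3.0, -2.0])          # model weights
baseline = np.array([1.0, 1.0])    # background/reference feature values

def value(coalition, x):
    """Model output when features in `coalition` take their actual values
    from x and the remaining features are held at the baseline."""
    z = baseline.copy()
    idx = list(coalition)
    z[idx] = x[idx]
    return float(w @ z)

def shapley(x):
    """Exact Shapley values: average marginal contribution of each feature
    over all orderings (feasible here because there are only 2 features)."""
    n = len(x)
    phi = np.zeros(n)
    for order in permutations(range(n)):
        seen = []
        for i in order:
            before = value(seen, x)
            seen.append(i)
            phi[i] += value(seen, x) - before
    return phi / factorial(n)

# Two predictions (e.g. two days in a time series).
phi_day1 = shapley(np.array([2.0, 0.5]))
phi_day2 = shapley(np.array([4.0, 3.0]))

# By linearity, summing (or averaging) per-prediction Shapley values is a
# coherent way to talk about a feature's importance across predictions.
print(phi_day1, phi_day2, phi_day1 + phi_day2)
```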
@somerset006 4 months ago
What about self-driving rockets?
@paaabl0. 5 months ago
Shapley values are great, but not gonna help you much with complex non-linear patterns, especially in terms of global feature importance
@williamstorey5024 6 months ago
What is text regression?
@yandajiang1744 6 months ago
Awesome explanation
@user-vh9de5dy9q 6 months ago
Why do the given weights for the distributions not really match the distributions shown on the graph? I mean, I would choose π1 = 45, π2 = 35, π3 = 20.
@thechannelwithoutanyconten6364 7 months ago
Two things: 1. What the H matrix is has not been described. 2. A non-1x1 matrix cannot be smaller or greater than another. This is sloppy. Besides that, it is great work.
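For context on these points (hedged, since the video's notation is not reproduced here): in the standard Kalman filter formulation, H is the observation (measurement) matrix mapping the state into measurement space, and comparisons between covariance matrices are usually meant in the positive semi-definite ordering rather than element-wise.

```latex
% Standard Kalman filter measurement model, with H the observation matrix
% mapping the state x_k into the space of measurements z_k:
z_k = H x_k + v_k, \qquad v_k \sim \mathcal{N}(0, R)
% "A is at least B" for covariance matrices is read in the positive
% semi-definite sense:
A \succeq B \iff A - B \text{ is positive semi-definite}
```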
@obensustam3574 7 months ago
I wish there was a Part 3 :(
@DenguBoom 7 months ago
Hi, about the sample X1 to Xn: do X1 and Xn have to be different? You have a previous sample of 100 heights from 100 different people. Or can it be like what we did in the bootstrap, where X1* to Xn* are drawn randomly from X1 to Xn, so the same person's height can basically be drawn more than once?
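A short hedged sketch of the resampling point: in the ordinary nonparametric bootstrap, X1*, ..., Xn* are drawn from the original sample with replacement, so the same person's height can indeed appear more than once. The heights below are made-up values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Pretend these are the original 100 observed heights (illustrative values).
heights = rng.normal(loc=170.0, scale=8.0, size=100)

# One bootstrap sample: draw n values from the original sample WITH
# replacement, so duplicates are allowed (and expected).
boot = rng.choice(heights, size=heights.size, replace=True)

# Roughly 63% of the original observations appear at least once in a
# bootstrap sample; the remaining draws are repeats of other observations.
print("unique originals in bootstrap sample:", np.unique(boot).size)
```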
@feriyonika7078 7 months ago
Thanks, now I understand the KF better.
@usurper1091 7 months ago
7:10
@lingfengzhang2943 8 months ago
Thanks! It's very clear
@user-uk2rv4kt8d 8 months ago
Very good video. Perfect explanation!
@sadeghmirzaei9330 8 months ago
Thank you so much for your explanation.🎉
@laitinenpp 8 months ago
Great job, thank you!
@SCramah13 9 months ago
Clean explanation. Thank you very much...cheers~
@felipela2227 9 months ago
Your explanation was great, thx
@vambire02 10 months ago
Disappointed ☹️ no part 3
@Commonsenseisrare 10 months ago
Amazing lecture on GNNs.
@cmobarry 11 months ago
I like your term "Word Algebra". It might be an unintended side effect, but I have been pondering it for years!
@rakr6635 A year ago
no part 3, sad 😥
@vgreddysaragada A year ago
Great work.
@boussouarsari4482 A year ago
I believe there might be an issue with the perplexity formula. How can we refer to 'w' as the test set containing 'm' sentences, denoting 'm' as the number of sentences, and then immediately after state that 'm' represents the number of all words in the entire test set? This description lacks clarity and coherence. Could you please clarify this part to make it more understandable?
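For reference (hedged, since the slide itself is not reproduced here), the usual convention is that perplexity is computed over the concatenated test set and normalized by the total number of word tokens; writing N for that total token count avoids overloading m.

```latex
% Perplexity of a test set W = w_1 w_2 \dots w_N, where N is the total
% number of word tokens across all test sentences:
PP(W) = P(w_1 w_2 \dots w_N)^{-1/N}
      = \exp\!\Bigl( -\tfrac{1}{N} \sum_{i=1}^{N} \log P(w_i \mid w_1, \dots, w_{i-1}) \Bigr)
```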
@GrafBazooka A year ago
I can't concentrate, she is too hot 🤔😰
@sunnelyeh A year ago
This video means that the F/A-18 has the capability to lock onto a UFO!
@thefantasticman A year ago
Hard to focus on the PPT. Can anyone explain to me why?
@nunaworship A year ago
Can you please share the links to the books you recommended?
@AoibhinnMcCarthy A year ago
Hard to follow; not concise.
@jcorona4755 A year ago
They pay so it looks like they have more followers. In fact, you pay $10 pesos for each video.
@g-code9821 A year ago
Isn't the positional encoding done with the sinusoidal function?
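For reference, the sinusoidal positional encoding from the original Transformer paper ("Attention Is All You Need") is shown below; learned positional embeddings are a common alternative, so a given lecture may use either.

```latex
% Sinusoidal positional encoding (Vaswani et al., 2017), for position pos
% and dimension index i, with model width d_model:
PE_{(pos,\,2i)}   = \sin\!\bigl(pos / 10000^{2i/d_{model}}\bigr)
PE_{(pos,\,2i+1)} = \cos\!\bigl(pos / 10000^{2i/d_{model}}\bigr)
```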
@homataha5626 A year ago
Hello, thank you for sharing. Do you have a code repository? I only learn after I've implemented it myself.
@MachineLearningTV A year ago
Unfortunately, no.
@because2022 A year ago
Great content.
@robinranabhat3125 A year ago
Anyone? At 31:25, shouldn't the final equation at the bottom right be about minimizing the loss? I think that's a typo.
@Karl_with_a_K A year ago
I have run into token exhaustion while working with GPT-4, specifically when it is producing programming-language output. I'm assuming resolving this will be a component of GPT-5...
@yifan1342 A year ago
The sound quality is terrible.
@nehalkalita 11 months ago
Turning on subtitles can be helpful to some extent.
@majidafra A year ago
I deeply envy those who have been in your NN & DL class.
@josephzhu5129 A year ago
Great lecture, he knows how to explain complicated ideas, thanks a lot!
@chris-dx6oh A year ago
Great video
@ssvl2204 A year ago
Very nice and concise presentation, thanks!
@zhaobryan4441 A year ago
super super clear!