
Kernels! 

Machine Learning Street Talk
140K subscribers
20K views

Published: 28 Aug 2024

Comments: 41
@thegimel · 3 years ago
I love how Yannic takes a step back and explains things using his intuition. Very helpful!
@frankd1156 · 3 years ago
Everything Yannic says is gold... I understand instantly.
@MachineLearningStreetTalk · 3 years ago
I know right 😂
@rockapedra1130 · 3 years ago
I know! He always asks what I want to know; it's kinda spooky how good of a communicator he is!
@freemind.d2714 · 3 years ago
Without Yannic, I can't understand a word.
@clarkd1955 · 1 year ago
The contribution of all 3 of you was significantly more than the sum of the parts. Very enjoyable, thanks.
@Luck_x_Luck · 3 years ago
Best explanation of kernels I've encountered so far, thanks!
2 years ago
I love your channel! Though I have to admit that I felt a little lost with all the terminology being thrown around when I first watched this video in particular. I decided to delve deeper into kernels, and after intensive research I have created a 6-hour playlist on kernel methods to summarize my current understanding. If anyone wants a crash course on kernels in particular, I'd be delighted to welcome you to my comment section. After these countless hours of self-study, I can now follow the conversation fully, which is such a nice feeling of accomplishment. Thank you for inspiring me to research this topic in depth!
@machinelearningdojowithtim2898 · 3 years ago
I loved this conversation with Alex! We already recorded 2 more casual conversations; we will upload them in the coming days.
@rockapedra1130 · 3 years ago
I loved this discussion! The combination of Alex knowing everything in full mathematical generality and Yannic trying to bring it down to the "real world" really helped me! I'm new to this subject, and it really helps to walk through a toy problem, such as the temperature in a room, using a simple basis and describing the vectors formed, etc., so that it feels less nebulous to begin with. Granted, I'm an engineer, so what's best for me is: first show me a simplified version and how it works concretely, THEN abstractify it to death to make it maximally useful. Thanks to all three of you!!! It amazes me that such great content is just "out there" to be found!
@AICoffeeBreak · 3 years ago
Very helpful video, happy it exists! Perhaps the format could have allowed for some slides here and there, since Alex Stenlake had prepared an explanation in advance, just to spare him gesturing out the visualizations, and also verbalizing mathematical examples that are easy to understand when written but harder to follow when just spoken aloud. 😊
@minghanzhu6082 · 3 years ago
I really wanted to appreciate the effort, but probably only people who already have a very good understanding of kernels can handle all these verbal discussions with abstract, repeatedly used words. I see that Yannic tried to make it clearer by asking some clarifying questions, though, which helped a little bit.
@abby5493 · 3 years ago
Wow! Such a good and informative video 😃
@MachineLearningStreetTalk · 3 years ago
Thank you, Abby!
@dome8116 · 3 years ago
I love this podcast. Really such a cool idea. I just want to give some tips that might make it even better, at least visually. It's kind of annoying to see the bad quality of the people talking. I think it would be so much cooler if everyone recorded their own camera and audio and afterwards sent it to Tim, who cuts it together the way you have it now, where every person is visible at any time, just in much better quality. That way there are also many more options to make the design of the podcast cooler. For example, you could put a nice layout over it or something. Also, I feel like it would sometimes come in handy to bring some pictures on screen, a bit like Tim already did when he opened up the papers. It would look much more professional to the viewer, and I'm sure others would like it too. Anyways, I love the show.
@swarajshinde3950 · 4 years ago
Love your videos.
@daryoushmehrtash7601 · 3 years ago
This would have been such a nice presentation if Tim hadn't disrupted the flow of the conversation. Yannic tried a few times to recover the underlying goal of Alex's talk, but failed. I wish this could be redone with Alex presenting the underlying concept and its application to Yannic's room-temperature model as a specific example.
@machinelearningdojowithtim2898 · 3 years ago
Sorry! Feedback taken on board.
@quebono100 · 3 years ago
Your channel has way too few subscribers. Such good content! I'm not even a machine learning engineer, just a programmer who's learning all this stuff at the moment.
@SergeTheGod · 3 years ago
Great talk, guys! Reminds me why I got into ML in the first place, and makes me re-evaluate Bishop's book 😅
@bradleypliam110 · 1 year ago
Serge, what is the title of this book? I'd like to find myself a copy.
@SergeTheGod · 1 year ago
@bradleypliam110 Pattern Recognition and Machine Learning. Great book!
@bradleypliam110 · 1 year ago
@SergeTheGod Thank you for the leg up!!
@raszagal1000 · 3 years ago
Around 39 minutes, one bit that is missing is that an inner product of two functions is the integral of the functions multiplied together over the domain of their arguments.
@oblomist · 3 years ago
Thank you, that makes more sense now. But the result, when evaluated, should still be a scalar, right?
@raszagal1000 · 3 years ago
@oblomist In this case, yes; I'm not sure if that's true in general.
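The inner product being discussed here is easy to check numerically. Below is a small sketch (my own illustration, not from the video) that approximates ⟨f, g⟩ = ∫ f(x)·g(x) dx with a trapezoidal rule over [0, 2π], confirming that the evaluated result is indeed a single scalar:

```python
import numpy as np

# <f, g> = integral of f(x) * g(x) over the domain, here [0, 2*pi],
# approximated with the trapezoidal rule on a fine grid.
x = np.linspace(0, 2 * np.pi, 10001)

def inner(f, g):
    y = f(x) * g(x)
    # trapezoidal rule: average of adjacent samples times interval width
    return float(np.sum((y[:-1] + y[1:]) / 2 * np.diff(x)))

# sin and cos are orthogonal over a full period, while <sin, sin> = pi;
# both outputs are plain scalars, as expected.
print(inner(np.sin, np.cos))   # close to 0
print(inner(np.sin, np.sin))   # close to pi
```

The grid size and domain are arbitrary choices for illustration; any sufficiently fine discretization of the functions' domain gives the same picture.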
@shivamraisharma1474 · 3 years ago
Just a naive viewpoint/question here: Yannic, in his video about the Linformer, mentioned the JL theorem, which multiplies a high-dimensional data distribution by a fixed Gaussian matrix to project it to lower dimensions while keeping the distances between data points roughly constant. If kernels are also a distance/similarity measure, one that effectively projects data from a lower dimension to some higher dimension (rewatching the video, I'm at the 17-minute mark), then pairwise distances between data points seem to be a fairly faithful representation of any distribution, and any projection from a higher to a lower dimension, or vice versa, should focus on preserving that distance measure.
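The distance-preservation property mentioned in this comment can be demonstrated in a few lines. Here is a hedged sketch of a JL-style random Gaussian projection; the dimensions and tolerances are arbitrary choices for illustration, not values from the theorem's formal statement:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, k = 50, 1000, 300          # points, original dim, projected dim
X = rng.normal(size=(n, d))

# JL-style projection: multiply by a random Gaussian matrix scaled by 1/sqrt(k)
R = rng.normal(size=(d, k)) / np.sqrt(k)
Y = X @ R

def pairwise_dists(A):
    # Euclidean distances between all pairs of rows
    sq = np.sum(A**2, axis=1)
    return np.sqrt(np.maximum(sq[:, None] + sq[None, :] - 2 * A @ A.T, 0))

D_orig = pairwise_dists(X)
D_proj = pairwise_dists(Y)

# Off-diagonal distance ratios should all be close to 1:
# the projection roughly preserves every pairwise distance at once.
mask = ~np.eye(n, dtype=bool)
ratios = D_proj[mask] / D_orig[mask]
print(ratios.min(), ratios.max())
```

Shrinking `k` makes the worst-case ratio drift further from 1, which matches the JL lemma's trade-off between target dimension and distortion.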
@j.dietrich · 3 years ago
Tim's breakfast bar/kitchen island arrangement is impressive, but the tin of Coffee Mate hurts my soul.
@machinelearningdojowithtim2898 · 3 years ago
Lol!!! But what are you saying here? 1) You don't like the design on the tin, 2) you don't like the manifold of the tin, or 3) you don't like Coffee Mate 😂
@wangyifan1468 · 9 months ago
11:50 is where the kernel talk starts.
@Hawkz1600 · 3 years ago
Amazing stuff! It would also be cool if you could talk about dimensionality-reduction methods to solve the memory inefficiencies of kernel methods with large datasets.
@JI77469 · 3 years ago
@Hawkz1600, it seems the biggest breakthrough for fixing the memory issues is the use of "random features" to approximate general kernels with random linear kernels. See the paper "Random Features for Large-Scale Kernel Machines."
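For anyone curious, the random-features idea from that paper (Rahimi & Recht) can be sketched in a few lines. This is my own illustrative implementation for the RBF kernel with arbitrary sizes, not code from the paper: draw random frequencies from the kernel's spectral density, and the dot product of the resulting cosine features approximates the kernel.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf_kernel(X, Y, gamma=0.5):
    # Exact RBF kernel: k(x, y) = exp(-gamma * ||x - y||^2)
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def random_fourier_features(X, n_features=2000, gamma=0.5):
    # Rahimi & Recht: z(x) = sqrt(2/D) * cos(W x + b), with
    # W ~ N(0, 2*gamma) and b ~ Uniform(0, 2*pi), so that
    # z(x) . z(y) ≈ exp(-gamma * ||x - y||^2)
    d = X.shape[1]
    W = rng.normal(scale=np.sqrt(2 * gamma), size=(d, n_features))
    b = rng.uniform(0, 2 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

X = rng.normal(size=(20, 5))
K_exact = rbf_kernel(X, X)
Z = random_fourier_features(X)
K_approx = Z @ Z.T            # a random *linear* kernel in feature space
err = np.abs(K_exact - K_approx).max()
print(err)
```

The memory win is that you only store the n × D feature matrix `Z` and work with linear methods on it, instead of materializing the full n × n kernel matrix.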
@shivamraisharma1474 · 3 years ago
Top-quality content 👌👌
@JscottMays · 11 months ago
Solid
@DavenH · 3 years ago
"infinite dimensional, or high dimensional, or don't-wanna-compute-able" haha!
@DavenH · 3 years ago
"and that's because least-squares is a horrible, blurry loss function" =)
@JI77469 · 3 years ago
I'd love to know anyone's thoughts on the usefulness of 1) random Fourier features (a trick to approximate kernels by certain linear kernels and thus speed computations up) and 2) reproducing kernel Banach spaces (doing kernel methods in a Banach space that promotes sparsity more than the Hilbert-space setting would, somewhat like Lasso regression vs. Ridge regression).
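The Lasso-vs-Ridge analogy in point 2 is easy to see numerically. Here is a small sketch (my own toy example, not tied to the RKBS literature) comparing closed-form ridge with a simple ISTA solver for the lasso, on data where only 3 of 20 features matter; the L1 penalty zeroes out coefficients exactly, while the L2 penalty only shrinks them:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy regression where only the first 3 of 20 features matter
n, d = 100, 20
X = rng.normal(size=(n, d))
true_w = np.zeros(d)
true_w[:3] = [3.0, -2.0, 1.5]
y = X @ true_w + 0.1 * rng.normal(size=n)

lam = 1.0

# Ridge: closed form (X'X + lam*I)^-1 X'y -- shrinks but almost never zeroes
w_ridge = np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

# Lasso via ISTA: gradient step on the squared loss, then soft-thresholding,
# which sets small coefficients exactly to zero
w = np.zeros(d)
step = 1.0 / np.linalg.norm(X, 2) ** 2   # 1 / Lipschitz constant of the gradient
for _ in range(5000):
    w = w - step * X.T @ (X @ w - y)
    w = np.sign(w) * np.maximum(np.abs(w) - step * lam, 0.0)

print(np.sum(np.abs(w_ridge) < 1e-6), np.sum(np.abs(w) < 1e-6))
```

The exact zero counts depend on the noise draw and `lam`, but the qualitative contrast (many exact zeros for lasso, none for ridge) is the point of the analogy.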
@JRAbduallah1986 · 2 years ago
Why not have a board and write on it? That would make it more interesting. More importantly, fun examples would give the audience a much better understanding.
@AConversationOn · 3 years ago
Talking about highly advanced mathematics without notational and visual support is rather silly. There is no one who can understand the English who cannot understand visuals, and many who could only understand the visuals.
@Macatho · 3 years ago
Lose the shades.