
03 - Inference with neural nets 

Alfredo Canziani
11K views

Published: 28 Aug 2024

Comments: 21
@andreaziani 1 year ago
I completed my Master’s last year; however, I still enjoy watching all of your videos. The content is nothing new to me, but the way you explain it is exceptional. Oftentimes I found myself saying “oh wow, this is a really cool way of explaining this concept”, and many more times I wished I had you as a professor. It would have saved me many hours of digging into concepts that professors were not explaining properly. Also, the fact that you try to introduce a coherent notation throughout all your lectures is amazing and is something that the whole research area should consider following; research papers would immediately become more understandable and clear. Thanks a lot for sharing your knowledge, Alfredo!
@alfcnz 1 year ago
🥰🥰🥰
@ShihgianLee 1 year ago
I really like how the course is structured in the new semester, where we introduce the concept of an energy-based function upfront. I always regarded loss and cost as the same thing, even though the cost is used in inference to find the optimal prediction and the loss is used in the training of a model. The reason for my confusion is that the cost function and loss function are sometimes the same, even though they don’t have to be. Thank you for sharing the updated course and (new) resources with us!
@alfcnz 1 year ago
🥳🥳🥳
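A minimal sketch of the loss/cost distinction described in the comment above, assuming a PyTorch-style setup (the toy model, data, and learning rates are illustrative, not from the course):

    import torch

    # Toy "energy"/cost: how badly does candidate prediction y match input x?
    w = torch.tensor(2.0, requires_grad=True)            # the only model parameter
    def cost(x, y):
        return (y - w * x) ** 2                          # C(x, y)

    # Training: minimise a loss over the parameter w, given labelled pairs (x, y).
    # Here the loss happens to equal the cost, but it does not have to.
    opt_w = torch.optim.SGD([w], lr=0.01)
    for x, y in [(1.0, 3.0), (2.0, 6.0)]:
        opt_w.zero_grad()
        loss = cost(torch.tensor(x), torch.tensor(y))
        loss.backward()
        opt_w.step()

    # Inference: freeze w and minimise the cost over the prediction y instead.
    w.requires_grad_(False)
    y = torch.zeros((), requires_grad=True)
    opt_y = torch.optim.SGD([y], lr=0.1)
    for _ in range(100):
        opt_y.zero_grad()
        cost(torch.tensor(1.5), y).backward()
        opt_y.step()
    # y now approximates argmin_y C(1.5, y), with w left untouched.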
@user-co6pu8zv3v 1 year ago
Thanks, Alfredo. I watched all your videos, as well as many other videos on YouTube, and of course I already know the content of today’s lecture, but it is still very interesting and lets me look at some problems in a new way. In addition, your lectures are the most intense in terms of mathematics and presentation of the material. It was also useful for me to recall the eigenvalues and eigenvectors of matrices. The StatQuest channel also helps me refresh my knowledge of statistics. Happy New Year!
@alfcnz 1 year ago
🤗🤗🤗
@user-eq3ry9br1z 1 year ago
Happy New Year! I wish you joy, success, happiness, and health in the new year. May all your dreams come true and all difficulties stay behind you. Happy upcoming New Year! Currently watching AI2S Xmas Seminar - Dr. Alfredo Canziani (NYU) - Energy-Based Self-Supervised Learning 😀
@alfcnz 1 year ago
🥳🥳🥳 My dreams came true! I’m a professor, I’m loved, I’m happy.
@solaris413 1 year ago
Where are the first two lectures?
@enisten 1 year ago
If this is supposed to follow the first two lectures of Spring 2021, then don't we need a better naming convention going forward to know where the new videos are to be inserted, given that there's already a lecture 3 in Spring 2021?
@alfcnz 1 year ago
@enisten I’ll add all the necessary info on the course website. There are 3 hours left of 2022 now, so I’ll go downtown to celebrate in the main square. By the time I advertise this episode on Twitter, all the necessary info will be in place. 😊😊😊
@samabd4998 1 year ago
@enisten Please, could you send me the best book to practise DL?
@ShihgianLee 1 year ago
I’m counting down to 2023 by watching your video 🙂
@daniellu8104 1 year ago
Reading other machine learning texts, I get the impression that, during inference, a trained model uses a forward pass to make predictions on new data without any further adjustments to its parameters. But you suggest at 21:29 that backprop and SGD are used during inference. I understand that by "inference" you are referring to "finding x_check" rather than "y_tilde = Pred(x)", but this still seems too abstract compared to the usual training-validation approach I am used to. Do you discuss an example of backprop and SGD used during inference in a later video or code example? (I will continue watching the rest of these great videos to find out, of course 😁, but any direct links would be appreciated.)
@alfcnz 1 year ago
Yes, in future videos you’ll see how inference implies using GD (not stochastic!). Adjusting weights happens only during training (learning). Inference is about using the model to make predictions. Predictions can be made by minimising a cost while keeping the weights unchanged!
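A minimal sketch of what this answer describes, assuming a small latent-variable model in PyTorch (the decoder, sizes, and iteration counts are illustrative, not the lecture's):

    import torch

    # Hypothetical decoder mapping a latent z to a reconstruction of x;
    # the architecture and sizes are made up for illustration.
    net = torch.nn.Sequential(
        torch.nn.Linear(2, 8), torch.nn.ReLU(), torch.nn.Linear(8, 2)
    )
    for p in net.parameters():
        p.requires_grad_(False)             # weights stay fixed: inference, not learning

    x = torch.tensor([0.5, -1.0])           # the observation to explain
    z = torch.zeros(2, requires_grad=True)  # latent found by gradient descent

    opt = torch.optim.SGD([z], lr=0.05)     # plain GD: one fixed x, no sampling
    for _ in range(200):
        opt.zero_grad()
        energy = ((net(z) - x) ** 2).sum()  # cost of explaining x with latent z
        energy.backward()                   # backprop w.r.t. z only
        opt.step()
    # z now approximates argmin_z C(x, z); no weight was ever updated.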
@CyberwizardProductions 1 year ago
Enjoying this - totally lost on the terminology, but I'll catch up. I got that the JavaScript animated image you show hasn't got 90-degree angles in it, but why couldn't it have? Would that cause problems of some sort?
@alfcnz 1 year ago
My question was: are eigenvectors *necessarily* at 90º? Answer: no.
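A quick numerical check of this answer (a NumPy sketch; the matrix is an arbitrary non-symmetric example, not one from the lecture):

    import numpy as np

    A = np.array([[2.0, 1.0],
                  [0.0, 1.0]])              # an arbitrary non-symmetric matrix
    _, vecs = np.linalg.eig(A)
    v1, v2 = vecs[:, 0], vecs[:, 1]
    print(v1 @ v2)                          # ≈ ±0.71, not 0: these eigenvectors
                                            # are not at 90º
    # For a symmetric matrix, the eigenvectors would be orthogonal.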
@uniqued4ve 1 year ago
I don't understand why this video's title contains "Inference"; does it have something to do with the SVD and matrix multiplication?
@alfcnz 1 year ago
Sure, that’s how you perform inference with a neural net: through matrix multiplication and non-linearities.
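A minimal sketch of that forward pass (the sizes and weights are arbitrary placeholders):

    import torch

    # A toy two-layer forward pass: inference is just alternating
    # matrix multiplications and non-linearities.
    W1, b1 = torch.randn(8, 3), torch.randn(8)
    W2, b2 = torch.randn(2, 8), torch.randn(2)

    def predict(x):
        h = torch.relu(W1 @ x + b1)         # matrix multiplication + non-linearity
        return W2 @ h + b2                  # one more matrix multiplication

    y_tilde = predict(torch.randn(3))       # y_tilde = Pred(x): no parameters change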
@muthukamalan.m6316 1 year ago
Hi Alf, any future plans to do YouTube live streams?
@alfcnz 1 year ago
No time. All these videos come from my class. They are edited for YouTube, but they were not recorded for that purpose.