
Embeddings - EXPLAINED! 

CodeEmporium
128K subscribers
8K views

Published: 26 Sep 2024

Comments: 22
@contactdi8426 7 months ago
Thanks a lot Ajay for such amazing informational content. Just STOP saying the awkward "QUIZZZ time"; the whole focus/mood goes away.
@justinwhite2725 7 months ago
It's annoying, but it also gives a break between sections that would otherwise blur together.
@walterbaltzley4546 7 months ago
I agree - I find the particular tone and pitch he uses when saying that to be painful (it literally hurts my ears). The transition from learning to review is a good idea; the execution can be improved.
@mehdicharife2335 20 days ago
I don't think that the issue is computers only understanding numbers. Even numbers are not directly understood by computers and need to be represented via combinations of "1"s and "0"s. Perhaps the issue is more related to the fact that neural networks and similar models can't directly deal with non-numerical data, and hence the need for a numerical representation before any training can take place. You make a great point about the higher computational cost of using natural or default numerical representations for items like words and images, which explains the need for an 'embedded' representation.
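The point in this comment about the computational cost of "default" numerical representations can be sketched in code: a naive one-hot vector grows with vocabulary size, while a learned embedding stays short and dense. The toy vocabulary and embedding values below are made-up for illustration, not outputs of any real model from the video.

```python
# Toy vocabulary; a real one would have tens of thousands of words.
vocab = ["king", "queen", "man", "woman"]

def one_hot(word, vocab):
    """Naive numerical representation: one slot per vocabulary word."""
    vec = [0.0] * len(vocab)
    vec[vocab.index(word)] = 1.0
    return vec

# A learned embedding table instead maps each word to a short dense
# vector (values here are invented placeholders).
embedding_dim = 2
embeddings = {
    "king":  [0.9, 0.8],
    "queen": [0.9, 0.2],
    "man":   [0.1, 0.8],
    "woman": [0.1, 0.2],
}

print(len(one_hot("king", vocab)))   # grows with vocabulary size
print(len(embeddings["king"]))       # fixed small embedding_dim
```

The one-hot length scales linearly with the vocabulary, which is what makes dense embeddings cheaper to store and compute with.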
@youngsci 7 months ago
I very much enjoyed every video you made till now. Your explanation has always been extraordinary, but please stop saying "Quiz Time" 😂😂
@markchen8893 2 months ago
Great video! Thank you so much! It makes things easier for someone who just started learning ML.
@punk3900 5 months ago
Genius presentation! Thanks! Keep up your excellent work!
@justinwhite2725 7 months ago
5:06 Technically both B and C are correct here. I guess I would say C is the primary answer and B is a nice (but not necessary) side benefit.
@yolemmein 4 months ago
Very useful and great explanation! Thank you so much!
@EobardUchihaThawne 7 months ago
My method to learn new words in the vocab is actually to train the pretrained model using transformers.
@justchary 5 months ago
This is very good. Thank you!
@sharjeel_mazhar 4 months ago
Can you please make a video that showcases how we can generate custom word embeddings on a custom dataset (say IMDb) from scratch, without using anything pre-built, and then later load them to train a classification model?
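One classic "from scratch" route to what this comment asks for is count-based: build a word co-occurrence matrix from the corpus, then factorize it with SVD to get low-dimensional vectors (an alternative to word2vec-style training). The three-sentence corpus, window size, and dimensionality below are illustrative assumptions, not material from the video.

```python
import numpy as np

# Tiny stand-in corpus (a real run would use the IMDb reviews).
corpus = [
    "the movie was great",
    "the movie was terrible",
    "great acting and great story",
]
tokens = [s.split() for s in corpus]
vocab = sorted({w for sent in tokens for w in sent})
idx = {w: i for i, w in enumerate(vocab)}

# Count co-occurrences within a +/-1 word window.
counts = np.zeros((len(vocab), len(vocab)))
for sent in tokens:
    for i, w in enumerate(sent):
        for j in range(max(0, i - 1), min(len(sent), i + 2)):
            if j != i:
                counts[idx[w], idx[sent[j]]] += 1

# Truncated SVD of the count matrix gives dense embeddings.
U, S, _ = np.linalg.svd(counts)
dim = 2
embeddings = U[:, :dim] * S[:dim]  # one row per vocabulary word
print(embeddings.shape)
```

The resulting rows could be saved to disk and loaded later as the input features of a classifier, which is roughly the pipeline the comment describes.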
@slitihela1860 7 months ago
Can you prepare a video on Double Q-Learning Networks and Dueling Double Q-Learning Networks, please?
@katzenschildkroete 7 months ago
C, B, A
@CodeEmporium 7 months ago
Ding ding ding! I agree with your answers!
@MannyBernabe 7 months ago
Fun! Thank you!
@x_avimimius3294 7 months ago
Hi, I have an idea about an AI-based podcast, where I want to make AI the mainframe of the podcast. Can you guide me on this?
@sagardesai1253 7 months ago
Thanks for the video; you explain things at different difficulty levels, and that works. The quiz and stuff is not working for me, though; it breaks the flow of the content.
@sudarshanseshadri2144 12 days ago
C, B, A
@AnA-xx1vx 7 months ago
4:30 What was this??😂
@CodeEmporium 7 months ago
Why, it’s everyone’s favorite time: Quiiiiiz Timmmmmmee, of course!
@AnA-xx1vx 7 months ago
@@CodeEmporium yen Anna