
Word Embeddings 

macheads101
32K subscribers
157K views

Word embeddings are one of the coolest things you can do with Machine Learning right now.
Try the web app: embeddings.mac...
Word2vec paper: arxiv.org/abs/...
GloVe paper: nlp.stanford.e...
GloVe webpage: nlp.stanford.e...
Other resources:
www.aclweb.org/...
en.wikipedia.o...

Science

Published: 22 Aug 2024

Comments: 215
@oscarmvl · 1 year ago
This is so relevant in 2023, timeless explanation!
@sau002 · 6 years ago
You spent a considerable number of minutes explaining the nature of the problem before presenting the solution - I like that approach.
@amreshgiri4933 · 6 years ago
You're a genius. I was struggling to understand the word embedding concept through Stanford University videos. Your explanation and pace are much better. Thanks. Keep making such videos.
@panchoingham · 7 years ago
Dude, seriously, thank you. I am not a CS major but I got absurdly interested in AI during my last years of college. It's inspiring to see your self-learnt enthusiasm and it gives me strength to follow my interest. Keep up the good work and thank you! Cheers from Buenos Aires, Argentina
@wibulord926 · 2 years ago
cc
@myfolder4561 · 8 months ago
Glad to have come across this while looking for materials to learn about word embeddings, to understand how text prompts work in Stable Diffusion text-to-image models in 2023. You're a great teacher. A lot of videos on this topic across YouTube are full of jargon without clear explanations. In 2023 this video is still highly relevant to the current state of the technology.
@longship44 · 4 years ago
You are very good at explaining pretty complex concepts. I appreciate the time you took to do this, it was very informative.
@salutoitoi · 3 years ago
I recently started learning NLP, and the part about word embeddings was just not clear at all. It makes sense now. Thanks a lot! You've won a subscriber.
@charlieangkor8649 · 3 years ago
Good lecture - it includes important information like what he finds cool and what the best examples are, which allows the listener to organize the information hierarchically. Not like some university lectures, where a monolithic dump of text flows out and we don't know what is important to remember and what isn't.
@Alkis05 · 3 years ago
More generally, word2vec is nothing more than graph2vec. Sentences can be seen as random walks on the English-language graph, in which each word is a node and every word is connected to other words. The strength of these connections depends on how frequently the words appear in the same context. Seeing this as a graph allows you to run network analysis and see what other kinds of information you can extract. By doing it right, you might even be able to estimate the connections for words that didn't appear in the training set and update the model to make it better. Or use the word embeddings to embed sentences and see how that goes.
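
A minimal sketch of this graph view, assuming a toy corpus; the window size and weighting scheme are illustrative choices, not anything prescribed by the video:

```python
from collections import Counter

import networkx as nx

corpus = [
    "the cat sat on the mat".split(),
    "the dog sat on the log".split(),
]

window = 2  # arbitrary co-occurrence window for the sketch
edge_weights = Counter()
for sentence in corpus:
    for i, word in enumerate(sentence):
        for neighbor in sentence[i + 1 : i + 1 + window]:
            if word != neighbor:
                edge_weights[tuple(sorted((word, neighbor)))] += 1

G = nx.Graph()
for (u, v), w in edge_weights.items():
    G.add_edge(u, v, weight=w)

# Sentences now look like weighted walks on G, which is the intuition
# behind graph-embedding methods such as DeepWalk and node2vec.
print(G.edges(data=True))
```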
@fhypnos912 · 2 years ago
True genius knows how to explain a complex concept in a really simple and intuitive way. Salute.
@sreeramv112 · 5 years ago
For someone who knows nothing and wants to know everything in ML, this is simply an awesome explanation.
@AshwinVel · 6 years ago
I honestly think this is a really good explanation of word embeddings. It breaks down the nitty-gritty involved in word2vec and co-occurrence. I've read a couple of articles and watched a few videos, but yours is by far the easiest to comprehend. Thank you so much. Cheers from Malaysia!
@meijiishin5650 · 1 year ago
Fun fact: This guy went on to work at OpenAI and is one of the creators of DALL-E 2.
@GodofStories · 1 year ago
Haha, nice - as soon as I saw him speak for 5 seconds and saw the timestamp of 5 years ago, I typed this: "If this guy isn't already a founder of a leading company in this AI wave, I'll be disappointed. But hey, most of the smartest people don't always see success, and fame and money aren't everything." Glad to see I was right, haha. I figured there was a high probability, considering this was 5 years ago, that if nothing else in the universe interfered with his life trajectory - just based on the way he talks and looks, which basically shouts young, motivated, and hungry - he would be a big-time guy by now.
@LokeshSharma-me5pg · 1 year ago
no wonder a man like him can do the job...
@artemkoren9582 · 6 years ago
I've gone through several embedding explanations until I arrived at this one. Well done, finally all pieces make sense. Thanks!
@12sandy345 · 6 years ago
An exceptional lecture! What I loved about it is its focus on implementation and results, which really helps build good intuition, garner interest, and motivate digging deeper into the math details later - details that boggle most of us before we know how powerful and useful the results are. Thank you again.
@fiddlepants5947 · 5 years ago
Humble, concise, brilliant... Subscribed!
@blenderpanzi · 7 months ago
8:08 Thank you! This explained the missing piece to me. Multiple other videos on this topic were missing this nice and easy-to-understand diagram.
@joshuafishman9002 · 7 years ago
I'm glad you made this video. Now I don't have to download the vector for every word on twitter.
@sufyanqadeer2705 · 6 years ago
Hello, friend. I need the word vector file that was available on this site, but now the link is not working. Please help me and send me the word vector file. Link: embeddings.macheads101.com/
@sufyanqadeer2705 · 6 years ago
My Email : sufyan.ali7272@gmail.com
@jabusch24 · 6 years ago
This is really well explained. Best word2vec explanation I've seen on YouTube so far.
@bean_TM · 1 year ago
I haven't seen a better explanation on this. Thank you. This was really good.
@sniperas96 · 2 years ago
Still, in 2022, a much clearer explanation than my professor's in my master's program.
@lakshmisairamthubati9080 · 6 years ago
Probably the most clear explanation of word2vec. Thanks for the video.
@blancheporter1289 · 4 years ago
One word: awesome. Thanks a lot for the video. Humble, concise, brilliant... Subscribed! Why have you stopped making videos, man? Miss your vids.
@kenchu764 · 3 years ago
I just started learning ML concepts, and this video helped tremendously with word embeddings. You got another subscriber. I do have a question though. In your example, how did you decide on 64 as the other dimension of your factored matrices? Would a larger number there give you a better word embedding?
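
On the dimension question, a hedged sketch of the usual answer: the embedding dimension (64 here) is a free hyperparameter, not something derived from the data. Larger values can capture more structure but need more data and compute, with diminishing returns past a few hundred. For illustration only, here is how it would be set with gensim's Word2Vec (an assumption; not necessarily the tooling used in the video):

```python
from gensim.models import Word2Vec

sentences = [
    "the cat sat on the mat".split(),
    "the dog sat on the log".split(),
]

# vector_size is the embedding dimension (64 in the video's example).
model = Word2Vec(sentences, vector_size=64, window=2, min_count=1, epochs=50)
print(model.wv["cat"].shape)  # (64,)
```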
@govinda1993 · 5 years ago
I really appreciated the emphasis you gave to word embeddings rather than word2vec.
@coolshoos · 6 years ago
Glad to see a relatively new video from you guys. I'm an old-time fan. And this is exactly what I'm attempting to learn right now.
@ghazibenyoussef8424 · 1 year ago
I'm new to AI, but impressed by what's happening now and the hype around NLP and deep learning. I'm self-learning all of this. I really like what you've done with downloading tweets and word embeddings. Is it possible to access your source code? Thanks.
@naveenkalhan95 · 4 years ago
Amazing, man... I was going through tens and tens of online resources to understand what word embeddings are! The way you explained it made me subscribe to your channel right away... Very well done, thank you very much.
@ritik84629 · 1 year ago
Temperature: 5 years ago. Feels like: 15 years ago.
@johnandersontorresmosquera1156 · 3 years ago
Awesome explanation, after hours of looking for good material to understand word embeddings. Thanks!
@nimeshsingh9271 · 4 years ago
This is a much better explanation than what's available in some of the paid courses.
@TestTest-tj4nt · 9 months ago
The app is still up, impressive.
@Aviator168 · 5 years ago
Great video. I was having difficulty understanding 'context'. You explained it clearly. Thank you.
@user-fy5go3rh8p · 3 years ago
I don't get why all the dislikes; the explanation is great.
@sadeebahsan4804 · 5 years ago
This is really intuitive. Most places I got answers like "representing words with vectors," which wasn't helpful. Now I think I have a proper idea.
@poonritchie · 6 years ago
Hi Macheads, I have run into so many hopelessly disappointing video presenters and live 'trainers' who just talk to themselves. Even worse, they make me confused about areas I already know, haha (even from the top IT corporations). Hope you can talk at our upcoming training session - just to demonstrate what a quality presentation of tech ideas looks like. You have an inborn ability to explain and motivate.
@haridotvenkat · 6 years ago
Excellent work. I was looking for such an explanation on word embeddings & I am happy that I found this. Thanks.
@radanici · 5 years ago
Just starting out venturing into this world. Thanks for the explanation. I have a medical background but no background in computer science, so this gives me a little bit of hope in learning something totally new.
@jessicas2978 · 4 years ago
Thank you so much for your video! It's the best learning material I could find on YouTube.
@partheshsoni1905 · 5 years ago
I liked the way you explained...crisp and clear!
@poonritchie · 6 years ago
I'm just learning embedding layers, and luckily I ran into this video.
@anamqureshi5263 · 2 years ago
Great Explanation!
@AmeerHamza-jy5ml · 2 years ago
Hope you are doing well. I'm interested in doing this for the Urdu language. I wish you would collaborate with me on it. Thanks.
@cristianjuarez1086 · 2 years ago
I wish I could understand word embeddings as well as you do; I'm still a beginner for now, but this is what I want to become. I also share your love for word embeddings, especially because I want to develop an NLP system or a language model that generates answers, but that's too ambitious at the moment.
@vijeta268 · 4 years ago
Your explanation was very clear and simple, thanks for making this video.
@Nova-Rift · 2 years ago
very well explained imo
@yoniziv · 3 years ago
This is gold! Thank you (from the future :-))
@alfital2 · 3 years ago
Awesome explanation, thanks.
@camilaferraz8153 · 2 years ago
Thanks for sharing! It helped a lot!
@darsh_shukla · 6 years ago
Man you are my teacher from now onwards.
@MSFTSTRIO · 5 years ago
Consider changing the tags and title of this video to something more like "Word2Vec uses" or along those lines, because this is the specific video I searched for, but it was quite far down in the search results.
@rasyaramesh7433 · 4 years ago
OMG, you look kind of like Hiccup from How to Train Your Dragon xD Also, thank you so much - you literally just saved my life.
@vanbap · 2 years ago
I really appreciate this video, sir!
@smritidey2942 · 5 years ago
Oh man, you are excellent at explaining word2vec; hoping to see some more on text NLP.
@varunverma744 · 5 years ago
That was an amazing explanation. Thank you!
@BlockDesignz · 4 years ago
This was brilliant. Keep on creating.
@garbour456 · 6 years ago
Awesome video, man - extremely well presented. I'm impressed with your presentation skills. Thanks for the video.
@nishankbani3257 · 6 years ago
Informative and interesting - raised my interest in the topic of word embeddings.
@100timezcooler · 1 year ago
wtf happened to this guy. his explanations are gold
@avaneeshpandey5611 · 2 months ago
he died in 2021 due to cancer
@jaysaha1967 · 4 years ago
The website is really cool🔥
@barteksielicki7276 · 6 years ago
Great explanation!
@NahinAndroid · 3 years ago
Beautiful, great work
@emenikeanigbogu9368 · 4 years ago
Amazing man. Thank you for your time!
@FuZZbaLLbee · 6 years ago
I think the famous calculation example is king - man + woman = queen. Update: ah, I see - you were referring to the example image on the GloVe website. Anyway, interesting stuff; I will take a look at GloVe as well. Currently trying to see if I can make a resumé / job-listing similarity model. I guess I can do something with the cosine distance between terms in the resumé and terms in the job listing. Ideas are welcome.
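
For reference, that analogy arithmetic can be reproduced with pretrained vectors. A minimal sketch, assuming gensim's downloader and its packaged GloVe vectors are available:

```python
import gensim.downloader as api

# Downloads the pretrained GloVe vectors (~130 MB) on first use.
glove = api.load("glove-wiki-gigaword-100")

# king - man + woman: positive terms are added, negative terms subtracted.
print(glove.most_similar(positive=["king", "woman"], negative=["man"], topn=3))
# 'queen' is typically the top result.
```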
@keres993 · 5 years ago
Brilliant explanation! Thank you!
@adage3256 · 6 years ago
Awesome recap!
@azai.mp4 · 6 years ago
I'm wondering if something similar to disentangled variational autoencoding could be used to improve a word2vec embedding. I'm not quite sure on the details, but DVA seems to have an effect similar to factor analysis and principal component analysis, producing a latent space whose dimensions are more akin to real "separate" dimensions - e.g., separate dimensions for the italic-ness and boldness of a written digit, as seen in the paper "Disentangled Variational Auto-Encoder for Semi-supervised Learning" by Yang Li et al. (I would link it, but YouTube has a history of assuming comments with links in them are spam.) If that technique translates well to word vectors, it could for example result in a model where "maleness" is its own dimension, i.e. "man" - "woman" ~= "king" - "queen" ~= (0, 0, 0, 1, 0, 0, ...) (a large vector parallel to one of the dimensions). Another interesting venture would be to pre-process the data using an NLP library, so that different forms of the same lemma are grouped together by default and homographs can be separated. That could also expose information a skip-gram or bag-of-words model would miss, such as dependency / sentence structure. I really ought to get my hands dirty some time instead of just thinking about this stuff in my head...
@vaibhavvaghela6234 · 6 years ago
Why have you stopped making videos, man? Miss your vids.
@bananakiu · 3 years ago
great video!
@reactorscience · 4 years ago
Great explanation!!!
@vidurwadhwa6897 · 6 years ago
Great explanation!! Thanks a lot
@lenant · 6 years ago
Very nice explanation, thanks
@jaradcollier2677 · 6 years ago
Is this really how word embeddings work? I'm blown away that this concept I thought was so complex is just a word co-occurrence matrix decomposed into a smaller matrix. Is there more to it than that, or is that the gist? More specifically: take text (documents), transform it into a tf-idf matrix of 1-word tokens, take the dot product of the tf-idf matrix with itself to get a square (co-occurrence) matrix, then decompose that into, say, 64 components. Each of those rows is a word vector, and the whole matrix is the word embedding at that point?
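
That is essentially the gist for count-based embeddings, though real systems typically weight the counts (e.g. PPMI, or GloVe's log-count objective) rather than factoring them raw. A minimal sketch of the pipeline described above, on a toy corpus:

```python
import numpy as np
from sklearn.decomposition import TruncatedSVD

sentences = [
    "the cat sat on the mat".split(),
    "the dog sat on the log".split(),
]
vocab = sorted({w for s in sentences for w in s})
index = {w: i for i, w in enumerate(vocab)}

# Symmetric co-occurrence counts with a +/-1 word window.
cooc = np.zeros((len(vocab), len(vocab)))
for s in sentences:
    for i in range(len(s) - 1):
        a, b = index[s[i]], index[s[i + 1]]
        cooc[a, b] += 1
        cooc[b, a] += 1

# Factor the square matrix into a tall, skinny one: each row is a word vector.
svd = TruncatedSVD(n_components=2)  # 64 in the video's example
vectors = svd.fit_transform(cooc)
print(vectors.shape)  # (len(vocab), 2)
```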
@ridhwanfranc · 3 years ago
amazing video bro
@medhj9679 · 4 years ago
Thanks man! Good explanation.
@yeahorightbro · 7 years ago
Just checked out the web app and was wondering how you put that together? Django? It is brilliant!
@piyalikarmakar5979 · 3 years ago
Sir, I have one query: what exactly does the output layer predict - the embedding of the input word, or the context of the input word?
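
A hedged answer from the standard skip-gram formulation (the video may frame it differently): the output layer predicts a probability distribution over context words o given the center word c, while the embedding itself is read off the input weight matrix, not the output layer:

```latex
P(o \mid c) = \frac{\exp\!\left(u_o^{\top} v_c\right)}{\sum_{w \in V} \exp\!\left(u_w^{\top} v_c\right)}
```

Here v_c is the input (embedding) vector of the center word and the u_w are the output-layer vectors.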
@yuxiang3147 · 2 years ago
Can you just decompose a square matrix like this? The square matrix has 10^10 entries, but the two decomposed matrices have only 1.28×10^7 unknowns in total. That means you have 10^10 equations for 1.28×10^7 unknowns, so no solution will satisfy every entry of the original matrix. How do you deal with this?
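
A hedged sketch of the usual answer: the system is deliberately overdetermined, so no exact solution exists or is expected. The factors are instead chosen to minimize reconstruction error, for example

```latex
\min_{U,V}\; \bigl\lVert M - U V^{\top} \bigr\rVert_F^2,
\qquad M \in \mathbb{R}^{n \times n},\quad U, V \in \mathbb{R}^{n \times 64},
```

optimized by gradient descent rather than solved exactly. GloVe additionally weights each entry so frequent co-occurrences don't dominate the objective.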
@ROHAN0APK · 6 years ago
Brilliant video! Thanks :)
@mahdip.4674 · 6 years ago
Thanks for the video. I have seen GloVe models that contain stop words, which basically means they do not remove stop words. I assume one can remove them and create two different models or sets of vectors; if so, I assume there is not much to say about the precision of the two approaches, right? The other thing is that for embeddings we do not apply stemming or similar techniques, since the process works largely at the context level, right?
@bieberssaman3805 · 4 years ago
Wow, great video.
@alexanderblumin6659 · 2 years ago
Very helpful!!!
@gabriellevaillant5153 · 4 years ago
If I understood correctly: you multiply the first weight matrix (between the input layer and the hidden layer) by each one-hot word vector (composed of 0s and 1s) to obtain the embedding matrix? (The weight matrix having been obtained with backward propagation, etc.?) Thanks :)
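
Roughly, yes - with the caveat that no real multiplication is needed. A minimal numpy sketch showing that the one-hot product just selects a row, so the trained input weight matrix is itself the embedding table:

```python
import numpy as np

vocab_size, dim = 5, 3
W = np.random.randn(vocab_size, dim)  # input-to-hidden weights, learned by backprop

one_hot = np.zeros(vocab_size)
one_hot[2] = 1.0  # the word with vocabulary index 2

# The product selects row 2 of W, so implementations do a row lookup
# instead of an actual matrix multiplication.
assert np.allclose(one_hot @ W, W[2])
```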
@techynerdz9566 · 6 years ago
Hey, how long did it take to download all that Twitter data? Also, did you run it in the cloud or directly on your Mac? And if you ran it in the cloud, which service provider did you use? Thanks for your videos.
@matthisc5100 · 2 years ago
nice video, thanks
@kanwar793 · 7 years ago
I'm a regular follower... keep it up!!
@deniscandido4116 · 7 years ago
Hello, did you invest time in learning all the calculus, like doing partial derivatives by hand, or can you abstract that away? I kind of slip when I see mathematical content... but I'm able to build a CNN in TensorFlow without problems. Is this the painful way?
@amardeepganguly6676 · 5 years ago
Amazing explanation, brother, thank you.
@mgevirtz · 2 years ago
You are wrong. Today is the most beautiful color.
@juleswombat5309 · 5 years ago
That was pretty awesome. It would be nice to see some code.
@vallurirajesh · 1 year ago
Steve Jobs, was that you?
@manedinesh · 5 years ago
Very well explained, word2vec vs. GloVe. Thank you.
@Pakrdjdjdnsnsmskzozovyff · 5 years ago
Great Job!
@ugurkaraaslan9285 · 3 years ago
Thank you very much. How do we decide the number of features in the embedding matrix? When you deal with colors you have 3 features (R, G, B), but how can I define the features for a 10,000-word corpus? Thanks in advance.
@Schmuck · 7 years ago
Hey, do you get many job offers? Your GitHub is phenomenal and you have a fair-sized YouTube channel with tons of videos, so I'd imagine tons of companies would love to hire you (just a question out of curiosity).
@macheads101 · 7 years ago
I get inquiries somewhat regularly, but not actual offers. Most recruiters just send out a generic-looking email asking if I'm interested in applying for a position. Occasionally, the CEO/CTO of a small company will contact me directly, but they still word their emails more as a question than as an offer.
@marcelobeckmann9552 · 6 years ago
They are still trying to understand the great job this genius is doing; for that reason they are far from making him an offer, because they don't know what to do with the huge volume of knowledge he holds ;)
@AIandtheworld · 6 years ago
what's your github? :D
@hamidkhalil9598 · 5 years ago
Subscribed after 1 minute!
@hugopristauz538 · 11 months ago
good job
@ludennis0606 · 6 years ago
Definition of word embeddings in one sentence: 4:21. Thanks for the cool tool and explanation. TU.
@bilalchandiabaloch8464 · 4 years ago
Superb.
@blizzy78 · 1 year ago
Greetings from a GPT-4 future! 👋
@nssSmooge · 5 years ago
So far I have only been able to use a normal DTM with tf-idf to compare speeches given at the UN, using text2vec in R. Not sure if I used it correctly, though. It has an option for GloVe too, which I want to try out; I am a beginner and started in R. Python is so confusing for me, and don't get me started on the gensim packages and docs.