
Live Day 4 - Word Embedding, CBOW and Skip-gram (Word2Vec) | NLP and Quiz - 5000 INR Giveaway

Krish Naik
Subscribe · 1M subscribers
71K views

All materials will be added in the dashboard below. Enroll for free:
ineuron.ai/course/NLP-Foundat...
We are happy to announce that iNeuron is coming up with a 6-month Live Full Stack Data Analytics batch with job assistance and internship, starting from 18th June 2022. The instructors of the course will be me and Sudhanshu. The course price is really affordable: 4000 INR including GST.
The course content will be available for a lifetime, along with prerecorded videos.
You can check the course syllabus below.
Course link: courses.ineuron.ai/Full-Stack...
From my side, you can avail an additional 10% off by using the Krish10 coupon code.
Don't miss this opportunity and grab it before it's too late. Happy Learning!!

Published: 19 Jun 2022

Comments: 44
@mengzhuwang7233 · 1 year ago
I really appreciate your efforts! Whenever I'm in doubt about an ML concept, the first thing that comes to my mind is to search your channel. I really love your teaching, and I don't know how many times I have said this in the comment section!
@aftabnaseem · 11 months ago
You are one of the best teachers here. Whatever you teach goes through so easily. I appreciate your dedication. Thumbs up from Pakistan.
@ritikkohad5045 · 1 year ago
For the people who are getting confused between cosine similarity and cosine distance: in cosine similarity, a value towards 1 means the two points or vectors are very similar, while a value towards 0 means little or no similarity. In the video at 26:50, Krish sir is talking about cosine distance, not cosine similarity. In cosine distance, a value towards 0 means the points or vectors are similar, and vice versa.
@mirzaimadbaig · 5 months ago
thx man!
@theraizadatalks14 · 1 month ago
Yes, you're correct! Cosine similarity measures similarity, ranging from -1 (dissimilar) to 1 (similar); a higher value means more similarity in direction. Cosine distance, although not a standard term, is commonly used to denote 1 − cosine_similarity. It quantifies dissimilarity: a lower value (closer to 0) means more similarity, and a higher value means more dissimilarity.
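The distinction the commenters describe can be checked in a few lines of Python (a minimal sketch; the example vectors are made up for illustration):

```python
import math

def cosine_similarity(a, b):
    # Dot product of the two vectors divided by the product of their norms.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def cosine_distance(a, b):
    # Common convention: distance = 1 - similarity.
    return 1 - cosine_similarity(a, b)

# Vectors pointing in the same direction: similarity near 1, distance near 0.
print(cosine_similarity([1, 2, 3], [2, 4, 6]))  # ~1.0
print(cosine_distance([1, 2, 3], [2, 4, 6]))    # ~0.0

# Orthogonal vectors: similarity 0, distance 1.
print(cosine_similarity([1, 0], [0, 1]))        # 0.0
```

Note that cosine similarity ignores vector magnitude: [1, 2, 3] and [2, 4, 6] point the same way, so their similarity is 1 even though their lengths differ.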
@geovanacarreira6110 · 1 year ago
I'm really loving the content. Thank you for your time and patience.
@santhoshvictor9509 · 9 months ago
Hi Krish, I'm super grateful for your videos. You are helping a lot of students who cannot afford to pay to learn. I got into DS because of you. Thank you for everything! Please continue doing whatever you are doing.
@laythherzallah3493 · 2 years ago
I don't know how to thank you man .. you are the best
@progamer0256 · 2 years ago
Sir, after NLP, please start a community session on a complete computer vision series.
@akshayshendre7408 · 1 year ago
Thank you Krish, really grateful for the video. I have learnt almost everything from your videos. However, sometimes it is difficult to understand the context between the lines, and sometimes you say self-contradictory things. Here, for example, it was difficult to understand the use of the output feature ('is', 'related', 'to') versus the output layer of the ANN, which without context again comes down to the words in the document. Please try to explain it as a sequential model. Maybe it's just me facing this issue, and maybe this comment will get ignored altogether, but thanks for the efforts. I will try to browse word2vec for better understanding. I have gone through each community session of ML and deep learning and now I am on NLP, following the roadmap you suggested. I am also in the middle of a career transition from core disease and biology research to data science. Hopefully I will be able to make a successful transition. Keep the good work going for people like me...
@pruthwikmishra5368 · 2 years ago
The vocabulary in one-hot encoding is usually a set, so the size of the vocabulary in question 2 is 3, not 4. A one-hot encoding is a single 1 with the rest zeros.
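The point about the vocabulary being a set can be illustrated like this (a minimal sketch; the word list is an illustrative stand-in for the quiz question):

```python
def one_hot_encode(words):
    # The vocabulary is a set: duplicate tokens collapse into one entry.
    vocab = sorted(set(words))
    index = {word: i for i, word in enumerate(vocab)}
    vectors = []
    for word in words:
        vec = [0] * len(vocab)   # a single 1, with the rest zeros
        vec[index[word]] = 1
        vectors.append(vec)
    return vocab, vectors

words = ["good", "boy", "good"]   # 3 tokens, but only 2 unique words
vocab, vectors = one_hot_encode(words)
print(vocab)     # ['boy', 'good']
print(vectors)   # [[0, 1], [1, 0], [0, 1]]
```

Because the vocabulary deduplicates tokens, each one-hot vector has length equal to the number of *unique* words, not the number of tokens in the sentence.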
@omarsalam7586 · 1 year ago
Thank you for your time and patience ❤
@drzahraamini6418 · 2 years ago
Krish, great video again, thank you so much. Just one question: what hardware are you using to write your notes in these videos? Appreciate your answer.
@mrityunjayupadhyay7332 · 2 years ago
Thanks a lot for this amazing lecture
@jaffa-nm6cu · 1 year ago
@1:16:52 word embeddings don't have 500 dimensions for 500 words; if that were the case, there would be no difference between BOW and word embeddings.
@Ayyappamanasu · 2 years ago
Thanks!
@harshitsingh715 · 1 year ago
Hard to understand fully without examples, but thank you for the effort.
@MMEELL11 · 1 year ago
Thank you for creating such valuable content.
@gourabbanerjee9531 · 6 months ago
Hey, I think what you have shown for CBOW is one-hot encoding, not bag of words.
@rizkyputrakurniawan24 · 5 days ago
Thank You Sir !
@__mothership__8475 · 1 year ago
46:46 So you mean if we want to represent a word with 300 dimensions, then window_size has to be 300? Because in your case window_size is 5 and you represent a word by a 5-dimensional vector.
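The window size and the embedding dimension are actually independent hyperparameters: the window only controls how many neighbouring words form the context for each training pair, while the dimension is the length of the learned vector. A minimal sketch (the gensim-style names `window` and `vector_size` and the example sentence are used here for illustration; the vectors are random initial values, not trained embeddings):

```python
import random

sentence = ["krish", "channel", "is", "related", "to", "data", "science"]

window = 2        # how many words on each side count as context
vector_size = 10  # length of each word's embedding; unrelated to window

# Generate (target, context) skip-gram training pairs from the sliding window.
pairs = []
for i, target in enumerate(sentence):
    for j in range(max(0, i - window), min(len(sentence), i + window + 1)):
        if j != i:
            pairs.append((target, sentence[j]))

# Each word still gets a vector of length vector_size, regardless of window.
embeddings = {w: [random.uniform(-0.5, 0.5) for _ in range(vector_size)]
              for w in set(sentence)}

print(pairs[:4])                # first few (target, context) pairs
print(len(embeddings["data"]))  # 10, regardless of window
```

Changing `window` changes which pairs are generated, but every embedding stays `vector_size` long; in practice both are tuned separately (e.g. window 5 with 300-dimensional vectors).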
@osikoyaadeola2530 · 2 years ago
Thanks a lot sir.
@Arpitvijaywargiya · 1 month ago
@krish, the method you used at 36:41 to represent the sentence is one-hot encoding, not BOW.
@suryarawat4829 · 2 years ago
I am unable to see your notes on the dashboard. Could you please tell me the exact location where I can find those?
@marcpolmans2546 · 2 years ago
The best channel for learning Data Science. Thank you for all your effort and knowledge sharing!
@nishah4058 · 2 years ago
One simple gratitude: thank you very much. I cleared so many doubts. Just one question: if I do code-mixed sentiment analysis, how do I handle Hindi words? I don't want to use Google Translate. For example, for a sentence like "aaj to fun day tha", how would I input this via Python?
@sumankumarsinha9168 · 2 years ago
Dashboard link is not working. Could you please check it? Thanks in Advance.
@sandipansarkar9211 · 2 years ago
Where is the Google Colab link? Can you please share it? I am unable to find it.
@hargovind2776 · 2 years ago
Keep teaching NLP sir!!
@riyaz8072 · 1 year ago
'Is' is a stop word, right? We remove it in the text preprocessing stage, so why are you considering 'is' as your output word?
@shankarbasu9357 · 3 months ago
But nouns, prepositions, and conjunctions will have high frequency.
@projjalpaul3037 · 7 months ago
Sir, I need the study material, but the description link says the page is not available.
@marsgalaxy6734 · 2 years ago
Great
@user-gw5kf6ih9s · 3 months ago
Can anyone help me with the .ipynb file in this video (please provide the link)?
@shankarbasu9357 · 3 months ago
Could you please explain one-hot encoding again?
@relexationmusic6705 · 7 months ago
Kindly share the notebook file; the link you provided has been removed from the website or may have an issue.
@gh504 · 2 years ago
Super session
@nayeemmollick6065 · 1 year ago
Is anyone else finding difficulties while installing punkt in NLTK?
@dhanashreeyadav8220 · 1 year ago
Hi Sir, I am getting the following error while importing the gensim library: *ValueError: numpy.ndarray size changed, may indicate binary incompatibility. Expected 96 from C header, got 88 from PyObject*. Please help me solve this error.
@sukanyamukherjee961 · 7 months ago
Where are you getting the code from? I tried the link in the description box; it's showing a 404 error. Can you help?
@sagarbp-2854 · 3 months ago
Is ineuron link working ?
@sam-uw3gf · 1 month ago
no
@sandipansarkar9211 · 2 years ago
finished watching
@jaffa-nm6cu · 1 year ago
Ok