
Word2Vec and How AI Started a Revolution in NLP 10 Years Ago

CodeWrecks

More than ten years ago, researchers at Google trained a neural network to assign each word a vector (a sequence of numbers) in such a way that the vector captures the semantics of the word.
You can find the notebook used in this example here: github.com/alkampfergit/ai-pl...
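To make that idea concrete, here is a minimal sketch of the famous "king - man + woman ≈ queen" experiment the video walks through, assuming gensim and its downloadable pre-trained Google News Word2Vec vectors (the notebook may load the model differently):

```python
# A minimal sketch, assuming gensim's downloader and the pre-trained
# Google News Word2Vec vectors (~1.6 GB download on first use).
import gensim.downloader as api

vectors = api.load("word2vec-google-news-300")  # returns a KeyedVectors object

# Vector arithmetic on word embeddings: king - man + woman ≈ queen
result = vectors.most_similar(positive=["king", "woman"], negative=["man"], topn=1)
print(result)  # expected: [('queen', ~0.71)]
```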
▬ Contents of this video ▬▬▬▬▬▬▬▬▬▬
00:00 - Introduction to Word2Vec
01:15 - Explanation and Concepts Demonstrated in Python Code
01:31 - Famous Semantics Experiment Example
02:41 - Code Example of Semantic Vectorization
05:18 - Introduction to Sentence Vectorization
05:57 - Transition to a Model Trained for Embedding
08:02 - Transforming Words into a Vector Using the Distilbert Model
10:08 - Function Creation for Semantic Vectorization and Math Operations
12:32 - Conclusions
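The chapters from 05:57 onward move from single words to sentence vectors built with DistilBERT. A minimal sketch of that step, assuming the Hugging Face transformers library and the distilbert-base-uncased checkpoint (the notebook's pooling strategy may differ):

```python
# A minimal sketch: mean-pool DistilBERT's last hidden states into one
# fixed-size sentence vector, then compare sentences by cosine similarity.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
model = AutoModel.from_pretrained("distilbert-base-uncased")

def embed(sentence: str) -> torch.Tensor:
    # Tokenize, run the model without gradients, and average token vectors
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)
    return outputs.last_hidden_state.mean(dim=1).squeeze(0)

a = embed("The cat sat on the mat.")
b = embed("A feline rested on the rug.")
print(torch.cosine_similarity(a, b, dim=0).item())  # higher for similar meanings
```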

Science

Published: 29 Jul 2024

Comments: 2
@bitbreaker79, 1 month ago
Nice video! Clear explanation (as usual)... I'm looking forward to tinkering with it once I get a decent PC with enough RAM and a GPU.
@codewrecks, 1 month ago
Just try the online services :) (OpenAI, Cohere, etc.). Also, local embedding models only need a small GPU to run at decent speed; they are really small compared to an LLM.
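For anyone following that advice, here is a minimal sketch of the online route, assuming the OpenAI Python SDK (v1+) and the text-embedding-3-small model (Cohere exposes a similar endpoint):

```python
# A minimal sketch, assuming OPENAI_API_KEY is set in the environment.
from openai import OpenAI

client = OpenAI()
resp = client.embeddings.create(
    model="text-embedding-3-small",
    input="Word2Vec started a revolution in NLP",
)
vector = resp.data[0].embedding  # a plain list of floats
print(len(vector))  # 1536 dimensions for this model
```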