More than ten years ago, researchers at Google trained a neural network to assign each word a vector (a sequence of numbers) in such a way that the vector captures the semantics of the word.
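The classic demonstration of this idea is vector arithmetic on word embeddings, e.g. king − man + woman landing near queen. Here is a minimal sketch of that arithmetic using hand-made toy vectors (the numbers are purely illustrative, not taken from a trained model; a real experiment would use pretrained embeddings such as Word2Vec or GloVe):

```python
import numpy as np

# Illustrative 4-dimensional vectors (NOT from a trained model).
# Dimensions loosely read as [royalty, masculinity, femininity, person].
words = {
    "king":  np.array([0.9, 0.8, 0.1, 0.9]),
    "queen": np.array([0.9, 0.1, 0.8, 0.9]),
    "man":   np.array([0.1, 0.9, 0.1, 0.9]),
    "woman": np.array([0.1, 0.1, 0.9, 0.9]),
    "apple": np.array([0.0, 0.1, 0.1, 0.0]),
}

def cosine(a, b):
    # Cosine similarity: how closely two vectors point the same way.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# The famous analogy: king - man + woman should land near queen.
target = words["king"] - words["man"] + words["woman"]
best = max((w for w in words if w not in ("king", "man", "woman")),
           key=lambda w: cosine(target, words[w]))
print(best)  # queen
```

With real pretrained vectors the same arithmetic works over tens of thousands of words, which is what made the original experiment so striking.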
You can find the notebook used in this example here: github.com/alkampfergit/ai-pl...
▬ Contents of this video ▬▬▬▬▬▬▬▬▬▬
00:00 - Introduction to Word2Vec
01:15 - Explanation and Concepts Demonstrated in Python Code
01:31 - Famous Semantics Experiment Example
02:41 - Code Example of Semantic Vectorization
05:18 - Introduction to Sentence Vectorization
05:57 - Transition to a Model Trained for Embedding
08:02 - Transforming Words into a Vector Using the Distilbert Model
10:08 - Function Creation for Semantic Vectorization and Math Operations
12:32 - Conclusions
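The chapters above move from single words to whole sentences: a model such as DistilBERT produces one vector per token, and those are pooled into a single sentence vector. A minimal conceptual sketch of that pooling step, using hand-made token vectors in place of real model output (in the video the DistilBERT model produces them):

```python
import numpy as np

# Toy per-token vectors standing in for a transformer's output
# (purely illustrative values, not from a real model).
token_vectors = {
    "the":    np.array([0.1, 0.2, 0.0]),
    "cat":    np.array([0.9, 0.1, 0.3]),
    "sleeps": np.array([0.2, 0.8, 0.5]),
}

def embed_sentence(tokens):
    """Mean pooling: average the token vectors into one sentence vector."""
    return np.mean([token_vectors[t] for t in tokens], axis=0)

v = embed_sentence(["the", "cat", "sleeps"])
print(v.shape)  # (3,)
```

Once sentences are vectors, the same math operations shown for words (cosine similarity, vector addition) apply to whole sentences as well.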
Jul 29, 2024