
Self-attention mechanism explained | Self-attention explained | scaled dot product attention 

Unfold Data Science
85K subscribers
703 views

Self-attention mechanism explained | Self-attention explained | self-attention in deep learning
#ai #datascience #machinelearning
Hello,
My name is Aman and I am a Data Scientist.
All amazing data science courses at the most affordable price here: www.unfolddatascience.com
Book a one-on-one session here (note: these support sessions are chargeable): docs.google.com/forms/d/1Wgle...
Follow on Instagram: unfold_data_science
About Unfold Data Science: This channel helps people understand the basics of data science through simple examples in an easy way. Anybody without prior knowledge of computer programming, statistics, machine learning, or artificial intelligence can get a high-level understanding of data science through this channel. The videos are not very technical in nature and hence can be easily grasped by viewers from different backgrounds.
Book recommendation for Data Science:
Category 1 - Must Read For Every Data Scientist:
The Elements of Statistical Learning by Trevor Hastie - amzn.to/37wMo9H
Python Data Science Handbook - amzn.to/31UCScm
Business Statistics By Ken Black - amzn.to/2LObAA5
Hands-On Machine Learning with Scikit Learn, Keras, and TensorFlow by Aurelien Geron - amzn.to/3gV8sO9
Category 2 - Overall Data Science:
The Art of Data Science By Roger D. Peng - amzn.to/2KD75aD
Predictive Analytics By Eric Siegel - amzn.to/3nsQftV
Data Science for Business By Foster Provost - amzn.to/3ajN8QZ
Category 3 - Statistics and Mathematics:
Naked Statistics By Charles Wheelan - amzn.to/3gXLdmp
Practical Statistics for Data Scientist By Peter Bruce - amzn.to/37wL9Y5
Category 4 - Machine Learning:
Introduction to machine learning by Andreas C Muller - amzn.to/3oZ3X7T
The Hundred Page Machine Learning Book by Andriy Burkov - amzn.to/3pdqCxJ
Category 5 - Programming:
The Pragmatic Programmer by David Thomas - amzn.to/2WqWXVj
Clean Code by Robert C. Martin - amzn.to/3oYOdlt
My Studio Setup:
My Camera: amzn.to/3mwXI9I
My Mic: amzn.to/34phfD0
My Tripod: amzn.to/3r4HeJA
My Ring Light: amzn.to/3gZz00F
Join the Facebook group:
groups/41022...
Follow on medium: / amanrai77
Follow on quora: www.quora.com/profile/Aman-Ku...
Follow on Twitter: @unfoldds
Watch the Introduction to Data Science full playlist here: • Data Science In 15 Min...
Watch python for data science playlist here:
• Python Basics For Data...
Watch the statistics and mathematics playlist here:
• Measures of Central Te...
Watch End to End Implementation of a simple machine-learning model in Python here:
• How Does Machine Learn...
Learn Ensemble Model, Bagging, and Boosting here:
• Introduction to Ensemb...
Build Career in Data Science Playlist:
• Channel updates - Unfo...
Artificial Neural Network and Deep Learning Playlist:
• Intuition behind neura...
Natural language Processing playlist:
• Natural Language Proce...
Understanding and building a recommendation system:
• Recommendation System ...
Access all my codes here:
drive.google.com/drive/folder...
Have a different question for me? Ask me here: docs.google.com/forms/d/1ccgl...
My Music: www.bensound.com/royalty-free...
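The video title refers to scaled dot-product attention; as a reference for viewers, here is a minimal Python/NumPy sketch of that computation. This is not the video's own code, and the shapes, names, and random inputs below are illustrative assumptions:

import numpy as np

def scaled_dot_product_self_attention(X, W_q, W_k, W_v):
    # X: (seq_len, d_model) token embeddings; W_q/W_k/W_v: learned projection matrices.
    Q = X @ W_q                       # queries
    K = X @ W_k                       # keys
    V = X @ W_v                       # values
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)   # (seq_len, seq_len) scaled similarity scores
    # softmax over the key dimension so each row of weights sums to 1
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V                # each output is an attention-weighted mix of values

# Illustrative usage with made-up sizes: 4 tokens, 8-dimensional embeddings
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
W_q, W_k, W_v = (rng.normal(size=(8, 8)) for _ in range(3))
out = scaled_dot_product_self_attention(X, W_q, W_k, W_v)
print(out.shape)  # (4, 8)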

Published: 15 May 2024

Comments: 12
@user-kt5qu2ne7l 15 days ago
Thank you for your clear explanation
@ajitkulkarni1702 14 days ago
Best explanation of self-attention!!!
@jayeshsingh116 14 days ago
Well explained, thank you for covering these topics
@AnkitGupta-rj4yy 13 days ago
Thank you for providing this to us ❤ in an easy way
@dinu9670 10 days ago
You are a saviour, man. Great explanation. Please keep doing these videos 🙏
@UnfoldDataScience 9 days ago
Thanks, will do!
@funwithtechnology6526 8 days ago
Thank you for the very clear explanation :). I have a small question here: in self-attention, is there a limit to the dimension of the final attention embedding space?
@manoj1bk 15 days ago
Can the self-attention mechanism be used (as an embedding layer) before an LSTM in the context of time series analysis?
@dhirajpatil6776 14 days ago
Please make a video explaining the transformer architecture
@ajitkulkarni1702 14 days ago
Please make videos on multi-head attention...
@RakeshKumarSharma-nc3cj 14 days ago
Awesome video
@UnfoldDataScience 11 days ago
Thanks Rakesh