We help you wrap your head around relative positional embeddings as they were first introduced in the “Self-Attention with Relative Position Representations” paper.
➡️ AI Coffee Break Merch! 🛍️ aicoffeebreak.creator-spring....
Related videos:
📺 Positional embeddings explained: • Positional embeddings ...
📺 Concatenated, learned positional encodings: • Adding vs. concatenati...
📺 Transformer explained: • The Transformer neural...
Papers:
📄 Shaw, Peter, Jakob Uszkoreit, and Ashish Vaswani. "Self-Attention with Relative Position Representations." In Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 2 (Short Papers), pp. 464-468. 2018. arxiv.org/pdf/1803.02155.pdf
📄 Vaswani, Ashish, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Łukasz Kaiser, and Illia Polosukhin. "Attention Is All You Need." In Advances in Neural Information Processing Systems, pp. 5998-6008. 2017. proceedings.neurips.cc/paper/...
💻 Implementation for Relative Position Embeddings: github.com/AliHaiderAhmad001/...
Outline:
00:00 Relative positional representations
02:15 How do they work?
07:59 Benefits of relative vs. absolute positional encodings
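To complement the "How do they work?" segment, here is a minimal NumPy sketch of self-attention with relative position representations in the spirit of Shaw et al. (2018). All names and shapes are illustrative assumptions, not taken from the linked implementation: relative distances j − i are clipped to a window, a learned embedding per clipped distance is added to the key side of the attention logits.

```python
# Minimal sketch of self-attention with relative position representations
# (Shaw et al., 2018). Illustrative only; single head, no masking.
import numpy as np

def relative_self_attention(x, Wq, Wk, Wv, rel_k, max_rel=4):
    """x: (seq_len, d); rel_k: (2*max_rel+1, d) learned relative-key embeddings."""
    n, d = x.shape
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    # Clip relative distances j - i to [-max_rel, max_rel], shift to valid indices
    idx = np.clip(np.arange(n)[None, :] - np.arange(n)[:, None],
                  -max_rel, max_rel) + max_rel
    a_k = rel_k[idx]  # (n, n, d): one relative-key embedding per (i, j) pair
    # Content-content term plus content-to-relative-position term
    scores = (q @ k.T + np.einsum('id,ijd->ij', q, a_k)) / np.sqrt(d)
    # Softmax over the key dimension
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v

rng = np.random.default_rng(0)
n, d = 6, 8
out = relative_self_attention(
    x=rng.normal(size=(n, d)),
    Wq=rng.normal(size=(d, d)), Wk=rng.normal(size=(d, d)),
    Wv=rng.normal(size=(d, d)), rel_k=rng.normal(size=(2 * 4 + 1, d)))
print(out.shape)  # (6, 8)
```

Because only the clipped distance j − i matters, the same `rel_k` table generalizes to positions never seen in training, which is one of the benefits over absolute encodings discussed in the video.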
Music 🎵 : Holi Day Riddim - Konrad OldMoney
✍️ Arabic Subtitles by Ali Haidar Ahmad / ali-ahmad-0706a51bb
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
🔥 Optionally, pay us a coffee to help with our Coffee Bean production! ☕
Patreon: / aicoffeebreak
Ko-fi: ko-fi.com/aicoffeebreak
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
🔗 Links:
AICoffeeBreakQuiz: / aicoffeebreak
Twitter: / aicoffeebreak
Reddit: / aicoffeebreak
YouTube: / aicoffeebreak
#AICoffeeBreak #MsCoffeeBean #MachineLearning #AI #research
10 Jul 2024