
What's under ChatGPT's hood? Deep Learning Transformer Architecture Visually Explained

Data Science Demonstrated
3.3K subscribers
356 views

The deep learning transformer architecture, proposed in the paper "Attention Is All You Need", is the basis of ChatGPT and many other large language model applications.
Get a visual walkthrough of the transformer architecture in this video (a minimal code sketch of the chapter topics follows the description).
00:00 Introduction
01:34 Data Inputs
02:30 Input Embeddings
03:48 Positional Encoding
04:36 Multi-head Attention
05:31 Feed Forward
07:27 Try it out yourself Demo
Try out the demo at experiencedata...
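The chapters above mirror the encoder-side building blocks from "Attention Is All You Need". Below is a minimal, illustrative PyTorch sketch, not the video's code: the class name, hyperparameters, and the use of torch.nn.MultiheadAttention are assumptions made here to show how input embeddings, positional encoding, multi-head self-attention, and the feed-forward sub-layer fit together.

```python
# Minimal sketch of one transformer encoder block (illustrative, not from the video).
import math
import torch
import torch.nn as nn

class MiniTransformerBlock(nn.Module):
    def __init__(self, vocab_size=1000, d_model=64, n_heads=4, max_len=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)   # Input Embeddings (02:30)

        # Positional Encoding (03:48): precomputed sinusoidal table, stored as a buffer
        pos = torch.arange(max_len).unsqueeze(1)
        div = torch.exp(torch.arange(0, d_model, 2) * (-math.log(10000.0) / d_model))
        pe = torch.zeros(max_len, d_model)
        pe[:, 0::2] = torch.sin(pos * div)
        pe[:, 1::2] = torch.cos(pos * div)
        self.register_buffer("pe", pe)

        # Multi-head Attention (04:36)
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)

        # Feed Forward (05:31): two linear layers with a nonlinearity in between
        self.ff = nn.Sequential(
            nn.Linear(d_model, 4 * d_model), nn.ReLU(), nn.Linear(4 * d_model, d_model)
        )
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)

    def forward(self, token_ids):                         # token_ids: (batch, seq_len)
        x = self.embed(token_ids) + self.pe[: token_ids.size(1)]  # embeddings + positions
        a, _ = self.attn(x, x, x)                          # self-attention over the sequence
        x = self.norm1(x + a)                              # residual connection + layer norm
        return self.norm2(x + self.ff(x))                  # residual connection + layer norm

# Example: run one block over a batch of 2 sequences of 10 token ids
out = MiniTransformerBlock()(torch.randint(0, 1000, (2, 10)))
print(out.shape)  # torch.Size([2, 10, 64])
```

The residual connections and layer norms are part of the standard encoder block even though they do not get their own chapter above.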

Published: 29 Aug 2024

Comments: 3
@jeanmarcel6131 • 9 months ago
Excellent explanation! Thanks
@samannwaysil4412 • 9 months ago
Please try to enhance your audio quality.
@DataScienceDemonstrated • 9 months ago
Will do. Thanks for the feedback