[Demo] Domain-Adaptive Pretrained LM with Mixture of Experts | Multilingual MoE | The Fifth Elephant 

Hasgeek TV

Winning project 🏆
Improving Performance Across Languages and Domains in Language Models
How can domain-adaptive pretrained models transform the landscape of language models? This project proposes leveraging a Mixture of Experts (MoE) to build specialized expert models for individual languages.
By combining the strengths of these individual models, the aim is to improve performance across diverse domains and make language model adaptation more efficient and versatile. The work includes training new BPE SentencePiece tokenizers for Hindi and Kannada and pretraining models on tasks such as machine translation, in-context learning, question answering, and text classification.
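A minimal sketch of how such a tokenizer could be trained with the sentencepiece library; the corpus file, model prefix, and vocabulary size below are illustrative assumptions, not values taken from the project:

```python
# Sketch: train a BPE SentencePiece tokenizer on a monolingual corpus.
# File names, vocab size, and coverage are assumptions for illustration.
import sentencepiece as spm

spm.SentencePieceTrainer.train(
    input="hindi_corpus.txt",      # hypothetical monolingual Hindi corpus
    model_prefix="hindi_bpe",      # writes hindi_bpe.model / hindi_bpe.vocab
    vocab_size=32000,              # assumed vocabulary size
    model_type="bpe",              # BPE subword algorithm
    character_coverage=0.9995,     # high coverage helps Indic scripts
)

# Load the trained model and tokenize a sample sentence.
sp = spm.SentencePieceProcessor(model_file="hindi_bpe.model")
print(sp.encode("नमस्ते दुनिया", out_type=str))
```

The same recipe would be repeated per language, giving each expert a tokenizer fitted to its own script and subword statistics.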
It implements a Mixture of Experts setup using the Switch Transformer's routing algorithm, an approach aimed at redefining language model adaptation.
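A minimal sketch of Switch Transformer style top-1 routing with its auxiliary load-balancing loss; PyTorch, the layer dimensions, and the expert MLP shape are assumptions for illustration, not the project's actual configuration:

```python
# Sketch: Switch Transformer top-1 routing (Fedus et al., 2021).
# Dimensions, number of experts, and the expert MLP are illustrative only.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SwitchMoELayer(nn.Module):
    def __init__(self, d_model=512, d_ff=2048, num_experts=4):
        super().__init__()
        self.router = nn.Linear(d_model, num_experts)   # produces routing logits
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.ReLU(), nn.Linear(d_ff, d_model))
            for _ in range(num_experts)
        )
        self.num_experts = num_experts

    def forward(self, x):
        # x: (num_tokens, d_model) — tokens flattened across batch and sequence.
        probs = F.softmax(self.router(x), dim=-1)        # (tokens, experts)
        gate, expert_idx = probs.max(dim=-1)             # top-1 expert per token

        out = torch.zeros_like(x)
        for e in range(self.num_experts):
            mask = expert_idx == e
            if mask.any():
                # Each token goes to exactly one expert; the expert output
                # is scaled by the router probability for that expert.
                out[mask] = gate[mask].unsqueeze(-1) * self.experts[e](x[mask])

        # Auxiliary load-balancing loss from the Switch Transformer paper:
        # pushes both token counts and router mass to spread across experts.
        frac_tokens = torch.bincount(expert_idx, minlength=self.num_experts).float() / x.size(0)
        frac_probs = probs.mean(dim=0)
        aux_loss = self.num_experts * torch.sum(frac_tokens * frac_probs)
        return out, aux_loss
```

Top-1 routing keeps per-token compute close to a dense model, while the auxiliary loss discourages all tokens from collapsing onto a single expert.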
Follow the project on: hasgeek.com/fi...
Demo by Akash Kamalesh, Anirudh Lakhotia, Tanistha Hota
🏆 The prizes for this hackathon have been sponsored by Meta
👉🏽 Visit has.gy/at1k to catch up on the happenings at The Fifth Elephant Open Source AI Hackathon. Stay tuned for the announcement of the next edition of the Open Source AI Hackathon.
#Hasgeek #TheFifthElephant #LanguageModels #DomainAdaptation #MixtureOfExperts #LLMs #MachineTranslation

Published: 5 Sep 2024
