
Enhance Cost Efficiency in Domain Adaptation with PruneMe 

MLOps World: Machine Learning in Production

Speaker: Shamane Siri, Ph.D., Head of Applied NLP Research, Arcee.ai
Our PruneMe repository, inspired by "The Unreasonable Ineffectiveness of the Deeper Layers," demonstrates a layer pruning technique for Large Language Models (LLMs) that enhances cost efficiency in domain adaptation. By removing redundant layers, we facilitate continual pre-training on streamlined models. Subsequently, these models can be merged into a top-performing general model using advanced techniques like Evolve Merging, offering a cost-effective approach to model optimization and adaptation.
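The description above refers to identifying redundant layers before continual pre-training. The paper it cites selects a block of consecutive layers whose input and output hidden states are most similar (lowest angular distance) and drops that block. Below is a minimal NumPy sketch of that block-selection idea, using synthetic activations; it is an illustration of the technique, not the PruneMe implementation, and the function names and shapes are assumptions.

```python
import numpy as np

def angular_distance(a, b):
    """Mean per-token angular distance between two activation matrices.

    a, b: (tokens, hidden) hidden states at two layers.
    Returns a value in [0, 1] (arccos of cosine similarity, scaled by pi).
    """
    cos = np.sum(a * b, axis=-1) / (
        np.linalg.norm(a, axis=-1) * np.linalg.norm(b, axis=-1)
    )
    cos = np.clip(cos, -1.0, 1.0)  # guard against floating-point overshoot
    return float(np.mean(np.arccos(cos) / np.pi))

def find_prunable_block(hidden_states, n):
    """Find the start index of the n-layer block whose removal changes least.

    hidden_states: list of (tokens, hidden) arrays, one per layer boundary.
    Compares the input of layer i with the input of layer i + n and returns
    the (start_index, distance) pair with the smallest angular distance.
    """
    dists = [
        angular_distance(hidden_states[i], hidden_states[i + n])
        for i in range(len(hidden_states) - n)
    ]
    start = int(np.argmin(dists))
    return start, dists[start]

# Synthetic demo: early layers produce unrelated activations, while the
# last three layer boundaries barely change the hidden state, so the
# deepest block should be flagged as prunable.
rng = np.random.default_rng(0)
hs = [rng.standard_normal((16, 64)) for _ in range(6)]
for _ in range(3):
    hs.append(hs[-1] + 1e-3 * rng.standard_normal((16, 64)))

start, dist = find_prunable_block(hs, n=3)
```

In a real pipeline the `hidden_states` would come from running a calibration dataset through the model (e.g. via `output_hidden_states=True` in Hugging Face Transformers), and the selected block of layers would be deleted before continual pre-training on the smaller model.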

Published: May 15, 2024
