
Paper Explained | LLM boosts GNN Accuracy via Knowledge Distillation 

Jack See

Paper: Large Language Model Meets Graph Neural Network in Knowledge Distillation
Conclusion: In this paper, we propose a novel LLM-to-GNN knowledge distillation framework termed LinguGKD, which integrates the semantic understanding capabilities of LLMs with the efficiency and structural insights of GNNs. LinguGKD employs TAG-oriented instruction tuning to train pre-trained LLMs as teacher models and introduces a layer-adaptive contrastive distillation strategy to align and transfer node features between teacher LLMs and student GNNs within a latent space. Extensive experiments across various LLM and GNN architectures on multiple datasets demonstrate that LinguGKD significantly enhances the predictive accuracy and convergence rate of GNNs without requiring additional training data or model parameters, making them highly practical for deployment in resource-constrained environments. Moreover, LinguGKD shows great potential for leveraging advancements in LLM research to continuously augment GNN performance.
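The layer-adaptive contrastive distillation described above can be sketched in a few lines. This is a minimal, illustrative sketch only, assuming an InfoNCE-style objective over node features already extracted from both models and projected into a shared latent space; the function names, the NumPy implementation, and the fixed temperature are my assumptions, not the paper's actual code.

```python
import numpy as np

def contrastive_distill_loss(gnn_feats, llm_feats, temperature=0.5):
    """InfoNCE-style loss pulling each node's student-GNN feature toward
    the same node's teacher-LLM feature (positive pair), while pushing it
    away from other nodes' teacher features (negatives). Illustrative sketch."""
    # L2-normalize both feature matrices (rows = nodes)
    g = gnn_feats / np.linalg.norm(gnn_feats, axis=1, keepdims=True)
    t = llm_feats / np.linalg.norm(llm_feats, axis=1, keepdims=True)
    sim = g @ t.T / temperature                   # pairwise cosine similarities
    sim = sim - sim.max(axis=1, keepdims=True)    # numerical stability
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    # positive pairs sit on the diagonal: node i's GNN feature vs. its LLM feature
    return -np.mean(np.diag(log_prob))

def layer_adaptive_loss(gnn_layers, llm_layers, weights):
    """Weighted sum of the contrastive loss over matched
    (student layer, teacher layer) feature pairs."""
    return sum(w * contrastive_distill_loss(g, t)
               for w, (g, t) in zip(weights, zip(gnn_layers, llm_layers)))
```

In a real training loop, the per-layer weights would typically be learned rather than fixed, and this loss would be combined with the student's ordinary node-classification objective.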
~~~~~~~~
Hi there, I am Jack See, a PhD student working on AI models for molecular graph prediction. In this video, I explain knowledge distillation from LLMs to GNNs. Enjoy, and leave a comment!
Find me on:
-Twitter: https:/_/ JackSee47284524 (remove the underscore)
-Linkedin: https:/_/www.linkedin.com/in/jack-see-096212244/ (remove the underscore)
#ai #research #airesearch #machinelearning #deeplearning #largelanguagemodels

Published: 16 Sep 2024

Comments: 2
@criticalnodecapital · 9 days ago
keep it up.. i need to do this.
@JackSee-wr3le · 9 days ago
Thanks man. Good luck to you as well!