
Yikang Shen: Ordered Neurons: Integrating Tree Structures into Recurrent Neural Networks (ICLR 2019)

Steven Van Vaerenbergh

Speaker: Yikang Shen
Paper: Ordered Neurons: Integrating Tree Structures into Recurrent Neural Networks
Authors: Yikang Shen, Shawn Tan, Alessandro Sordoni, Aaron Courville
In general, natural language is governed by a tree structure: smaller units (e.g., phrases) are nested within larger units (e.g., clauses). This hierarchy is strict: when a larger constituent ends, all of the smaller constituents nested within it must also be closed. While the standard LSTM allows different neurons to track information at different time scales, the architecture does not impose such a strict hierarchy. This paper adds that constraint by ordering the neurons: a vector of "master" input and forget gates ensures that when a given unit is updated, all of the units that follow it in the ordering are also updated. To this end, the authors propose a new RNN unit, ON-LSTM, which achieves good performance on four different tasks: language modeling, unsupervised parsing, targeted syntactic evaluation, and logical inference.
Presented at ICLR 2019
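
The master-gate mechanism described in the abstract lends itself to a short sketch. The following is a minimal, illustrative PyTorch cell, not the authors' released code: the class and argument names (OnLSTMCell, input_size, hidden_size) are invented for this example, and the paper's chunked downsampling of the master gates is omitted. The essential piece is cumax, a cumulative softmax that turns a score vector into a soft monotone gate, which is what imposes the ordering on the neurons.

```python
# Minimal sketch of an ON-LSTM cell, assuming the update equations from the
# paper; names like OnLSTMCell are hypothetical, and the chunk-size
# downsampling of the master gates used in the paper is omitted for clarity.
import torch
import torch.nn as nn
import torch.nn.functional as F

def cumax(x, dim=-1):
    """Cumulative softmax: a differentiable relaxation of a binary gate
    vector of the form (0, ..., 0, 1, ..., 1)."""
    return torch.cumsum(F.softmax(x, dim=dim), dim=dim)

class OnLSTMCell(nn.Module):
    def __init__(self, input_size, hidden_size):
        super().__init__()
        # 4 standard LSTM gates + 2 master gates, all from one projection.
        self.linear = nn.Linear(input_size + hidden_size, 6 * hidden_size)
        self.hidden_size = hidden_size

    def forward(self, x, state):
        h_prev, c_prev = state
        gates = self.linear(torch.cat([x, h_prev], dim=-1))
        i, f, o, g, mf, mi = gates.chunk(6, dim=-1)

        i, f, o = torch.sigmoid(i), torch.sigmoid(f), torch.sigmoid(o)
        g = torch.tanh(g)

        # Master gates impose the hierarchy: master_f rises from ~0 to ~1
        # along the neuron ordering, master_i falls from ~1 to ~0, so
        # low-ranked neurons are overwritten every step (short-term
        # information) while high-ranked neurons are copied forward
        # (long-term information).
        master_f = cumax(mf)          # ~ (0, ..., 0, 1, ..., 1)
        master_i = 1 - cumax(mi)      # ~ (1, ..., 1, 0, ..., 0)

        overlap = master_f * master_i
        f_hat = f * overlap + (master_f - overlap)
        i_hat = i * overlap + (master_i - overlap)

        c = f_hat * c_prev + i_hat * g
        h = o * torch.tanh(c)
        return h, c
```

Using cumax rather than a hard cutoff keeps the split point between updated and copied neurons differentiable, so the induced hierarchy can be learned end-to-end with ordinary backpropagation.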

Science

Published: 5 Jul 2024

Comments: 1
@DistortedV12 · 4 years ago:
fantastic achievement