## Matching Structure for Dual Learning

### Hao Fei · Shengqiong Wu · Yafeng Ren · Meishan Zhang

##### Hall E #210

Keywords: [ MISC: Representation Learning ] [ MISC: Supervised Learning ] [ MISC: Transfer, Multitask and Meta-learning ] [ DL: Other Representation Learning ] [ DL: Self-Supervised Learning ] [ APP: Language, Speech and Dialog ]

Wed 20 Jul 3:30 p.m. PDT — 5:30 p.m. PDT

Spotlight presentation: Applications
Wed 20 Jul 1:30 p.m. PDT — 3 p.m. PDT

Abstract: Many natural language processing (NLP) tasks appear in dual forms, which are generally solved by the dual learning technique, modeling the dualities between the coupled tasks. In this work, we propose to further enhance dual learning with structure matching that explicitly builds structural connections between the two tasks. Starting with dual text$\leftrightarrow$text generation, we perform dual syntactic-structure co-echoing of the region of interest (RoI) between the task pair, together with a syntax cross-reconstruction at the decoding side. We then extend the idea to a text$\leftrightarrow$non-text setup, aligning the syntactic and semantic structures. Over 2*14 tasks covering 5 dual learning scenarios, the proposed structure matching method significantly enhances existing dual learning, and it retrieves the key RoIs that are crucial to task performance. Beyond NLP tasks, our approach also shows great potential for facilitating non-text$\leftrightarrow$non-text scenarios.
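The core idea the abstract builds on — dual learning couples a primal task with its inverse and trains both under a cycle-consistency term — can be sketched with toy models. This is a minimal illustration only, not the authors' implementation: the models, the 0/1 token loss, and the weight `lam` are all hypothetical stand-ins, and the paper's structure-matching component is omitted.

```python
# Toy dual-learning objective: primal model f: X -> Y, dual model g: Y -> X,
# with a cycle-reconstruction penalty encouraging g(f(x)) ~ x and f(g(y)) ~ y.

def f(x):
    # hypothetical primal model: a trivially invertible "translation"
    return list(reversed(x))

def g(y):
    # hypothetical dual model: the inverse direction
    return list(reversed(y))

def task_loss(pred, gold):
    # token-level 0/1 disagreement, a stand-in for cross-entropy
    return sum(p != q for p, q in zip(pred, gold)) / max(len(gold), 1)

def dual_loss(x, y, lam=1.0):
    # joint objective: both directions' task losses plus the cycle term
    primal = task_loss(f(x), y)
    dual = task_loss(g(y), x)
    cycle = task_loss(g(f(x)), x) + task_loss(f(g(y)), y)
    return primal + dual + lam * cycle

x = ["a", "b", "c"]
y = ["c", "b", "a"]
print(dual_loss(x, y))  # 0.0: the toy models are exact inverses
```

In a real system, `f` and `g` would be neural seq2seq models trained jointly, and the paper's contribution adds structural (syntactic/semantic) matching on top of this cycle objective rather than relying on surface-level reconstruction alone.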
