Learning to Remember More with Less Memorization

ICLR 2019 (Oral) paper - slides

Posted by Jexus on January 7, 2019



Paper Link

TL;DR

Memory-augmented neural networks such as the Neural Turing Machine (NTM) and the Differentiable Neural Computer (DNC) look fancy, but they often take forever to train, and their performance is not always better. By regularizing how often the model writes to and reads from memory, this paper reduces training time while actually improving performance, mainly because it cuts out the model's unnecessary reads and writes.
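A minimal sketch of the regularized-writing idea: instead of writing to memory at every timestep, write only at fixed intervals spread uniformly over the sequence. The schedule formula, function names, and the toy recurrence below are my own illustration under that assumption, not the paper's implementation.

```python
import math
import random

def uniform_write_schedule(T, D):
    """For a sequence of length T and a budget of roughly D writes,
    write only every ceil(T / (D + 1)) timesteps instead of every step."""
    interval = math.ceil(T / (D + 1))
    return [t for t in range(T) if (t + 1) % interval == 0]

def run(T=12, D=3, hidden=8, seed=0):
    """Toy memory-augmented recurrence: read at every step,
    but write only on the scheduled timesteps."""
    rng = random.Random(seed)
    memory = [[0.0] * hidden for _ in range(D)]
    schedule = set(uniform_write_schedule(T, D))
    h = [0.0] * hidden
    writes = 0
    for t in range(T):
        x = [rng.gauss(0, 1) for _ in range(hidden)]
        # cheap stand-in for an attention-based memory read
        read = [sum(col) / D for col in zip(*memory)]
        h = [math.tanh(a + b + c) for a, b, c in zip(x, read, h)]
        if t in schedule:          # regularized, infrequent writes
            memory[writes % D] = h
            writes += 1
    return writes

print(uniform_write_schedule(12, 3))  # [2, 5, 8, 11]: 4 writes instead of 12
```

A model that writes every step would perform T writes per sequence; the schedule above performs only about D of them, which is the source of the training-time savings described in the TL;DR.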

Slide:

Please allow a moment for the embedded frame to load. It is easier to read on a computer screen.