Learning to Remember More with Less Memorization
TL;DR
Memory-augmented neural networks such as the Neural Turing Machine (NTM) and the Differentiable Neural Computer (DNC) look impressive, but they often take very long to train (sometimes seemingly forever), and their performance is not always better. This paper regularizes how often the model writes to memory, which reduces training time while actually improving performance, mainly because it eliminates the model's wasteful memory writes.
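The core idea of regularized writing can be illustrated with a minimal sketch: instead of writing to memory at every timestep, the model writes only at fixed intervals, so a sequence of T steps needs roughly T / interval writes. The function name, the slot-cycling scheme, and the averaging write rule below are illustrative assumptions, not the paper's exact parameterization.

```python
import numpy as np

def uniform_writing(inputs, num_slots=4, write_interval=3):
    """Sketch of interval-based (uniform) memory writing.

    Writes happen only every `write_interval` timesteps, cutting the
    number of write operations from T down to about T / write_interval,
    while reads remain unrestricted. Illustrative only.
    """
    T, d = inputs.shape
    memory = np.zeros((num_slots, d))
    writes = 0
    slot = 0
    for t in range(T):
        if (t + 1) % write_interval == 0:
            # Summarize the last `write_interval` inputs into one slot
            # (a simple mean, standing in for a learned write head).
            start = t + 1 - write_interval
            memory[slot % num_slots] = inputs[start:t + 1].mean(axis=0)
            slot += 1
            writes += 1
    return memory, writes

mem, n_writes = uniform_writing(np.ones((12, 2)), write_interval=3)
print(n_writes)  # 12 timesteps at interval 3 -> 4 writes
```

With 12 timesteps and an interval of 3, the model performs only 4 writes instead of 12, which is the kind of saving that shortens training.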
Slide:
The embedded frame may take a moment to load. It is easier to read on a computer screen.