What are the newest techniques for improving the performance of NLP/NLU transformer models?

Owen L

Replies

Pascal de Buren
ML engineer, maker and climber of things
I personally love the ideas around adding an external memory to transformers, since it's a simple way to add new vocabulary without retraining a 100bn+ parameter model. I found this paper by Google on Memorizing Transformers especially insightful: https://arxiv.org/abs/2203.08913
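The core idea in that paper is to let attention heads retrieve from a large cache of past (key, value) pairs via approximate k-nearest-neighbor lookup, alongside normal local attention. Here's a toy, single-head numpy sketch of that mechanism; the function name, the unbatched shapes, and the exact brute-force retrieval are my simplifications, not the paper's implementation (which uses approximate kNN and a learned gate to mix local and memory attention).

```python
import numpy as np

def knn_memory_attention(q, local_k, local_v, mem_k, mem_v, top_k=4):
    """Toy sketch of kNN-augmented attention (Memorizing Transformers style):
    each query attends over its local context plus the top-k nearest
    (key, value) pairs retrieved from an external memory cache.

    q:        (n_q, d)   queries
    local_k:  (n_l, d)   keys from the current context window
    local_v:  (n_l, d)   values from the current context window
    mem_k:    (n_m, d)   cached keys from earlier text (the external memory)
    mem_v:    (n_m, d)   cached values from earlier text
    """
    d = q.shape[-1]
    # Brute-force retrieval: dot-product similarity of each query
    # against every memory key (the paper uses approximate kNN at scale).
    sims = q @ mem_k.T                            # (n_q, n_m)
    idx = np.argsort(-sims, axis=-1)[:, :top_k]   # (n_q, top_k)

    out = np.empty_like(q)
    for i, query in enumerate(q):
        # Attend jointly over local keys/values and this query's
        # retrieved memory entries.
        k = np.concatenate([local_k, mem_k[idx[i]]], axis=0)
        v = np.concatenate([local_v, mem_v[idx[i]]], axis=0)
        scores = (query @ k.T) / np.sqrt(d)
        weights = np.exp(scores - scores.max())   # stable softmax
        weights /= weights.sum()
        out[i] = weights @ v
    return out
```

Because the memory is just a cache of keys and values, new text can be added to it at inference time without touching the model weights, which is what makes the approach attractive for injecting fresh vocabulary.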