With the addition of dynamic memory access and storage mechanisms, we present a neural architecture that serves as a language-agnostic text normalization system while avoiding the kind of unacceptable errors made by LSTM-based recurrent neural networks. Our proposed system requires significantly less data, training time, and compute.
Subhojeet Pramanik, Aman Hussain
Speech Communication (EURASIP & ISCA), Elsevier, 2018