Multi-representation ensembles and delayed SGD updates improve syntax-based NMT

Danielle Saunders, Felix Stahlberg, Adrià de Gispert, Bill Byrne

We explore strategies for incorporating target syntax into Neural Machine Translation, focusing on ensembles whose component models use different target sentence representations. We formulate beam search over such ensembles using weighted finite-state transducers (WFSTs), and describe a delayed SGD update training procedure that is especially effective for long representations such as linearized syntax. Our approach gives state-of-the-art performance on a difficult Japanese-English task.
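A delayed SGD update amounts to accumulating gradients over several minibatches before applying a single parameter update, which keeps the effective batch size stable when long sequences (such as linearized syntax trees) force small per-batch sentence counts. The following is a minimal illustrative sketch, not the paper's implementation; the names `delay`, `lr`, and `grad_fn` are hypothetical.

```python
def delayed_sgd_step(params, minibatches, grad_fn, lr=0.1, delay=4):
    """Accumulate gradients over `delay` minibatches, then update once.

    A hypothetical sketch of a delayed SGD update: each minibatch
    contributes a gradient, and the parameters move only after all
    `delay` gradients have been averaged.
    """
    acc = [0.0] * len(params)
    for batch in minibatches[:delay]:
        g = grad_fn(params, batch)
        acc = [a + gi for a, gi in zip(acc, g)]
    # One SGD step with the averaged gradient.
    return [p - lr * (a / delay) for p, a in zip(params, acc)]


# Toy usage: minimize f(w) = (w - 3)^2, where each "minibatch" is the
# scalar target 3.0 and the gradient of the loss is 2 * (w - target).
def toy_grad(params, batch):
    (w,) = params
    return [2.0 * (w - batch)]


params = [0.0]
for _ in range(50):
    params = delayed_sgd_step(params, [3.0, 3.0, 3.0, 3.0], toy_grad)
print(round(params[0], 3))  # converges toward 3.0
```

In practice this is typically done inside a deep learning framework by summing gradients across forward/backward passes and calling the optimizer step once per accumulation cycle; the pure-Python version above only illustrates the update schedule.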