Publications

If you use SGNMT in your work, please cite the following paper:

Felix Stahlberg, Eva Hasler, Danielle Saunders, and Bill Byrne. SGNMT - A Flexible NMT Decoding Platform for Quick Prototyping of New Models and Search Strategies. In Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing (EMNLP 17 Demo Session), September 2017. Copenhagen, Denmark. arXiv/1707.06885

Additionally, SGNMT has been used in the following publications:

  • Felix Stahlberg, Danielle Saunders, and Bill Byrne. An operation sequence model for explainable neural machine translation. In Proceedings of the EMNLP 2018 BlackboxNLP workshop, November 2018. Brussels, Belgium. arXiv/1808.09688
  • Felix Stahlberg, Adrià de Gispert, and Bill Byrne. The University of Cambridge's machine translation systems for WMT18. In Proceedings of the 3rd conference on machine translation (WMT 18), November 2018. Brussels, Belgium. arXiv/1808.09465
  • Danielle Saunders, Felix Stahlberg, Adrià de Gispert, and Bill Byrne. Multi-representation ensembles and delayed SGD updates improve syntax-based NMT. In Proceedings of the 56th annual meeting of the Association for Computational Linguistics (ACL 18), July 2018. Melbourne, Australia. arXiv/1805.00456
  • Gregory Kell. Overcoming catastrophic forgetting in neural machine translation. MPhil dissertation, University of Cambridge, 2018.
  • Zhiwei Wang. Simultaneous neural machine translation. 4th year project, University of Cambridge, 2018.
  • Felix Stahlberg, Danielle Saunders, Gonzalo Iglesias, and Bill Byrne. Why not be versatile? Applications of the SGNMT decoder for machine translation. In Proceedings of the 13th biennial conference by the Association for Machine Translation in the Americas (AMTA 2018), March 2018. Boston, USA. arXiv/1803.07204
  • Felix Stahlberg and Bill Byrne. Unfolding and shrinking neural machine translation ensembles. In Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing (EMNLP 17), September 2017. Copenhagen, Denmark. arXiv/1704.03279
  • Eva Hasler, Felix Stahlberg, Marcus Tomalin, Adrià de Gispert, and Bill Byrne. A Comparison of Neural Models for Word Ordering. In Proceedings of the 10th International Conference on Natural Language Generation (INLG 2017), September 2017. Santiago de Compostela, Spain. arXiv/1708.01809
  • Felix Stahlberg, Adrià de Gispert, Eva Hasler, and Bill Byrne. Neural machine translation by minimising the Bayes-risk with respect to syntactic translation lattices. In Proceedings of the 15th annual meeting of the European Chapter of the Association for Computational Linguistics (EACL 17), April 2017. Valencia, Spain. arXiv/1612.03791
  • Eva Hasler, Adrià de Gispert, Felix Stahlberg, Aurelien Waite, and Bill Byrne. Source sentence simplification for statistical machine translation. In Computer Speech & Language.
  • Marcin Tomczak. Bachbot. MPhil dissertation, University of Cambridge, 2016.
  • Jiameng Gao. Variable length word encodings for neural translation models. MPhil dissertation, University of Cambridge, 2016.
  • Felix Stahlberg, Eva Hasler, and Bill Byrne. The edit distance transducer in action: the University of Cambridge English-German system at WMT16. In Proceedings of the 1st conference on machine translation (WMT 16), August 2016. Berlin, Germany. arXiv/1606.04963
  • Felix Stahlberg, Eva Hasler, Aurelien Waite, and Bill Byrne. Syntactically guided neural machine translation. In Proceedings of the 54th annual meeting of the Association for Computational Linguistics (ACL 16), August 2016. Berlin, Germany. arXiv/1605.04569