A Review of Machine Translation: Implications to Human Translators and Translation Teaching
Machine translation has developed greatly in recent decades, and we have now entered the era of neural machine translation (NMT). A review of MT is therefore necessary for a better understanding of the relationship between MT, human translators, and translation teaching in an era in which MT has flourished. This paper first outlines the development of machine translation (MT) over the past decades, focusing on the features, applications, and drawbacks of its main paradigms: rule-based machine translation (RBMT), corpus-based machine translation (CBMT), and, within NMT, the long short-term memory (LSTM) architecture. It then discusses what MT means for human translators and for translation teaching in universities. It concludes that MT should not and cannot replace human translators, who will remain vital in certain fields and aspects; only a sound integration of the two, with post-editing by human translators, can ensure output that satisfies an increasingly demanding market. This implies that translation teaching in universities should incorporate MT knowledge.
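For readers outside computer science, the LSTM architecture named above can be made concrete. An LSTM cell passes a memory state through time and regulates it with learned gates, which is what allows NMT systems to retain context across long sentences. The equations below are a standard textbook formulation of the cell update (following Hochreiter and Schmidhuber's 1997 design; they are not reproduced from this paper, and notation varies across sources):

```latex
\begin{aligned}
f_t &= \sigma(W_f x_t + U_f h_{t-1} + b_f) && \text{(forget gate: what to discard from memory)} \\
i_t &= \sigma(W_i x_t + U_i h_{t-1} + b_i) && \text{(input gate: what new information to admit)} \\
o_t &= \sigma(W_o x_t + U_o h_{t-1} + b_o) && \text{(output gate: what to expose at this step)} \\
\tilde{c}_t &= \tanh(W_c x_t + U_c h_{t-1} + b_c) && \text{(candidate memory content)} \\
c_t &= f_t \odot c_{t-1} + i_t \odot \tilde{c}_t && \text{(cell state update)} \\
h_t &= o_t \odot \tanh(c_t) && \text{(hidden state, fed to the next step)}
\end{aligned}
```

Here $x_t$ is the input at step $t$ (e.g., a word embedding), $\sigma$ is the logistic sigmoid, and $\odot$ denotes element-wise multiplication. Because the cell state $c_t$ is carried forward largely unchanged unless the gates intervene, gradients survive over long spans, which mitigates the vanishing-gradient problem that limited earlier recurrent networks on long sentences.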
How to cite this paper: Xiaoping Shen. (2022). A Review of Machine Translation: Implications to Human Translators and Translation Teaching. The Educational Review, USA, 6(12), 869-874.
DOI: http://dx.doi.org/10.26855/er.2022.12.014