About the book Machine Translation: 16th China Conference, CCMT 2020, Hohhot, China, October 10-12, 2020, Revised Selected Papers
Book title: Machine Translation: 16th China Conference, CCMT 2020, Hohhot, China, October 10-12, 2020, Revised Selected Papers
Edition: 1st ed. 2020.
Title translated into Persian: ترجمه ماشینی شانزدهمین کنفرانس چین، CCMT 2020، هوهات، چین، 10 تا 12 اکتبر 2020، مقالات منتخب اصلاح شده
Series:
Editors: Junhui Li, Andy Way
Publisher:
Publication year: 2021
Number of pages: 154
ISBN: 9789813361614, 981336162X
Language: English
Format: PDF
File size: 11 MB
Table of contents:
Preface
Organization
Contents
Transfer Learning for Chinese-Lao Neural Machine Translation with Linguistic Similarity
  Abstract
  1 Introduction
  2 Linguistic Similarity Between Thai and Lao
  3 Our Approach
    3.1 Chinese-Thai NMT Model
    3.2 Thai-Lao NMT Model
    3.3 Chinese-Lao NMT Model
  4 Evaluation
    4.1 Experimental Setup
    4.2 Experimental Results
  5 Related Work
  6 Conclusions
  Acknowledgements
  References
MTNER: A Corpus for Mongolian Tourism Named Entity Recognition
  Abstract
  1 Introduction
  2 Related Work
  3 Challenge for Mongolian Tourism NER
  4 Annotated Mongolian Tourism Corpus
    4.1 Data Collection
    4.2 Annotation Schema
    4.3 Annotation Agreement
  5 Mongolian Tourism NER Model
  6 Experiment
    6.1 Data
    6.2 Baselines
    6.3 Results
    6.4 Analysis
  7 Conclusion
  Acknowledgement
  References
Unsupervised Machine Translation Quality Estimation in Black-Box Setting
  Abstract
  1 Introduction
  2 Background
    2.1 Machine Translation Evaluation
    2.2 Machine Translation Quality Estimation
  3 Model Description
    3.1 Pretrained Models for Quality Estimation
    3.2 MTE-Based QE Data
  4 Experiment
    4.1 Setup
    4.2 Experiment Results
  5 Analysis
    5.1 Is BERT Always the Best?
    5.2 Is Black-Box Model Necessary?
    5.3 Where Is the Limitation of QE?
  6 Conclusion
  Acknowledgement
  References
YuQ: A Chinese-Uyghur Medical-Domain Neural Machine Translation Dataset Towards Knowledge-Driven
  1 Introduction
  2 Related Work
  3 Datasets
    3.1 Data Collection
    3.2 Corpus Preprocessing
    3.3 Annotation
    3.4 Knowledge Graph Construction
  4 Corpus Analysis
    4.1 Lexical Feature Analysis
    4.2 Contrastive Analysis of Lexical Features
  5 Experiments
    5.1 Models
    5.2 Setup
    5.3 Automatic Evaluation
    5.4 Manual Evaluation
    5.5 Metrics
    5.6 Annotation Statistics
    5.7 Results
    5.8 Case Study
    5.9 Ablation Study
  6 Conclusion and Future Work
  References
Quality Estimation for Machine Translation with Multi-granularity Interaction
  1 Introduction
  2 Related Work
  3 Methodology
    3.1 Model Architecture
    3.2 Multi-granularity Interaction
    3.3 Model Training
  4 Experiments
    4.1 Dataset
    4.2 Experimental Setup
    4.3 Experimental Result
    4.4 Word-Level Feature Analysis
  5 Conclusion
  References
Transformer-Based Unified Neural Network for Quality Estimation and Transformer-Based Re-decoding Model for Machine Translation
  1 Introduction
  2 Model
    2.1 Transformer-Based Unified Neural Network for the Quality Estimation of Machine Translation
    2.2 Study of Re-decoding-Based Neural Machine Translation
  3 Experiment
    3.1 Setting
    3.2 Results
    3.3 Analysis
  4 Conclusions
  References
NJUNLP's Machine Translation System for CCMT-2020 Uighur Chinese Translation Task
  1 Introduction
  2 Machine Translation System
    2.1 Pre-processing
    2.2 Architecture
    2.3 Back-Translation of Monolingual Data
    2.4 Fine-Tuning
    2.5 Ensemble Translation
    2.6 Reranking
  3 Results
  4 Conclusion
  References
Description and Findings of OPPO's Machine Translation Systems for CCMT 2020
  1 Introduction
  2 Applying Multiple Word Segmentation Tools
  3 English Chinese Machine Translation Task
    3.1 Data Preprocessing
    3.2 Model Training
    3.3 Corpus Filtering Task
  4 Japanese English Translation Task (Patent Domain)
    4.1 Data Preprocessing
    4.2 Model Training
  5 Minority Languages Mandarin Translation Task
    5.1 Data Preprocessing
    5.2 Model Training
  6 Conclusion and Future Work
  References
Tsinghua University Neural Machine Translation Systems for CCMT 2020
  1 Introduction
  2 Methods
    2.1 Data
    2.2 Models
    2.3 Data Augmentation
    2.4 Finetuning
    2.5 Ensemble
  3 Experiments
    3.1 Settings
    3.2 Results on Chinese-English Translation
    3.3 Results on English-Chinese Translation
  4 Conclusion
  References
BJTU's Submission to CCMT 2020 Quality Estimation Task
  Abstract
  1 Introduction
  2 Model Description
    2.1 Pretrained Models for Quality Estimation
    2.2 Further Pretraining for Bilingual Input
    2.3 Multi-task Learning for Multi-granularities
    2.4 Weighted Loss for Unbalanced Word Labels
    2.5 Multi-model Ensemble
  3 Experiment
    3.1 Dataset
    3.2 Experiment Results
    3.3 Ablation Study
  4 Conclusion
  Acknowledgement
  References
NJUNLP's Submission for CCMT20 Quality Estimation Task
  1 Introduction
  2 Methods
    2.1 Existing Methods
    2.2 Proposed Methods
  3 Experiments
    3.1 Dataset
    3.2 Settings
    3.3 Single Model Results
    3.4 Ensemble
  4 Analysis
  5 Conclusion
  References
Tencent Submissions for the CCMT 2020 Quality Estimation Task
  1 Introduction
  2 Architecture
    2.1 Predictors
    2.2 Estimators
    2.3 Ensemble
  3 Experiments and Results
    3.1 Dataset
    3.2 Experiments
  4 Conclusion
  References
Neural Machine Translation Based on Back-Translation for Multilingual Translation Evaluation Task
  Abstract
  1 Introduction
  2 Related Work
  3 Model
    3.1 Transformer-Base
    3.2 Transformer-Big
    3.3 Dynamic-Conv
  4 Experiments
    4.1 Preprocessing
    4.2 Back Translation Based Synthetic Data
    4.3 Multi-model Ensemble
    4.4 Contrast Experiments
    4.5 Results
    4.6 Model Analysis and Discussion
  5 Conclusion and Future Work
  Acknowledgement
  References
Author Index