Name: YIMING CUI
Papers: 32
Collaborators: 86
Citations: 87
PageRank: 13.40
Referrers: 322
Referees: 396
References: 167
Title | Citations | PageRank | Year
CINO: A Chinese Minority Pre-trained Language Model. | 0 | 0.34 | 2022
Interactive Gated Decoder for Machine Reading Comprehension | 0 | 0.34 | 2022
TextPruner: A Model Pruning Toolkit for Pre-Trained Language Models | 0 | 0.34 | 2022
HFL at SemEval-2022 Task 8: A Linguistics-inspired Regression Model with Data Augmentation for Multilingual News Similarity. | 0 | 0.34 | 2022
Teaching Machines to Read, Answer and Explain | 0 | 0.34 | 2022
HIT at SemEval-2022 Task 2: Pre-trained Language Model for Idioms Detection. | 0 | 0.34 | 2022
Benchmarking Robustness of Machine Reading Comprehension Models. | 0 | 0.34 | 2021
Pre-Training With Whole Word Masking for Chinese BERT | 4 | 0.46 | 2021
Adversarial Training For Machine Reading Comprehension With Virtual Embeddings | 0 | 0.34 | 2021
Revisiting Pre-Trained Models for Chinese Natural Language Processing | 0 | 0.34 | 2020
Textbrewer: An Open-Source Knowledge Distillation Toolkit For Natural Language Processing | 0 | 0.34 | 2020
CLUE - A Chinese Language Understanding Evaluation Benchmark. | 0 | 0.34 | 2020
Recall and Learn: Fine-tuning Deep Pretrained Language Models with Less Forgetting. | 0 | 0.34 | 2020
CharBERT: Character-aware Pre-trained Language Model | 0 | 0.34 | 2020
Is Graph Structure Necessary for Multi-hop Question Answering? | 0 | 0.34 | 2020
Discriminative Sentence Modeling For Story Ending Prediction | 0 | 0.34 | 2020
A Sentence Cloze Dataset for Chinese Machine Reading Comprehension | 0 | 0.34 | 2020
TripleNet: Triple Attention Network for Multi-Turn Response Selection in Retrieval-Based Chatbots | 0 | 0.34 | 2019
Exploiting Persona Information for Diverse Generation of Conversational Responses. | 4 | 0.41 | 2019
A Span-Extraction Dataset for Chinese Machine Reading Comprehension | 1 | 0.39 | 2019
Context-Sensitive Generation of Open-Domain Conversational Responses. | 1 | 0.35 | 2018
Dataset for the First Evaluation on Chinese Machine Reading Comprehension. | 1 | 0.35 | 2018
Convolutional Spatial Attention Model for Reading Comprehension with Multiple-Choice Questions | 5 | 0.49 | 2018
HFL-RC System at SemEval-2018 Task 11: Hybrid Multi-Aspects Model for Commonsense Reading Comprehension. | 3 | 0.40 | 2018
Attention-Over-Attention Neural Networks For Reading Comprehension | 46 | 1.64 | 2017
Consensus Attention-based Neural Networks for Chinese Reading Comprehension. | 17 | 1.01 | 2016
LSTM Neural Reordering Feature for Statistical Machine Translation | 4 | 0.44 | 2015
Context-extended phrase reordering model for pivot-based statistical machine translation | 0 | 0.34 | 2015
Augmenting Phrase Table by Employing Lexicons for Pivot-based SMT | 0 | 0.34 | 2015
The USTC machine translation system for IWSLT 2014. | 0 | 0.34 | 2014
Phrase Table Combination Deficiency Analyses in Pivot-Based SMT. | 1 | 0.36 | 2013
The HIT-LTRC machine translation system for IWSLT 2012. | 0 | 0.34 | 2012