Title
Abstractive Text Classification Using Sequence-to-convolution Neural Networks.
Abstract
We propose a new deep neural network model and its training scheme for text classification. Our model, Sequence-to-convolution Neural Networks (Seq2CNN), consists of two blocks: a Sequential Block that summarizes input texts and a Convolution Block that receives the summary and classifies it into a label. Seq2CNN is trained end-to-end to classify variable-length texts without preprocessing inputs into a fixed length. We also present the Gradual Weight Shift (GWS) method, which stabilizes training; GWS is applied to our model's loss function. We compared our model with word-based TextCNN trained with different data preprocessing methods, and obtained a significant improvement in classification accuracy over word-based TextCNN without any ensemble or data augmentation.
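The abstract states only that GWS gradually shifts weights in the loss function to stabilize training; the exact formula is not given here. A minimal sketch of one plausible reading, assuming a linear schedule that moves emphasis from an auxiliary (summarization) loss toward the classification loss over training (the function names, the linear schedule, and the two-term loss are all assumptions, not the paper's specification):

```python
def gws_weight(step: int, total_steps: int) -> float:
    """Hypothetical linear schedule: weight rises from 0 to 1 over training."""
    return min(1.0, step / total_steps)

def combined_loss(cls_loss: float, aux_loss: float,
                  step: int, total_steps: int) -> float:
    """Hypothetical GWS-style loss: early steps emphasize the auxiliary loss,
    later steps emphasize the classification loss."""
    w = gws_weight(step, total_steps)
    return w * cls_loss + (1.0 - w) * aux_loss
```

For example, at step 0 the loss is entirely the auxiliary term, at the halfway point the two terms are weighted equally, and after `total_steps` the loss is purely the classification term.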
Year
2018
Venue
arXiv: Computation and Language
Field
Pattern recognition, Computer science, Convolution, Data pre-processing, Preprocessor, Artificial intelligence, Artificial neural network, Machine learning
DocType
Volume
abs/1805.07745
Citations
0
Journal
PageRank
0.34
References
12
Authors
2
Name         Order  Citations / PageRank
Taehoon Kim  1      437.51
Jihoon Yang  2      1438.59