Abstract |
---|
We are living in an era where the volume of data produced through online media platforms is increasing at an unprecedented rate. This colossal amount of data is multifarious in nature, with textual data as one of its vital pillars. Almost every kind of online media platform produces textual data, a significant part of which consists of short posts and comments (e.g., on Twitter and Facebook). Unfortunately, this text may contain overlapping toxic sentiments such as personal attacks, abuse, obscenity, insults, threats, or identity hatred. In many cases, it becomes extremely important to track such toxic posts in order to trigger appropriate actions, e.g., automated tagging of posts as inappropriate. State-of-the-art classification techniques do not handle overlapping sentiment categories in text data. In this paper, we propose Deep Neural Network (DNN) architectures that classify overlapping sentiments with high accuracy. Moreover, we show that our proposed classification framework does not require any laborious text pre-processing and handles it (e.g., stop-word removal, feature engineering) intrinsically. Our empirical validation on a real-world dataset supports these claims by showing the superior performance of the proposed methods. |
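The keywords below list Focal Loss as the training objective for this multi-label (overlapping-category) toxic-comment setting. As a minimal NumPy sketch, binary focal loss can be computed per label and averaged; the six-label layout and the values `gamma=2.0`, `alpha=0.25` are illustrative assumptions (the common defaults), not settings taken from the paper:

```python
import numpy as np

def focal_loss(y_true, y_pred, gamma=2.0, alpha=0.25, eps=1e-7):
    """Binary focal loss, averaged over all labels.

    The modulating factor (1 - p_t)**gamma down-weights easy,
    well-classified labels so training focuses on hard examples.
    gamma and alpha are illustrative defaults, not the paper's values.
    """
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.clip(np.asarray(y_pred, dtype=float), eps, 1 - eps)
    # p_t: predicted probability assigned to the true class of each label
    p_t = np.where(y_true == 1, y_pred, 1 - y_pred)
    # alpha_t: class-balancing weight (alpha for positives, 1-alpha otherwise)
    alpha_t = np.where(y_true == 1, alpha, 1 - alpha)
    return float(np.mean(-alpha_t * (1 - p_t) ** gamma * np.log(p_t)))

# One comment with six hypothetical toxicity labels; categories may overlap,
# e.g. "toxic" and "insult" active at the same time.
y_true    = [1,   0,   1,   0,   0,   0]
confident = [0.9, 0.1, 0.9, 0.1, 0.1, 0.1]   # mostly correct predictions
wrong     = [0.1, 0.9, 0.1, 0.9, 0.9, 0.9]   # mostly incorrect predictions

loss_good = focal_loss(y_true, confident)
loss_bad = focal_loss(y_true, wrong)
```

Because hard, misclassified labels dominate the loss, this objective is a natural fit for toxic-comment data, where most comments are clean and the toxic categories are rare and imbalanced.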
Year | DOI | Venue |
---|---|---|
2018 | 10.1109/ICDMW.2018.00193 | 2018 18th IEEE International Conference on Data Mining Workshops (ICDMW) |
Keywords | Field | DocType |
---|---|---|
Toxic comments, Focal Loss, Text Preprocessing, CNN, Bi-GRU, Bi-LSTM | Kernel (linear algebra), Information retrieval, Computer science, Sort, Feature engineering, Artificial intelligence, Artificial neural network, Digital media, Stop words, Machine learning, Hatred, Encoding (memory) | Conference |
ISSN | Citations | PageRank |
---|---|---|
2375-9232 | 1 | 0.35 |
References | Authors |
---|---|
0 | 3 |
Name | Order | Citations | PageRank |
---|---|---|---|
Hafiz Hassaan Saeed | 1 | 1 | 0.35 |
Khurram Shahzad | 2 | 165 | 25.77 |
Faisal Kamiran | 3 | 229 | 20.56 |