Title
Dimensionality Reduction Method's Comparison Based on Statistical Dependencies
Abstract
The field of machine learning encompasses a huge number of algorithms that transform observed data into many forms, and dimensionality reduction (DR) is one such transformation. There are many high-quality papers that compare some of the DR approaches, and of course other experiments that apply them with success. Few, however, focus on the information lost, the increase of relevance, or the decrease of uncertainty during the transformation; these are hard to estimate, and only a few studies remark on them briefly. This study aims to explain these inner features of four different DR algorithms. The algorithms were not chosen randomly but on purpose: a representative was chosen from each of the major DR groups. The comparison criteria are based on statistical dependencies, such as the correlation coefficient, Euclidean distance, mutual information, and Granger causality. The winning algorithm should reasonably transform the input dataset while keeping most of the inner dependencies. (C) 2016 The Authors. Published by Elsevier B.V.
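The abstract does not give implementation details. As a minimal sketch of the kind of comparison it describes, assuming Python with NumPy and scikit-learn, PCA standing in for one representative DR method, and a synthetic dataset, one could measure a statistical dependency (correlation, mutual information) between a target variable and the features before and after reduction; every name and parameter below is an illustrative assumption, not the authors' code.

```python
# Hedged sketch: does a DR method preserve statistical dependencies?
# Assumptions: PCA as the example reducer, correlation coefficient and
# mutual information as two of the paper's criteria, synthetic data.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.feature_selection import mutual_info_regression

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 10))                              # placeholder dataset
y = X @ rng.normal(size=10) + 0.1 * rng.normal(size=500)    # dependent target

Z = PCA(n_components=3).fit_transform(X)                    # reduced representation

# Strongest linear dependency between the target and any original feature
corr_orig = max(abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(X.shape[1]))
# Strongest linear dependency between the target and any reduced component
corr_red = max(abs(np.corrcoef(Z[:, j], y)[0, 1]) for j in range(Z.shape[1]))

# Same idea with a nonlinear dependency measure (mutual information)
mi_orig = mutual_info_regression(X, y).max()
mi_red = mutual_info_regression(Z, y).max()

print(f"max |corr|  original: {corr_orig:.3f}  reduced: {corr_red:.3f}")
print(f"max MI      original: {mi_orig:.3f}  reduced: {mi_red:.3f}")
```

A DR method that "keeps the inner dependencies" in the paper's sense would show reduced-space scores close to the original-space ones; the Granger-causality criterion would be applied analogously on time-series data.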
Year
2016
DOI
10.1016/j.procs.2016.04.218
Venue
7TH INTERNATIONAL CONFERENCE ON AMBIENT SYSTEMS, NETWORKS AND TECHNOLOGIES (ANT 2016) / THE 6TH INTERNATIONAL CONFERENCE ON SUSTAINABLE ENERGY INFORMATION TECHNOLOGY (SEIT-2016) / AFFILIATED WORKSHOPS
Keywords
Principal Component Analysis, Non-negative Matrix Factorization, Autoencoder, Neighborhood Preserving Embedding, Granger Causality, Mutual Information
Field
Data mining, Correlation coefficient, Dimensionality reduction, Computer science, Granger causality, Euclidean distance, Mutual information, Artificial intelligence, Machine learning
DocType
Conference
Volume
83
ISSN
1877-0509
Citations
0
PageRank
0.34
References
5
Authors
3
Name, Order, Citations, PageRank
Tomas Vantuch, 1, 0, 3.38
Václav Snasel, 2, 12612, 10.53
Ivan Zelinka, 3, 4518, 2.16