Title
kNN estimation of the unilateral dependency measure between random variables
Abstract
The informational energy (IE) can be interpreted as a measure of average certainty. In previous work, we introduced a non-parametric, asymptotically unbiased, and consistent estimator of the IE based on the kth nearest neighbor (kNN) method; it applies to both continuous and discrete spaces, so it can be used in both classification and regression algorithms. Building on the IE, we introduced a unilateral dependency measure between random variables. In the present paper, we show how to estimate this unilateral dependency measure from an available sample set of discrete or continuous variables, using the kNN and naïve histogram estimators, and we compare the two estimators experimentally. Finally, in a real-world application, we apply the kNN and histogram estimators to approximate the unilateral dependency between random variables describing the temperatures of sensors placed in a refrigerating room.
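As a rough illustration of the two estimators mentioned in the abstract (not the authors' exact formulation), the sketch below computes a naïve histogram plug-in estimate of the discrete informational energy IE(X) = sum_i p_i^2, a kNN plug-in estimate of the continuous IE(X) = integral p(x)^2 dx using the standard (k-1)/((n-1) V_d r_k^d) density weight, and a histogram-based unilateral dependency under the assumed form o(X, Y) = IE(Y|X) - IE(Y). All function names, the dependency formula, and the toy data are assumptions made for this sketch.

```python
# Hedged sketch: histogram and kNN plug-in estimates of the informational
# energy, plus an assumed unilateral dependency form o(X, Y) = IE(Y|X) - IE(Y).
# Not the paper's code; names and formulas are illustrative assumptions.
import numpy as np
from math import gamma, pi
from scipy.spatial import cKDTree


def ie_histogram(x, bins=20):
    """Naive histogram (plug-in) estimate of IE(X) = sum_i p_i^2."""
    counts, _ = np.histogram(x, bins=bins)
    p = counts / counts.sum()
    return np.sum(p ** 2)


def ie_knn(x, k=5):
    """kNN plug-in estimate of IE(X) = E[p(X)] for a continuous sample.

    Uses the k-th nearest neighbor distance r_k(x_i) and the usual
    (k - 1) / ((n - 1) * V_d * r_k^d) weight, where V_d is the volume
    of the d-dimensional unit ball (assumed estimator form).
    """
    x = np.asarray(x, dtype=float)
    if x.ndim == 1:
        x = x[:, None]
    n, d = x.shape
    v_d = pi ** (d / 2) / gamma(d / 2 + 1)           # unit-ball volume
    r_k = cKDTree(x).query(x, k=k + 1)[0][:, -1]     # k-th neighbor, self excluded
    return np.mean((k - 1) / ((n - 1) * v_d * r_k ** d))


def unilateral_dependency_hist(x, y, bins=20):
    """Histogram estimate of o(X, Y) = IE(Y|X) - IE(Y) (assumed form)."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    p_xy = joint / joint.sum()
    p_x = p_xy.sum(axis=1, keepdims=True)            # marginal of X
    p_y = p_xy.sum(axis=0)                           # marginal of Y
    with np.errstate(divide="ignore", invalid="ignore"):
        p_y_given_x = np.where(p_x > 0, p_xy / p_x, 0.0)
    # Conditional IE: sum_x p(x) * sum_y p(y|x)^2
    ie_y_given_x = np.sum(p_x[:, 0] * np.sum(p_y_given_x ** 2, axis=1))
    return ie_y_given_x - np.sum(p_y ** 2)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.normal(size=2000)
    y = x + 0.5 * rng.normal(size=2000)              # Y depends on X
    print("IE (histogram):", ie_histogram(x))
    print("IE (kNN):      ", ie_knn(x))
    print("o(X, Y) approx:", unilateral_dependency_hist(x, y))
```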
Year
2014
DOI
10.1109/CIDM.2014.7008705
Venue
IEEE Symposium on Computational Intelligence and Data Mining (CIDM)
Keywords
pattern classification, regression analysis, IE, classification algorithms, informational energy, kth nearest neighbor, kNN estimation, kNN method, naïve histogram estimators, random variables, regression algorithms, unilateral dependency
Field
k-nearest neighbors algorithm, Histogram, Random variable, Pattern recognition, Regression, Continuous variable, Artificial intelligence, Probability density function, Machine learning, Mathematics, Estimator, Consistent estimator
DocType
Conference
Citations
1
PageRank
0.39
References
4
Authors
3
Name
Order
Citations
PageRank
Angel Cataron1123.31
Razvan Andonie211717.71
Yvonne Chueh331.53