Title: Information Theoretical Measures for Achieving Robust Learning Machines
Abstract:
Information theoretical measures are used to design, from first principles, an objective function that can drive a learning machine to a solution that is robust to perturbations in its parameters. Full analytic derivations are given and tested with computational examples, showing that the procedure is indeed successful. The final solution, implemented by a robust learning machine, expresses a balance between Shannon differential entropy and Fisher information. That this balance emerges as an analytical relation is surprising, given the purely numerical operations of the learning machine.
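The entropy-Fisher balance described in the abstract can be sketched numerically. The following is a minimal illustration, not the paper's derivation: it fits a Gaussian by minimizing average negative log-likelihood plus an empirical Fisher-information penalty (the mean squared score), with a hypothetical trade-off weight lam. Because the Fisher information of a Gaussian shrinks as its scale grows, the penalty pushes the fit toward a wider, higher-entropy solution whose likelihood surface is flatter, and hence less sensitive to parameter perturbations.

```python
# Minimal sketch (assumed setup, not the paper's algorithm): fit a Gaussian
# by minimizing negative log-likelihood plus a Fisher-information penalty.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
data = rng.normal(loc=2.0, scale=1.5, size=500)

def neg_log_likelihood(theta, x):
    """Average Gaussian negative log-likelihood; sigma = exp(log_sigma) > 0."""
    mu, log_sigma = theta
    z = (x - mu) / np.exp(log_sigma)
    return np.mean(0.5 * z**2 + log_sigma + 0.5 * np.log(2.0 * np.pi))

def fisher_penalty(theta, x):
    """Empirical Fisher information: mean squared score over the sample.

    For a Gaussian, d(log p)/d(mu) = z / sigma and
    d(log p)/d(log sigma) = z**2 - 1, where z = (x - mu) / sigma.
    """
    mu, log_sigma = theta
    sigma = np.exp(log_sigma)
    z = (x - mu) / sigma
    return np.mean((z / sigma) ** 2 + (z**2 - 1.0) ** 2)

for lam in (0.0, 0.5):  # lam is a hypothetical trade-off weight
    res = minimize(lambda th: neg_log_likelihood(th, data)
                   + lam * fisher_penalty(th, data),
                   x0=np.array([0.0, 0.0]), method="Nelder-Mead")
    mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
    print(f"lam={lam}: mu={mu_hat:.3f}, sigma={sigma_hat:.3f}")
```

Relative to the maximum-likelihood fit (lam=0.0), the penalized fit should return a slightly larger sigma: higher differential entropy in exchange for lower Fisher information, mirroring the trade-off the abstract describes.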
Year: 2016
DOI: 10.3390/e18080295
Venue: ENTROPY
Keywords: information theoretical learning, Shannon entropy, Kullback-Leibler divergence, relative entropy, cross-entropy, Fisher information, relative information
Field: Transfer entropy, Mathematical optimization, Rényi entropy, Information diagram, Shannon's source coding theorem, Joint entropy, Computational learning theory, Statistics, Entropy (information theory), Kullback–Leibler divergence, Mathematics
DocType: Journal
Volume: 18
Issue: 8
Citations: 1
PageRank: 0.35
References: 9
Authors: 4
Name             Order   Citations   PageRank
Pablo Zegers     1       10          2.18
B. Roy Frieden   2       16          3.22
Carlos Alarcón   3       3           0.76
Alexis Fuentes   4       1           0.35