Title |
---|
Advances on BYY harmony learning: information theoretic perspective, generalized projection geometry, and independent factor autodetermination. |
Abstract |
---|
The nature of Bayesian Ying-Yang (BYY) harmony learning is reexamined from an information theoretic perspective. Not only is its ability for model selection and regularization explained with new insights, but its relations to and differences from minimum description length (MDL), the Bayesian approach, bits-back based MDL, the Akaike information criterion (AIC), maximum likelihood, information geometry, Helmholtz machines, and variational approximation are also discussed. Moreover, a generalized projection geometry is introduced for further understanding this new mechanism. Furthermore, new algorithms are developed for implementing Gaussian factor analysis (FA) and non-Gaussian factor analysis (NFA) such that appropriate factors are selected automatically during parameter learning. |
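For context on the factor analysis model the abstract refers to, below is a minimal sketch of standard maximum-likelihood Gaussian FA fitted by EM. This is not the paper's BYY harmony learning algorithm (which additionally prunes redundant factors automatically during parameter learning); it only illustrates the underlying model x = Ay + e with Gaussian factors. The function name `fa_em` and all parameter choices are illustrative assumptions, not from the paper.

```python
import numpy as np

# Gaussian factor analysis model: x = A y + e,
# with factors y ~ N(0, I_m) and noise e ~ N(0, diag(psi)).
# Fitted by plain EM (Rubin & Thayer style updates), with the
# number of factors m fixed in advance -- unlike the paper's BYY
# algorithms, which determine m automatically.

def fa_em(X, m, n_iter=300, seed=0):
    """Fit loadings A (d x m) and noise variances psi (d,) by EM."""
    rng = np.random.default_rng(seed)
    X = X - X.mean(axis=0)              # center the data
    n, d = X.shape
    S = X.T @ X / n                     # sample covariance
    A = rng.standard_normal((d, m)) * 0.1
    psi = np.diag(S).copy()
    for _ in range(n_iter):
        # E-step: posterior of factors given each observation
        Psi_inv = np.diag(1.0 / psi)
        G = np.linalg.inv(np.eye(m) + A.T @ Psi_inv @ A)  # posterior cov of y
        W = G @ A.T @ Psi_inv                             # E[y|x] = W x
        Eyy = G + W @ S @ W.T                             # avg E[y y^T]
        # M-step: update loadings and noise variances
        A = S @ W.T @ np.linalg.inv(Eyy)
        psi = np.diag(S - A @ W @ S)
    return A, psi

# Usage: recover a 2-factor structure from synthetic 5-D data.
rng = np.random.default_rng(1)
A_true = rng.standard_normal((5, 2))
Y = rng.standard_normal((2000, 2))
X = Y @ A_true.T + 0.1 * rng.standard_normal((2000, 5))
A_hat, psi_hat = fa_em(X, m=2)
# The fitted model covariance A A^T + diag(psi) should approximate
# the sample covariance of X.
model_cov = A_hat @ A_hat.T + np.diag(psi_hat)
```

The loadings are identifiable only up to rotation, so one compares the implied covariance rather than `A_hat` against `A_true` directly.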
Year | DOI | Venue |
---|---|---|
2004 | 10.1109/TNN.2004.828767 | IEEE Transactions on Neural Networks |
Keywords | Field | DocType
---|---|---
BYY harmony learning, Bayesian Ying-Yang harmony learning, Bayesian Ying-Yang (BYY) system, automatic model selection, model selection, regularization, Gaussian factor analysis, non-Gaussian factor analysis, factor analysis, independent factor autodetermination, independent component analysis, principal component analysis, generalized projection geometry, projective geometry, information geometry, information theory, information theoretic perspective, minimum description length, bits-back based MDL, Akaike information criterion, Bayesian approach, Bayesian methods, Bayes' theorem, maximum likelihood estimation, Helmholtz machines, variational approximation, machine learning, artificial intelligence, algorithm design and analysis, predictive models, computer simulation | Computer science, Regularization (mathematics), Artificial intelligence, Geometry, Bayes' theorem, Information theory, Information geometry, Akaike information criterion, Pattern recognition, Minimum description length, Model selection, Machine learning, Bayesian probability | Journal
Volume | Issue | ISSN
---|---|---
15 | 4 | 1045-9227
Citations | PageRank | References
---|---|---
29 | 1.55 | 28
Authors |
---|
1 |