Title
Rotation Invariant Householder Parameterization for Bayesian PCA
Abstract
We consider probabilistic PCA and related factor models from a Bayesian perspective. These models are in general not identifiable, as the likelihood has a rotational symmetry. This gives rise to complicated posterior distributions with continuous subspaces of equal density, which hinders both the efficiency of inference and the interpretation of the obtained parameters. In particular, posterior averages over factor loadings become meaningless, and only model predictions are unambiguous. Here, we propose a parameterization based on Householder transformations, which removes the rotational symmetry of the posterior. Furthermore, relying on results from random matrix theory, we establish the parameter distribution that leaves the model unchanged compared to the original rotationally symmetric formulation. In particular, we avoid the need to compute the Jacobian determinant of the parameter transformation. This allows probabilistic PCA to be implemented efficiently in a rotation-invariant fashion in any state-of-the-art toolbox. We implement our model in the probabilistic programming language Stan and illustrate it on several examples.
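The core idea the abstract describes is parameterizing the orthonormal loading directions of PCA as a product of Householder reflections, so that unconstrained parameter vectors map to a matrix with orthonormal columns without a separate orthogonality constraint. A minimal NumPy sketch of this construction is shown below; the function name and the exact chaining of reflections are illustrative assumptions, not the authors' Stan implementation.

```python
import numpy as np

def householder_orthogonal(vs):
    """Map unconstrained vectors v_1 (in R^n), v_2 (in R^{n-1}), ...
    to an n x q matrix with orthonormal columns, where q = len(vs).

    Each v_i defines a Householder reflection H_i = I - 2 u_i u_i^T
    (u_i the normalized v_i) acting on the trailing coordinates; the
    product H_1 H_2 ... H_q is orthogonal, and its first q columns
    give the loading directions.
    """
    n = len(vs[0])
    U = np.eye(n)
    for i, v in enumerate(vs):
        u = v / np.linalg.norm(v)          # unit vector -> reflection is well defined
        H = np.eye(n)
        H[i:, i:] -= 2.0 * np.outer(u, u)  # reflect only the trailing n-i coordinates
        U = U @ H
    return U[:, :len(vs)]

# Two unconstrained vectors parameterize a 3 x 2 orthonormal frame.
vs = [np.array([1.0, 2.0, 3.0]), np.array([0.5, -1.0])]
W = householder_orthogonal(vs)
```

Because each reflection is orthogonal, `W.T @ W` equals the identity up to floating-point error for any choice of the (nonzero) input vectors, which is what makes the parameterization usable with off-the-shelf samplers.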
Year
2019
Venue
International Conference on Machine Learning
Field
Parametrization, Pattern recognition, Computer science, Artificial intelligence, Invariant (mathematics), Bayesian probability
DocType
Journal
Volume
abs/1905.04720
Citations
0
PageRank
0.34
References
0
Authors
2
Name | Order | Citations | PageRank
Rajbir-Singh Nirwan | 1 | 0 | 0.34
Nils Bertschinger | 2 | 225 | 21.10