Title
A Topological Framework for Training Latent Variable Models
Abstract
We discuss the properties of a class of latent variable models in which each labeled sample is associated with a set of different features, with no prior knowledge of which feature is the most relevant. Deformable-Part Models (DPMs) are good examples of such models. These models are usually considered expensive to train and very sensitive to initialization. In this paper, we focus on learning such models by introducing a topological framework and show how it can both reduce the learning complexity and produce more robust decision boundaries. We also argue that our framework can produce robust decision boundaries without exploiting dataset bias or relying on accurate annotations. To experimentally evaluate our method and compare it with previously published frameworks, we focus on the problem of image classification with object localization, in which the correct location of the objects is unknown during both training and testing and is treated as a latent variable.
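The sketch below is only an illustration of the generic latent-variable setup the abstract describes (each sample carries several candidate feature vectors, e.g. from candidate object locations, and the relevant one is latent), using hard-assignment alternation in the spirit of latent-SVM training for DPMs. It is not the paper's topological framework; the function name, toy data, and hyperparameters are assumptions made for the example.

```python
# Hypothetical sketch: alternate between (a) picking the best-scoring candidate
# (the latent "location") per sample and (b) updating a linear hinge-loss model.
import numpy as np

rng = np.random.default_rng(0)

def train_latent_linear(samples, labels, dim, n_iters=10, lr=0.1, reg=1e-3):
    """samples: list of (n_candidates_i, dim) arrays; labels: array of +/-1."""
    w = np.zeros(dim)
    for _ in range(n_iters):
        # Latent step: for each sample, keep the candidate that scores highest
        # under the current model (hard assignment of the latent variable).
        latent = [cands[np.argmax(cands @ w)] for cands in samples]
        X = np.stack(latent)
        # Model step: a few subgradient steps on a regularized hinge loss.
        for _ in range(20):
            margins = labels * (X @ w)
            mask = margins < 1.0
            grad = reg * w - (labels[mask][:, None] * X[mask]).sum(axis=0) / len(labels)
            w -= lr * grad
    return w

# Toy data: positives contain one informative candidate among distractors.
dim, n_pos, n_neg, n_cand = 16, 40, 40, 5
w_true = rng.normal(size=dim)
pos = [np.vstack([w_true + 0.1 * rng.normal(size=dim),
                  rng.normal(size=(n_cand - 1, dim))]) for _ in range(n_pos)]
neg = [rng.normal(size=(n_cand, dim)) for _ in range(n_neg)]
samples = pos + neg
labels = np.array([1.0] * n_pos + [-1.0] * n_neg)

w = train_latent_linear(samples, labels, dim)
scores = np.array([np.max(c @ w) for c in samples])
print(f"training accuracy: {np.mean((scores > 0) == (labels > 0)):.2f}")
```

The alternation above illustrates why such models are sensitive to initialization: the latent assignments chosen in early rounds steer all later updates, which is the kind of issue the abstract says the proposed framework is meant to mitigate.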
Year
2014
DOI
10.1109/ICPR.2014.427
Venue
Pattern Recognition
Keywords
image classification, statistical analysis, DPM, deformable-part models, latent variable models, learning complexity, object localization, robust decision boundaries, topological framework
Field
Data mining, Topology, Computer science, Latent variable, Artificial intelligence, Initialization, Contextual image classification, Machine learning, Learning complexity
DocType
Conference
ISSN
1051-4651
Citations
0
PageRank
0.34
References
9
Authors
3
Name | Order | Citations | PageRank
Heydar Maboudi Afkham | 1 | 8 | 2.90
Carl Henrik Ek | 2 | 327 | 30.76
Stefan Carlsson | 3 | 2 | 1.38