Abstract |
---|
Piecewise linear models (PLMs) are widely used in enterprise machine learning problems: they assign linear experts to individual partitions of the feature space and express the whole model as a patchwork of local experts. This paper addresses the simultaneous model selection issues of PLMs: determination of the partition structure and feature selection for the individual experts. Our contributions are three-fold. First, we extend factorized asymptotic Bayesian (FAB) inference to hierarchical mixtures of experts (probabilistic PLMs). FAB inference offers penalty terms with respect to partition and expert complexities, enabling us to resolve the model selection issue. Second, we propose a posterior optimization that significantly improves predictive accuracy; roughly speaking, it mitigates the accuracy degradation caused by the gap between marginal log-likelihood maximization and predictive accuracy. Third, we present an application to energy demand forecasting as well as benchmark comparisons. The experiments demonstrate the method's ability to acquire compact and highly accurate models. |
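To make the model class concrete, here is a minimal sketch of a piecewise linear model as the abstract describes it: a gate assigns each input to a partition of the feature space, and each partition owns its own linear expert. The function names, thresholds, and expert parameters below are purely illustrative assumptions; this is not the paper's FAB inference or posterior optimization.

```python
import numpy as np

def gate(x, boundaries):
    """Assign a 1-D input x to a partition index via threshold boundaries."""
    return int(np.searchsorted(boundaries, x))

def plm_predict(x, boundaries, experts):
    """Predict with the local linear expert (slope, intercept) of x's partition."""
    w, b = experts[gate(x, boundaries)]
    return w * x + b

# Toy example: two partitions split at x = 0, each with its own
# linear expert (hypothetical parameters for illustration only).
boundaries = [0.0]
experts = [(-1.0, 0.0), (2.0, 1.0)]  # (slope, intercept) per partition

print(plm_predict(-2.0, boundaries, experts))  # expert 0: -1 * -2 + 0 = 2.0
print(plm_predict(1.5, boundaries, experts))   # expert 1: 2 * 1.5 + 1 = 4.0
```

The whole model is thus a patchwork of local experts; the model selection problem the paper tackles is choosing both the number and placement of the partitions and which features each expert uses.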
Year | Venue | Field |
---|---|---|
2014 | JMLR Workshop and Conference Proceedings | Mathematical optimization, Feature selection, Linear model, Inference, Computer science, Model selection, Artificial intelligence, Partition (number theory), Machine learning, Piecewise linear model, Piecewise, Bayesian probability |

DocType | Volume | ISSN
---|---|---|
Conference | 33 | 1938-7288

Citations | PageRank | References
---|---|---|
1 | 0.36 | 13

Authors |
---|
4 |

Name | Order | Citations | PageRank |
---|---|---|---|
Riki Eto | 1 | 1 | 0.36 |
Ryohei Fujimaki | 2 | 193 | 16.93 |
Satoshi Morinaga | 3 | 288 | 46.89 |
Hiroshi Tamano | 4 | 8 | 1.17 |