Title
Higher Order Fused Regularization for Supervised Learning with Grouped Parameters
Abstract
We often encounter situations in supervised learning where there possibly exist groups that consist of more than two parameters. For example, we might work with parameters that correspond to words expressing the same meaning, music pieces in the same genre, or books released in the same year. Based on such auxiliary information, we can suppose that parameters in a group play similar roles in a problem and take similar values. In this paper, we propose the Higher Order Fused (HOF) regularization, which can incorporate smoothness among parameters with group structures as prior knowledge in supervised learning. We define the HOF penalty as the Lovász extension of a submodular higher-order potential function; when used as a regularizer, it encourages parameters in a group to take similar estimated values. Moreover, we develop an efficient network flow algorithm for computing the proximity operator of the regularized problem. We investigate the empirical performance of the proposed method on synthetic and real-world data.
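The abstract defines the HOF penalty as the Lovász extension of a submodular set function. As a minimal sketch of that construction, the snippet below evaluates a Lovász extension by the standard greedy (Edmonds) procedure — sort coordinates in decreasing order and accumulate marginal gains — using a toy concave-of-cardinality group potential. The potential `F` here is an illustrative assumption, not the paper's actual higher-order potential, and the flow-based proximity operator is not shown.

```python
import numpy as np

def lovasz_extension(w, F):
    """Greedy (Edmonds) evaluation of the Lovász extension of a set
    function F at the point w (exact when F is submodular)."""
    order = np.argsort(-w)          # coordinates in decreasing order of w
    val, S = 0.0, []
    prev = F(frozenset())
    for i in order:
        S.append(int(i))
        cur = F(frozenset(S))       # marginal gain F(S ∪ {i}) - F(S)
        val += w[i] * (cur - prev)  # weighted by the sorted coordinate
        prev = cur
    return val

# Toy submodular potential on one group G: a concave function of |S ∩ G|.
# (Hypothetical choice for illustration only.)
G = frozenset({0, 1, 2})
F = lambda S: np.sqrt(len(S & G))

w_similar = np.array([1.0, 1.0, 1.0, 5.0])  # group parameters all equal
w_spread  = np.array([0.0, 1.0, 2.0, 5.0])  # same group sum, but spread out
print(lovasz_extension(w_similar, F))  # ~1.732
print(lovasz_extension(w_spread, F))   # ~2.414
```

With the same group sum, the spread-out parameters incur a larger penalty than the equal ones, which is the smoothing behavior the abstract describes: used as a regularizer, the penalty pulls parameters in a group toward similar values.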
Year
2015
DOI
10.1007/978-3-319-23528-8_36
Venue
ECML/PKDD
DocType
Conference
Volume
9284
ISSN
0302-9743
Citations
2
PageRank
0.37
References
16
Authors
3
Name                Order  Citations  PageRank
Koh Takeuchi        1      59         11.29
Yoshinobu Kawahara  2      317        31.30
Tomoharu Iwata      3      824        65.87