Title
Twice-Universal Piecewise Linear Regression Via Infinite Depth Context Trees
Abstract
We investigate the problem of sequential piecewise linear regression within a competitive framework. For an arbitrary and unknown data length n, we first introduce a method to partition the regressor space. In particular, we present a recursive method that divides the regressor space into O(n) disjoint regions, which can yield approximately 1.5^n different piecewise linear models on the regressor space. For each region, we introduce a universal linear regressor that performs nearly as well as the best linear regressor whose parameters are set non-causally. We then use an infinite depth context tree to represent all piecewise linear models and introduce a universal algorithm that achieves the performance of the best piecewise linear model that can be selected in hindsight. In this sense, the introduced algorithm is twice-universal: it sequentially achieves the performance of the best model that uses the optimal regression parameters. Our algorithm attains this performance with a computational complexity upper bounded by O(n) in the worst case and by O(log(n)) under certain regularity conditions. We provide an explicit description of the algorithm as well as upper bounds on the regret with respect to the best nonlinear and piecewise linear models, and demonstrate the performance of the algorithm through simulations.
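The abstract's core ideas, sequential linear regressors fit per region of the regressor space and a weighted combination over candidate piecewise partitions, can be illustrated with a toy sketch. This is not the paper's algorithm: it uses recursive least squares per region and an exponentially weighted mixture over a small fixed set of partitions instead of an infinite depth context tree, and all names (`RegionRLS`, `PiecewiseModel`, `MixtureRegressor`, `delta`, `eta`) are invented for illustration.

```python
import numpy as np

class RegionRLS:
    """Sequential (recursive least squares) linear regressor for one region."""
    def __init__(self, dim, delta=0.1):
        self.P = np.eye(dim) / delta   # inverse-correlation estimate; delta regularizes
        self.w = np.zeros(dim)

    def predict(self, x):
        return float(self.w @ x)

    def update(self, x, y):
        Px = self.P @ x
        k = Px / (1.0 + x @ Px)        # RLS gain vector
        self.w += k * (y - self.w @ x)
        self.P -= np.outer(k, Px)

class PiecewiseModel:
    """Piecewise linear model: boundaries split a scalar regressor range into regions."""
    def __init__(self, boundaries, dim=2):
        self.boundaries = boundaries
        self.experts = [RegionRLS(dim) for _ in range(len(boundaries) + 1)]

    def _region(self, x_scalar):
        return int(np.searchsorted(self.boundaries, x_scalar))

    def predict(self, x_scalar):
        x = np.array([x_scalar, 1.0])  # affine regressor [x, 1]
        return self.experts[self._region(x_scalar)].predict(x)

    def update(self, x_scalar, y):
        x = np.array([x_scalar, 1.0])
        self.experts[self._region(x_scalar)].update(x, y)

class MixtureRegressor:
    """Exponentially weighted mixture over candidate piecewise partitions."""
    def __init__(self, partitions, eta=0.5):
        self.models = [PiecewiseModel(b) for b in partitions]
        self.logw = np.zeros(len(self.models))
        self.eta = eta

    def predict(self, x_scalar):
        w = np.exp(self.logw - self.logw.max())
        w /= w.sum()
        return float(sum(wi * m.predict(x_scalar) for wi, m in zip(w, self.models)))

    def update(self, x_scalar, y):
        for i, m in enumerate(self.models):
            # weight each model by its accumulated squared loss, then let it learn
            self.logw[i] -= self.eta * (y - m.predict(x_scalar)) ** 2
            m.update(x_scalar, y)

# Demo: y = |x| is piecewise linear with a breakpoint at 0; the mixture should
# concentrate on the two-region partition and track it sequentially.
rng = np.random.default_rng(0)
xs = rng.uniform(-1.0, 1.0, 400)
reg = MixtureRegressor([[], [0.0]])  # candidates: one region vs. split at 0
losses = []
for x in xs:
    y = abs(x)
    losses.append((y - reg.predict(x)) ** 2)
    reg.update(x, y)
```

On this toy sequence the mixture's per-step loss decays toward that of the best partition chosen in hindsight, which is the flavor of guarantee the abstract describes.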
Year
2015
Venue
2015 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING (ICASSP)
Keywords
Sequential, nonlinear, piecewise linear, regression, infinite depth context tree
Field
Mathematical optimization, Disjoint sets, Linear model, Proper linear model, Piecewise linear manifold, Piecewise linear function, Mathematics, Segmented regression, Bounded function, Computational complexity theory
DocType
Conference
ISSN
1520-6149
Citations
0
PageRank
0.34
References
8
Authors
4
Name | Order | Citations | PageRank
Nuri Denizcan Vanli | 1 | 77 | 6.77
Muhammed O. Sayin | 2 | 391 | 4.04
Tolga Goze | 3 | 0 | 0.68
Suleyman Serdar Kozat | 4 | 1213 | 1.32