Stat |
Members: 3645 Articles: 2'503'724 Articles rated: 2609
23 April 2024 |
Article overview
Multivariate extensions of isotonic regression and total variation denoising via entire monotonicity and Hardy-Krause variation
Authors: Billy Fang; Adityanand Guntuboyina; Bodhisattva Sen
Date: 4 Mar 2019
Abstract: We consider the problem of nonparametric regression when the covariate is
$d$-dimensional, where $d \geq 1$. In this paper we introduce and study two
nonparametric least squares estimators (LSEs) in this setting---the entirely
monotonic LSE and the constrained Hardy-Krause variation LSE. We show that
these two LSEs are natural generalizations of univariate isotonic regression
and univariate total variation denoising, respectively, to multiple dimensions.
We discuss the characterization and computation of these two LSEs obtained from
$n$ data points. We provide a detailed study of their risk properties under the
squared error loss and fixed uniform lattice design. We show that the finite
sample risk of these LSEs is always bounded from above by $n^{-2/3}$ modulo
logarithmic factors depending on $d$; thus these nonparametric LSEs avoid the
curse of dimensionality to some extent. For the case of the Hardy-Krause
variation LSE, we also show that logarithmic factors which increase with $d$
are necessary in the risk upper bound by proving a minimax lower bound.
Further, we illustrate that these LSEs are particularly useful in fitting
rectangular piecewise constant functions. Specifically, we show that the risk
of the entirely monotonic LSE is almost parametric (at most $1/n$ up to
logarithmic factors) when the true function is well-approximable by a
rectangular piecewise constant entirely monotone function with not too many
constant pieces. A similar result is also shown to hold for the constrained
Hardy-Krause variation LSE for a simple subclass of rectangular piecewise
constant functions. We believe that the proposed LSEs yield a novel approach to
estimating multivariate functions using convex optimization that avoids the
curse of dimensionality to some extent.
Source: arXiv, 1903.01395
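The univariate special case mentioned in the abstract, isotonic regression, is a least squares problem over a convex cone and can be solved exactly by the pool adjacent violators algorithm (PAVA). A minimal sketch of that 1-D case (illustrative only; the function name and details are not from the paper):

```python
def isotonic_lse(y):
    """Least squares fit of y under a nondecreasing constraint,
    computed with the Pool Adjacent Violators Algorithm (PAVA)."""
    sums, counts = [], []          # running block sums and block sizes
    for v in y:
        sums.append(float(v))
        counts.append(1)
        # Merge adjacent blocks while monotonicity is violated:
        # a block's mean must not exceed the mean of the block after it.
        while len(sums) > 1 and sums[-2] / counts[-2] > sums[-1] / counts[-1]:
            s, c = sums.pop(), counts.pop()
            sums[-1] += s
            counts[-1] += c
    fit = []
    for s, c in zip(sums, counts):
        fit.extend([s / c] * c)    # each block is fitted at its mean
    return fit

print(isotonic_lse([1, 3, 2, 4]))  # [1.0, 2.5, 2.5, 4.0]
```

The paper's entirely monotonic LSE generalizes this constraint set to $d$ dimensions; the resulting fit remains the solution of a convex program, though no longer one solvable by this simple merging pass.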
No review found.