In this paper, we consider a linear ill-posed inverse problem with noisy data in the statistical learning setting. A Tikhonov regularization scheme in Hilbert scales is studied within the reproducing kernel Hilbert space framework to reconstruct the estimator from random noisy data. We discuss rates of convergence for the regularized solution under prior assumptions and a link condition. For regression functions whose smoothness is given in terms of source conditions, the error bounds can be established explicitly.
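As a rough numerical illustration of the setting, the following sketch computes a Tikhonov-regularized estimator in an RKHS from random noisy samples, i.e. plain kernel ridge regression with penalty λ‖f‖²_H. This is an assumption-laden toy example, not the paper's scheme: the Hilbert-scale variant studied here penalizes a stronger norm, and the Gaussian kernel, the sample sizes, and the parameter choices below are all hypothetical.

```python
import numpy as np

def gaussian_kernel(X, Y, width=1.0):
    """Gram matrix K[i, j] = exp(-|x_i - y_j|^2 / (2 * width^2))."""
    d2 = (X[:, None] - Y[None, :]) ** 2
    return np.exp(-d2 / (2 * width ** 2))

def tikhonov_estimator(x_train, y_train, lam, width=1.0):
    """Minimize (1/n) * sum_i (f(x_i) - y_i)^2 + lam * ||f||_H^2.

    By the representer theorem the solution is f(x) = sum_i c_i k(x, x_i)
    with coefficients solving (K + n * lam * I) c = y.
    """
    n = len(x_train)
    K = gaussian_kernel(x_train, x_train, width)
    c = np.linalg.solve(K + n * lam * np.eye(n), y_train)
    return lambda x_new: gaussian_kernel(np.atleast_1d(x_new), x_train, width) @ c

# Random noisy data from a smooth regression function
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0.0, 1.0, 50))
y = np.sin(2 * np.pi * x) + 0.1 * rng.normal(size=50)

f_hat = tikhonov_estimator(x, y, lam=1e-3, width=0.2)
print(float(f_hat(0.25)))  # should be close to sin(pi/2) = 1
```

Shrinking the noise level and re-tuning λ toward zero lets the estimator track the true regression function more closely, which is the qualitative behavior the convergence rates quantify.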
© 2019 Author(s).