A modified discrepancy principle to attain optimal convergence rates under unknown noise
We consider a linear ill-posed equation in the Hilbert space setting. Multiple independent unbiased measurements of the right-hand side are available. A natural approach is to take the average of the measurements as an approximation of the right-hand side and to estimate the data error as the inverse of the square root of the number of measurements. We calculate the optimal convergence rate (as the number of measurements tends to infinity) under classical source conditions and introduce a modified discrepancy principle, which asymptotically attains this rate.
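As an informal illustration of the setup described in the abstract, the following Python sketch averages m independent noisy measurements of the right-hand side, estimates the data error at the 1/√m scale, and combines this with a classical discrepancy principle for spectral cut-off (truncated SVD) regularization. It is not the paper's modified discrepancy principle; the operator `A`, noise level `sigma`, and parameter `tau` are hypothetical choices for the example.

```python
import numpy as np

def truncated_svd_solve(A, y, k):
    """Spectral cut-off: invert A only on its k largest singular values."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    coeffs = (U.T @ y)[:k] / s[:k]
    return Vt[:k].T @ coeffs

def discrepancy_principle(A, y_bar, delta, tau=1.5):
    """Classical discrepancy principle: raise the cut-off index until the
    residual ||A x_k - y_bar|| falls below tau * delta."""
    n = min(A.shape)
    for k in range(1, n + 1):
        x_k = truncated_svd_solve(A, y_bar, k)
        if np.linalg.norm(A @ x_k - y_bar) <= tau * delta:
            return x_k, k
    return x_k, n

# Illustrative data: m independent unbiased measurements of the right-hand side.
rng = np.random.default_rng(0)
A = np.vander(np.linspace(0, 1, 50), 20, increasing=True)  # an ill-conditioned toy operator
x_true = rng.standard_normal(20)
y_exact = A @ x_true
m, sigma = 200, 1e-2                                        # hypothetical values
measurements = y_exact + sigma * rng.standard_normal((m, y_exact.size))
y_bar = measurements.mean(axis=0)                           # average of the measurements
delta = sigma * np.sqrt(y_exact.size / m)                   # data-error estimate, scales like 1/sqrt(m)
x_hat, k = discrepancy_principle(A, y_bar, delta)
```

In the paper's setting the noise level is unknown; the sketch assumes a known per-measurement standard deviation only to make the 1/√m scaling of the averaged data error explicit.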
Author: | Tim Nikolas Jahn |
---|---|
URN: | urn:nbn:de:hebis:30:3-631838 |
DOI: | https://doi.org/10.1088/1361-6420/ac1775 |
ISSN: | 1361-6420 |
Parent Title (English): | Inverse problems |
Publisher: | IOP Publishing |
Place of publication: | Bristol [et al.] |
Document Type: | Article |
Language: | English |
Date of Publication (online): | 2021/08/11 |
Date of first Publication: | 2021/08/11 |
Publishing Institution: | Universitätsbibliothek Johann Christian Senckenberg |
Release Date: | 2024/05/08 |
Tag: | convergence; discrepancy principle; optimality; spectral cut-off; statistical inverse problems |
Volume: | 37 |
Issue: | 9 |
Article Number: | 095008 |
Page Number: | 23 |
First Page: | 1 |
Last Page: | 23 |
HeBIS-PPN: | 519273036 |
Institutes: | Informatik und Mathematik |
Dewey Decimal Classification: | 5 Natural sciences and mathematics / 51 Mathematics / 510 Mathematics |
Collections: | University publications |
Licence: | Creative Commons Attribution 4.0 |