Optimal nonparametric inference via deep neural network

Date
2022-01
Language
American English
Found At
Elsevier
Abstract

Deep neural networks are a state-of-the-art method in modern science and technology. Much statistical literature has been devoted to understanding their performance in nonparametric estimation, but the existing results are suboptimal due to a redundant logarithmic factor. In this paper, we show that such log-factors are unnecessary. We derive upper bounds for the L2 minimax risk in nonparametric estimation, and we provide sufficient conditions on network architectures under which these upper bounds become optimal (without the logarithmic sacrifice). Our proof relies on an explicitly constructed network estimator based on tensor product B-splines. We also derive asymptotic distributions for the constructed network and a related hypothesis testing procedure. The testing procedure is further shown to be minimax optimal under suitable network architectures.
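The network estimator in the abstract is built from tensor product B-splines. As a hedged illustration of that building block only (not the authors' network construction; the cubic degree, clamped knot layout, and SciPy usage below are assumptions for the sketch), a 1D B-spline basis and its 2D tensor product can be formed as:

```python
import numpy as np
from scipy.interpolate import BSpline

def bspline_basis(x, knots, degree):
    """Evaluate all B-spline basis functions of the given degree at points x."""
    n_basis = len(knots) - degree - 1
    # Each basis function is supported on degree+2 consecutive knots;
    # extrapolate=False yields NaN outside that support, zeroed below.
    cols = [
        BSpline.basis_element(knots[i:i + degree + 2], extrapolate=False)(x)
        for i in range(n_basis)
    ]
    return np.nan_to_num(np.column_stack(cols))

# Cubic splines on [0, 1] with a clamped (repeated-endpoint) knot vector.
degree = 3
knots = np.concatenate([[0.0] * degree, np.linspace(0.0, 1.0, 6), [1.0] * degree])
x = np.linspace(0.05, 0.95, 40)

B = bspline_basis(x, knots, degree)          # shape (40, 8): 1D basis matrix
# 2D tensor-product basis: pointwise outer product of the 1D bases.
B2 = np.einsum('ij,ik->ijk', B, B).reshape(len(x), -1)  # shape (40, 64)
```

With a clamped knot vector the 1D basis functions form a partition of unity on the interior of [0, 1], and the tensor-product basis inherits that property, which is what makes such bases convenient targets for network approximation arguments.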

Cite As
Liu, R., Boukai, B., & Shang, Z. (2022). Optimal nonparametric inference via deep neural network. Journal of Mathematical Analysis and Applications, 505(2), 125561. https://doi.org/10.1016/j.jmaa.2021.125561
ISSN
0022-247X
Journal
Journal of Mathematical Analysis and Applications
Source
ArXiv
Type
Article
Version
Author's manuscript