PI: Hongcheng Liu, Co-PI: Panos Pardalos
Award Period: 05/01/2019-06/30/2020
Abstract
The neural network (NN) is a pillar machine learning model underlying modern artificial intelligence. Despite frequent advances in NN-related algorithms, models, and applications, the development of their theoretical underpinnings is lagging behind. In most existing generalization analyses, the number of samples required to ensure proper out-of-sample performance grows polynomially in the number of fitting parameters and, thus, in the depth of a NN. This is inconsistent with practical observations, as modern neural network models are commonly over-parameterized. Furthermore, most existing NN models are hardly interpretable due to their intrinsic nonlinearity and nonconvexity. Through this project, we will theoretically analyze both the generalizability and the interpretability of a neural network in a general setting. The new bound will ensure that the generalization performance of a NN is insensitive to an increase in the number of fitting parameters. Furthermore, regularization schemes will be incorporated into the training of a NN to make available a likelihood ratio-based statistical test, rendering the NN more interpretable. This project will lay the foundation for future research on artificial intelligence and global optimization by providing machinery to comprehend a class of machine learning models. It will also lead to the further development of a project relevant to one of NSF's 10 Big Ideas: Harnessing the Data Revolution for 21st-Century Science and Engineering. Upon completion of this project, jointly authored papers and proposals to NSF will be submitted.
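For concreteness, the contrast motivating this project can be sketched with standard sample-complexity forms from the literature; the expressions below are illustrative of parameter-count-based versus norm-based analyses in general, not bounds established by this project. Writing $W$ for the number of fitting parameters, $\epsilon$ for the target excess risk, and $R$ for a bound on the (regularized) weight norms:

\[
  n \;\gtrsim\; \frac{\mathrm{poly}(W)}{\epsilon^{2}}
  \quad\text{(parameter-count-based)}
  \qquad\text{vs.}\qquad
  n \;\gtrsim\; \frac{R^{2}\,\mathrm{polylog}(W)}{\epsilon^{2}}
  \quad\text{(norm-based)},
\]

so a guarantee of the latter kind depends on the parameter count at most logarithmically, which is the sense in which the proposed bound is insensitive to over-parameterization.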
Publications
- Seonho Park, Seung Hyun Jung, Panos Pardalos. Combining Stochastic Adaptive Cubic Regularization with Negative Curvature for Nonconvex Optimization. Journal of Optimization Theory and Applications 184(3): 953-971 (2020).
- Stamatios-Aggelos N. Alexandropoulos, Panos M. Pardalos, Michael N. Vrahatis. Dynamic Search Trajectory Methods for Global Optimization. Annals of Mathematics and Artificial Intelligence 88(1): 3-37 (2020).
- Hongcheng Liu, Yinyu Ye. High-Dimensional Learning under Approximate Sparsity: Towards a Unified Framework for Nonsmooth Learning and Regularized Neural Networks. Major revision at Operations Research (2020).
- Qingchao Zhang, Yunmei Chen, Hongcheng Liu, Xiaojing Ye. A Novel Learnable Gradient Descent Type Algorithm for Non-convex Non-smooth Inverse Problems. Under review at ECCV.
- Bijan Taslimi, Yuanbo Wang, Hongcheng Liu, Panos Pardalos. A Lasserre Hierarchy-Based Global Optimization Scheme for Training Neural Networks. Working paper.
Proposals
- Sponsor: Air Force Office of Scientific Research
Title: Transforming Training Paradigms for Artificial Intelligence (White Paper)
PI: Hongcheng Liu, Co-PI: Panos Pardalos
- Sponsor: Office of Naval Research
Title: A Novel Interior-Point Trust-Region Paradigm for Optimization under Conic Constraints (White Paper)
PI: Hongcheng Liu
- Sponsor: UF Office of Research
Title: OR: DRPD-ROF2020: Generalizable Deep Neural Nets and Deep Hypothesis Testing for Biomarker Identification and Disease Diagnosis
PI: Hongcheng Liu, Co-PI: Panos Pardalos
- Sponsor: National Science Foundation
Title: AF: Small: Data-Driven Optimization Under Partial Knowledge and Data Insufficiency
PI: Hongcheng Liu, Co-PI: Yunmei Chen