We study the problem of (provably) learning the weights of a two-layer neural network with quadratic activations. We focus on the under-parametrized regime, where the number of neurons in the hidden layer is smaller than the dimension of the input. Our main approach is to lift the learning problem into a higher dimension, which enables us to borrow algorithmic techniques from low-rank matrix estimation. Using this intuition, we propose three novel, non-convex training algorithms. We support our algorithms with rigorous theoretical analysis and show that all three enjoy linear convergence, fast per-iteration running time, and near-optimal sample complexity.
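The lifting idea can be illustrated with a minimal sketch: with quadratic activations (and, for simplicity, unit second-layer weights), the network output is a quadratic form in a matrix whose rank is at most the number of hidden neurons, which is what connects the learning problem to low-rank matrix estimation. The variable names (`d`, `k`, `W`) below are illustrative, not taken from the papers.

```python
import numpy as np

# Hedged sketch (assumed setup, not the papers' exact model):
# A two-layer network with quadratic activation sigma(z) = z^2 computes
#   f(x) = sum_j (w_j^T x)^2 = x^T (W^T W) x,
# so learning the k hidden weight vectors w_j is equivalent to estimating
# the rank-<=k matrix M = W^T W -- a low-rank matrix estimation problem.

rng = np.random.default_rng(0)
d, k = 10, 3                       # under-parametrized regime: k < d
W = rng.standard_normal((k, d))    # hidden-layer weights, one row per neuron
x = rng.standard_normal(d)

# Network output: quadratic activation summed over hidden units
f_net = np.sum((W @ x) ** 2)

# Lifted view: the same value as a quadratic form in M = W^T W
M = W.T @ W
f_lifted = x @ M @ x

print(np.isclose(f_net, f_lifted))    # the two views agree
print(np.linalg.matrix_rank(M))       # rank of M is at most k
```

Any algorithm that recovers the low-rank matrix M (up to rotation) recovers the span of the hidden-layer weights, which is the intuition behind borrowing tools from low-rank matrix estimation.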
Publications

M. Soltani and C. Hegde, Fast and Provable Algorithms for Learning Two-Layer Polynomial Neural Networks. [Paper]
M. Soltani and C. Hegde, Towards Provable Learning of Polynomial Neural Networks Using Low-Rank Matrix Estimation, Artificial Intelligence and Statistics (AISTATS), April 2018. [Paper]