Function Space View of Linear Multi-Channel Convolution Networks With Bounded Weight Norm

Friday, April 9, 2021 - 11:00am to 11:55am

Event Calendar Category

SDSC

Speaker Name

Suriya Gunasekar

Affiliation

Microsoft Research

Zoom meeting id

958 4002 6245

Join Zoom meeting

https://mit.zoom.us/j/95840026245

Abstract

The magnitude of the weights of a neural network is a fundamental measure of complexity that plays a crucial role in the study of implicit and explicit regularization. For example, recent work shows that in overparameterized models, gradient descent updates asymptotically lead to solutions that implicitly minimize the ℓ_2 norm of the model's parameters, resulting in an inductive bias that is highly architecture dependent. To investigate the properties of learned functions, it is natural to consider a function space view given by the minimum ℓ_2 norm of the weights required to realize a given function with a given network. We call this the “induced regularizer” of the network. Building on a line of recent work, we study the induced regularizer of linear convolutional neural networks, with a focus on the role of the kernel size and the number of channels. We introduce a semidefinite programming (SDP) relaxation of the induced regularizer, which we show is tight for networks with a single input channel. Using this SDP formulation, we show that the induced regularizer is independent of the number of output channels for single-input-channel networks, and that for multi-input-channel networks the same independence holds given sufficiently many output channels. Moreover, we show that as the kernel size increases, the induced regularizer interpolates between a basis-invariant norm and a basis-dependent norm that promotes sparse structure in Fourier space.
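One way to make the induced regularizer precise (the notation below is illustrative and not taken verbatim from the talk): for a fixed network architecture whose weight vector w realizes the linear function f_w, the abstract's definition can be written as

R(f) = \min_w \{ \|w\|_2 : f_w = f \},

i.e., R(f) is the smallest ℓ_2 weight norm with which the given network can represent f. The value of R depends on the architecture, in particular on the kernel size and the number of channels, which is exactly the dependence the talk analyzes.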

Based on joint work with Meena Jagadeesan and Ilya Razenshteyn.


Biography

Suriya Gunasekar is a researcher at Microsoft Research (MSR). Prior to joining MSR, she was a Research Assistant Professor at the Toyota Technological Institute at Chicago. She received her PhD in ECE from The University of Texas at Austin.