Learned Initializations for Optimizing Coordinate-Based Neural Representations
Project Page | Paper
Matthew Tancik*¹, Ben Mildenhall*¹, Terrance Wang¹, Divi Schmidt¹, Pratul P. Srinivasan², Jonathan T. Barron², Ren Ng¹
¹UC Berkeley, ²Google Research (*denotes equal contribution)
Abstract
Coordinate-based neural representations have shown significant promise as an alternative to discrete, array-based representations for complex low-dimensional signals. However, optimizing a coordinate-based network from randomly initialized weights for each new signal is inefficient. We propose applying standard meta-learning algorithms to learn the initial weight parameters for these fully-connected networks based on the underlying class of signals being represented (e.g., images of faces or 3D models of chairs). Despite requiring only a minor change in implementation, using these learned initial weights enables faster convergence during optimization and can serve as a strong prior over the signal class being modeled, resulting in better generalization when only partial observations of a given signal are available.
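As a rough illustration of the idea, the sketch below meta-learns an initialization for a small coordinate-based MLP using a Reptile-style outer loop. This is not the repository's implementation: the network size, step counts, learning rates, and the random stand-in "signals" are placeholder assumptions chosen only to show the mechanism.

```python
# Minimal sketch (not the repository's implementation) of meta-learning an
# initialization for a coordinate-based MLP with a Reptile-style outer loop.
import jax
import jax.numpy as jnp

def init_mlp(key, sizes=(2, 256, 256, 3)):
    # Fully-connected network mapping 2D coordinates to RGB values.
    params = []
    for din, dout in zip(sizes[:-1], sizes[1:]):
        key, sub = jax.random.split(key)
        w = jax.random.normal(sub, (din, dout)) * jnp.sqrt(2.0 / din)
        params.append((w, jnp.zeros(dout)))
    return params

def mlp(params, coords):
    x = coords
    for w, b in params[:-1]:
        x = jax.nn.relu(x @ w + b)
    w, b = params[-1]
    return x @ w + b

def loss_fn(params, coords, targets):
    # Mean squared error between the network output and the target signal.
    return jnp.mean((mlp(params, coords) - targets) ** 2)

loss_grad = jax.jit(jax.grad(loss_fn))

def inner_fit(params, coords, targets, lr=1e-2, steps=2):
    # Inner loop: a few gradient-descent steps on a single signal.
    for _ in range(steps):
        grads = loss_grad(params, coords, targets)
        params = jax.tree_util.tree_map(lambda p, g: p - lr * g, params, grads)
    return params

key = jax.random.PRNGKey(0)
meta_params = init_mlp(key)

# Pixel coordinates of a 32x32 image, normalized to [-1, 1].
coords = jnp.stack(jnp.meshgrid(jnp.linspace(-1, 1, 32),
                                jnp.linspace(-1, 1, 32)), -1).reshape(-1, 2)

meta_lr = 0.1
for step in range(1000):  # outer (meta) loop over signals from the class
    key, sub = jax.random.split(key)
    # Random targets stand in for samples from the signal class (e.g. faces).
    targets = jax.random.uniform(sub, (coords.shape[0], 3))
    adapted = inner_fit(meta_params, coords, targets)
    # Reptile update: move the initialization toward the adapted weights.
    meta_params = jax.tree_util.tree_map(
        lambda p, a: p + meta_lr * (a - p), meta_params, adapted)
```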
Code
We provide a demo IPython notebook as a simple reference for the core idea. Scripts for the different tasks are located in the Experiments directory.
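The snippet below is a hypothetical continuation of the sketch above (reusing its `meta_params`, `coords`, `loss_fn`, and `inner_fit`) and illustrates the test-time use described in the abstract: starting from the meta-learned weights, a new signal is fit with only a few gradient steps, here from a random subset of pixels standing in for partial observations.

```python
# Continues the sketch above: `meta_params`, `coords`, `loss_fn`, and
# `inner_fit` are defined there; the new signal and keep ratio are illustrative.
import jax
import jax.numpy as jnp

key = jax.random.PRNGKey(1)
new_signal = jax.random.uniform(key, (coords.shape[0], 3))  # stand-in target

# Keep a random 25% of the pixels to mimic partial observations.
mask = jax.random.bernoulli(jax.random.PRNGKey(2), 0.25, (coords.shape[0],))
obs_coords, obs_values = coords[mask], new_signal[mask]

# A handful of gradient steps from the learned initialization.
fitted = inner_fit(meta_params, obs_coords, obs_values, lr=1e-2, steps=10)
print("full-image reconstruction MSE:", loss_fn(fitted, coords, new_signal))
```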