

Poster

Neural Metamorphosis

Xingyi Yang · Xinchao Wang

Strong Double Blind review: this paper was not made available on public preprint services during the review process.
Tue 1 Oct 7:30 a.m. PDT — 9:30 a.m. PDT

Abstract:

This paper introduces a novel paradigm termed Neural Metamorphosis (NeuMeta), which aims to represent a continuous family of networks within a single versatile model. Unlike traditional methods that rely on separate models for different network tasks or sizes, NeuMeta enables an expansive continuum of neural networks that readily morph to fit various needs. The core mechanism is to train a neural implicit function that takes the desired network size and parameter coordinates as inputs and generates the corresponding weight values, without requiring separate models for different configurations. Specifically, to achieve weight smoothness within a single model, we address the Shortest Hamiltonian Path problem within each neural clique graph. We maintain cross-model consistency by incorporating input noise during training. As such, NeuMeta can dynamically generate parameters for arbitrary network configurations at inference time by sampling the weight manifold. NeuMeta shows promising results in synthesizing parameters for unseen network configurations. Our extensive tests in image classification, semantic segmentation, and image generation reveal that NeuMeta sustains full-size performance even at a 75% compression rate.
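The core idea of the abstract, an implicit function that maps a desired network size plus parameter coordinates to a weight value, can be sketched in a few lines. The sketch below is an illustrative assumption, not the authors' implementation: the hypernetwork is a tiny randomly initialized MLP, the coordinate encoding (normalized row/column indices plus a continuous scale input) is a guess at the kind of parameterization such a method might use, and `morph_layer` / `implicit_weight` are hypothetical names.

```python
import numpy as np

# Illustrative sketch only (not the NeuMeta code): a single implicit
# weight function f(scale, i, j) -> w that can emit a weight matrix of
# any requested size from one fixed set of hypernetwork parameters.
rng = np.random.default_rng(0)

# Tiny MLP hypernetwork: 3 inputs (scale, row coord, col coord) -> 1 weight.
W1 = rng.normal(scale=0.5, size=(3, 16))
b1 = np.zeros(16)
W2 = rng.normal(scale=0.5, size=(16, 1))
b2 = np.zeros(1)

def implicit_weight(coords):
    """coords: (N, 3) array of (scale, i_norm, j_norm) rows."""
    h = np.tanh(coords @ W1 + b1)
    return (h @ W2 + b2).squeeze(-1)

def morph_layer(out_dim, in_dim, full_dim=64):
    """Sample an (out_dim, in_dim) weight matrix from the weight manifold."""
    scale = out_dim / full_dim            # desired size as a continuous input
    i = np.linspace(0.0, 1.0, out_dim)    # normalized row coordinates
    j = np.linspace(0.0, 1.0, in_dim)     # normalized column coordinates
    grid = np.stack(np.meshgrid(i, j, indexing="ij"), axis=-1).reshape(-1, 2)
    coords = np.concatenate([np.full((len(grid), 1), scale), grid], axis=1)
    return implicit_weight(coords).reshape(out_dim, in_dim)

# The same hypernetwork parameters yield weights for any configuration:
w_full = morph_layer(64, 64)   # full-size layer
w_small = morph_layer(16, 64)  # a compressed layer, no separate model
```

Because the coordinates are continuous, nearby weight positions receive similar inputs, which is why the weight smoothness the abstract mentions (enforced in the paper via the Shortest Hamiltonian Path reordering) matters: a smooth weight manifold is easier for such an implicit function to fit.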
