
Dynamic Low-Rank Training with Spectral Regularization: Achieving Robustness in Compressed Representations

by Steffen Schotthoefer, Hsiuhan Yang, and Stefan R. Schnake
Publication Type
Conference Paper
Book Title
Methods and Opportunities at Small Scale Workshop @ ICML 2025
Publication Date
Page Numbers
1 to 14
Publisher Location
District of Columbia, United States of America
Conference Name
Workshop: Methods and Opportunities at Small Scale (MOSS) @ ICML 2025
Conference Location
Vancouver, Canada
Conference Sponsor
ICML
Conference Date
-

Deployment of neural networks on resource-constrained devices demands models that are both compact and robust to adversarial inputs. However, compression and adversarial robustness often conflict. In this work, we introduce a dynamical low-rank training scheme enhanced with a novel spectral regularizer that controls the condition number of the low-rank core in each layer. This approach mitigates the sensitivity of compressed models to adversarial perturbations without sacrificing clean accuracy. The method is model- and data-agnostic, computationally efficient, and supports rank adaptivity to automatically compress the network during training. Extensive experiments across standard architectures, datasets, and adversarial attacks show that the regularized networks can achieve over 94% compression while recovering or improving adversarial accuracy relative to uncompressed baselines.
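To make the core idea concrete, below is a minimal PyTorch sketch of a low-rank layer whose weight is factorized as W = U S Vᵀ with a small r-by-r core S, together with a condition-number penalty on that core. This is an illustrative assumption, not the paper's implementation: the names LowRankLinear, core_condition_penalty, and beta are hypothetical, the penalty shown is the plain ratio σ_max(S)/σ_min(S), and the paper's dynamical low-rank training dynamics and rank-adaptive updates are not reproduced here.

```python
import torch
import torch.nn as nn

class LowRankLinear(nn.Module):
    """Linear layer factorized as W = U @ S @ V^T with an r-by-r core S."""

    def __init__(self, in_features: int, out_features: int, rank: int):
        super().__init__()
        self.U = nn.Parameter(torch.randn(out_features, rank) / rank ** 0.5)
        self.S = nn.Parameter(torch.eye(rank))  # low-rank core
        self.V = nn.Parameter(torch.randn(in_features, rank) / rank ** 0.5)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # y = x W^T = x V S^T U^T for a batch of row vectors x.
        return x @ self.V @ self.S.T @ self.U.T

def core_condition_penalty(core: torch.Tensor) -> torch.Tensor:
    # Penalize kappa(S) = sigma_max(S) / sigma_min(S); a well-conditioned
    # core bounds how unevenly the layer amplifies perturbation directions.
    sigma = torch.linalg.svdvals(core)
    return sigma.max() / sigma.min().clamp_min(1e-8)

# Hypothetical training step: task loss plus the spectral penalty summed
# over all low-rank layers, weighted by a tunable coefficient `beta`.
model = nn.Sequential(LowRankLinear(784, 256, rank=16), nn.ReLU(),
                      LowRankLinear(256, 10, rank=16))
x, y = torch.randn(32, 784), torch.randint(0, 10, (32,))
beta = 0.1
loss = nn.functional.cross_entropy(model(x), y) + beta * sum(
    core_condition_penalty(m.S) for m in model if isinstance(m, LowRankLinear))
loss.backward()
```

Because the core S is only r-by-r, the SVD in the penalty is cheap relative to a full-weight spectral penalty, which is consistent with the abstract's claim of computational efficiency; the exact regularizer form used in the paper may differ.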