Accelerating Neural ODEs using Model Order Reduction

Description

Embedding nonlinear dynamical systems into artificial neural networks is a powerful new formalism for machine learning. By parameterizing ordinary differential equations (ODEs) as neural network layers, Neural ODEs are memory-efficient to train, process time series naturally and incorporate knowledge of physical systems into deep learning models. However, the practical applications of Neural ODEs are limited by long inference times, because the outputs of the embedded ODE layers are computed numerically with differential equation solvers that can be computationally demanding. Here we show that mathematical model order reduction methods can be used to compress and accelerate Neural ODEs by accurately simulating the continuous nonlinear dynamics in low-dimensional subspaces. We implement our novel compression method by developing Neural ODEs that integrate the necessary subspace-projection and interpolation operations as layers of the neural network. We validate our approach by comparing it to neuron pruning and SVD-based weight truncation methods from the literature in image and time-series classification tasks. The methods are evaluated by the trade-off between acceleration and accuracy as the level of compression is adjusted. On this spectrum, model order reduction achieves a more favourable balance than the existing methods when compressing a convolutional Neural ODE, whereas SVD-based weight truncation performs well when compressing a recurrent Neural ODE. Based on these results, our integration of model order reduction with Neural ODEs can facilitate efficient, dynamical-system-driven deep learning in resource-constrained applications.
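
As a rough illustration of the subspace-projection idea described above, here is a minimal sketch (in PyTorch, which the capsule is assumed to use) of a Neural ODE layer that integrates its dynamics in an r-dimensional subspace, z' = V^T f(V z, t), and lifts the result back to the full space. This is not the capsule's implementation: the random orthonormal basis V, the ODEFunc network, the fixed-step RK4 integrator and all dimensions are illustrative assumptions; in practice the basis would be computed from snapshots of the trained full-order model (e.g. by proper orthogonal decomposition), and an interpolation method such as DEIM would be used to evaluate the nonlinear term efficiently.

import torch
import torch.nn as nn


class ODEFunc(nn.Module):
    """Full-order dynamics f(h, t), parameterised here by a small MLP (illustrative)."""

    def __init__(self, dim: int, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim, hidden), nn.Tanh(), nn.Linear(hidden, dim)
        )

    def forward(self, t, h):
        return self.net(h)


class ReducedNeuralODE(nn.Module):
    """Integrates z' = V^T f(V z, t) in an r-dimensional subspace (r << dim).

    V (dim x r) plays the role of the subspace-projection layer. It is a random
    orthonormal basis here (assumption); a POD basis would be used in practice.
    """

    def __init__(self, func: nn.Module, dim: int, r: int, steps: int = 20):
        super().__init__()
        self.func = func
        self.steps = steps
        V, _ = torch.linalg.qr(torch.randn(dim, r))  # orthonormal columns
        self.register_buffer("V", V)

    def rhs(self, t, z):
        # Lift the reduced state to the full space, evaluate the dynamics,
        # and project the result back onto the subspace.
        return self.func(t, z @ self.V.T) @ self.V

    def forward(self, h0, t0=0.0, t1=1.0):
        z = h0 @ self.V                      # project the initial state
        dt = (t1 - t0) / self.steps
        t = t0
        for _ in range(self.steps):          # fixed-step RK4 integration
            k1 = self.rhs(t, z)
            k2 = self.rhs(t + dt / 2, z + dt / 2 * k1)
            k3 = self.rhs(t + dt / 2, z + dt / 2 * k2)
            k4 = self.rhs(t + dt, z + dt * k3)
            z = z + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
            t = t + dt
        return z @ self.V.T                  # lift the final state back


if __name__ == "__main__":
    dim, r = 128, 16
    model = ReducedNeuralODE(ODEFunc(dim), dim, r)
    h0 = torch.randn(8, dim)   # batch of initial hidden states
    print(model(h0).shape)     # torch.Size([8, 128])

Because the solver state has dimension r rather than dim, every right-hand-side evaluation and solver step operates on a much smaller vector, which is where the inference-time savings of the compressed model are expected to come from.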

Publication year

2022

Type of data

Creators

Code Ocean - Publisher

Lassi Paunonen - Creator

Marja-Leena Linne - Creator

Mikko Lehtimäki - Creator

Project

Other information

Fields of science

Computer and information sciences

Language

English

Open access

Open

License

Other

Keywords

Neural Networks, Mathematics, FOS: Mathematics, Capsule, Model Order Reduction

Subject headings

Temporal coverage

Related to this research data