Structure Search for Normalizing Flows
Felix Gonsior¹, Sascha Mücke¹ and Katharina Morik¹
¹Technische Universität Dortmund, August-Schmidt-Straße 1, 44227 Dortmund
Abstract
Normalizing Flows are deep generative models that allow for feasible exact inference by means of an
invertible mapping between a simple prior and an unknown data distribution. Coupling Flows inject
the expressive power of neural networks into this framework by allowing conditional transformations,
where the conditioner can be any nonlinear function. Under the assumption of feature locality, e.g. in
images, the conditional structure has been limited to locality preserving structures. We are interested
in cases where the locality assumption does not hold and propose a novel structure search approach
based on an evolutionary optimization scheme to find conditional structures. Our method can improve
convergence on non-image datasets and lead to smaller models.
Keywords
Normalizing Flow, Generative Model, Affine Coupling, Deep Learning, Evolutionary Algorithms
1. Introduction
Probabilistic inference is a central tool for many applications, like outlier detection [1], image
processing [2, 3], gap filling [4] and natural language processing [5], to name a few.
Normalizing flows are a family of probabilistic models that can be used for both inference and
data generation on continuous data. Their basic idea is to construct an invertible transformation
between a tractable (e.g., Gaussian) prior probability distribution and the unknown data
distribution. The result is a model that can transform (“flow”) data between the spaces of the
prior and the data distribution. Deep normalizing flows may be composed of a series of simple
transformations. A valid probability distribution is maintained by tracking the cumulative
volume differentials throughout the flow. Data generation is performed by transforming samples
from the prior into the data space. The likelihood of points in data space can be inferred by
transforming them into the prior space. In an effort to exploit the high flexibility of deep neural
architectures in normalizing flows, coupling flows were introduced by Dinh et al. [6, 7]. A
coupling flow encapsulates an arbitrarily complex transformation in such a way that the result
is invertible.
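As a concrete illustration, an affine coupling layer in the spirit of Dinh et al. leaves one part of the input unchanged and transforms the rest conditioned on it, which makes inversion and the log-determinant trivial. The sketch below uses a toy linear conditioner and small dimensions purely for illustration; these choices are assumptions of this example, not the architecture or notation used in this paper.

```python
import numpy as np

# Minimal affine coupling layer sketch. The input x is split into
# (x1, x2); x1 passes through unchanged, x2 is transformed by an
# elementwise affine map whose scale s and shift t are computed from x1.
# Here the conditioner is a toy linear map; in practice it can be any
# nonlinear function (e.g., a neural network).

rng = np.random.default_rng(0)
d, d1 = 4, 2                                    # total dims, dims left unchanged
W_s = rng.normal(scale=0.1, size=(d1, d - d1))  # toy conditioner weights (scale)
W_t = rng.normal(scale=0.1, size=(d1, d - d1))  # toy conditioner weights (shift)

def forward(x):
    """Map x -> y; also return the log-determinant of the Jacobian."""
    x1, x2 = x[:d1], x[d1:]
    s, t = x1 @ W_s, x1 @ W_t     # conditioner: any function of x1
    y2 = x2 * np.exp(s) + t       # elementwise affine transform of x2
    return np.concatenate([x1, y2]), s.sum()

def inverse(y):
    """Exact inverse: recover x from y using the same conditioner."""
    y1, y2 = y[:d1], y[d1:]
    s, t = y1 @ W_s, y1 @ W_t     # y1 == x1, so s and t are recoverable
    x2 = (y2 - t) * np.exp(-s)
    return np.concatenate([y1, x2])

x = rng.normal(size=d)
y, logdet = forward(x)
assert np.allclose(inverse(y), x)  # invertibility holds by construction

# Change of variables under a standard normal prior:
# log p(x) = log N(f(x); 0, I) + log|det Jf(x)|
log_prior = -0.5 * (y @ y + d * np.log(2 * np.pi))
log_likelihood = log_prior + logdet
```

Because `x1` is copied through, the conditioner never needs to be inverted, which is what lets the conditioner be arbitrarily complex while the overall layer stays invertible with a cheap log-determinant (the sum of the scales).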
Across the literature, normalizing flows are most commonly used on image data, where features
exhibit strong locality properties: Pixels of an image correlate most strongly with neighboring
pixels. For this reason, coupling flows that are designed for e.g. image data often use locality
preserving masks to determine the conditioning structure. To the best of our knowledge,
LWDA’21: Lernen, Wissen, Daten, Analysen, September 01–03, 2021, Munich, Germany
felix.gonsior@tu-dortmund.de (F. Gonsior); sascha.muecke@tu-dortmund.de (S. Mücke);
katharina.morik@tu-dortmund.de (K. Morik)
www.tu-dortmund.de (F. Gonsior); www.tu-dortmund.de (S. Mücke); www.tu-dortmund.de (K. Morik)
© 2021 Copyright for this paper by its authors. Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0).
ISSN 1613-0073 CEUR Workshop Proceedings (CEUR-WS.org)