Hi everyone,

You are all cordially invited to the AMLab Seminar on **Thursday, 29th October at 2:00 p.m. CET** on **Zoom**, where **Andy Keller** will give a talk titled "**Self Normalizing Flows**". (*Note that this talk is scheduled two hours earlier than previous ones — please update your calendar accordingly.*)

**Title:** Self Normalizing Flows

**Abstract:** Efficient gradient computation of the Jacobian determinant term is a core problem in the normalizing flow framework. Thus, most proposed flow models either restrict themselves to a function class with an easily evaluated Jacobian determinant, or rely on an efficient estimator thereof. However, these restrictions limit the performance of such density models, frequently requiring significant depth to reach desired performance levels. In this work, we propose Self Normalizing Flows, a flexible framework for training normalizing flows by replacing expensive terms in the gradient with learned approximate inverses at each layer. This reduces the computational complexity of each layer's exact update from O(D^3) to O(D^2), allowing for the training of flow architectures which were otherwise computationally infeasible, while also providing efficient sampling. We show experimentally that such models are remarkably stable and optimize to similar data likelihood values as their exact gradient counterparts, while surpassing the performance of their functionally constrained counterparts.
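To make the O(D^3) → O(D^2) claim concrete, here is a minimal NumPy sketch of the core idea for a single linear flow layer z = Wx. This is an illustrative toy, not the authors' implementation: the exact gradient of log|det W| with respect to W is (W^{-1})^T, which requires an O(D^3) inversion, whereas the self-normalizing approach substitutes the transpose of a learned approximate inverse R, trained with a reconstruction loss. Here R is simply simulated as a slightly perturbed true inverse.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 4

# A linear flow layer z = W x. The exact gradient of log|det W| w.r.t. W
# is (W^{-1})^T, costing O(D^3). Self Normalizing Flows instead maintain a
# learned approximate inverse R and substitute R^T for (W^{-1})^T.
W = np.eye(D) + 0.1 * rng.standard_normal((D, D))

# Stand-in for a *learned* inverse: the true inverse plus small noise.
# In the actual framework, R would be a trained parameter.
R = np.linalg.inv(W) + 0.01 * rng.standard_normal((D, D))

exact_grad = np.linalg.inv(W).T   # O(D^3): requires a matrix inversion
approx_grad = R.T                 # O(D^2): just read off the stored R

# The reconstruction loss that keeps R close to W^{-1} during training:
x = rng.standard_normal(D)
recon_loss = np.sum((x - R @ (W @ x)) ** 2)

print("max gradient error:", np.abs(exact_grad - approx_grad).max())
print("reconstruction loss:", recon_loss)
```

The better R reconstructs x from z, the closer R^T is to the exact gradient term, which is why the paper reports likelihoods comparable to exact-gradient training.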

To gain deeper insight into this recently developed normalizing flow framework, feel free to join and discuss it 🙂!