You are all cordially invited to the AMLab seminar on Tuesday October 3 at 16:00 in C3.163, where Christos Louizos will give a talk titled “Bayesian Uncertainty and Compression for Deep Learning”. Afterwards there are the usual drinks and snacks!
Deep learning has shown considerable success in a wide range of domains due to its rich parametric form and natural scalability to big datasets. Nevertheless, it has limitations that prevent its adoption in certain problems. Recent works have shown that deep neural networks suffer from over-parametrization, as they can be significantly pruned without any loss in performance. This indicates a great deal of wasteful computation and resources, and avoiding it can lead to large speedups. Furthermore, current neural networks produce unreliable uncertainty estimates, which prevents their use in domains that involve critical decision making and safety.
In this talk we will show how these two relatively distinct problems can be addressed within a common framework based on Bayesian inference. In particular, we will show that by adopting a more elaborate version of Gaussian dropout we can obtain deep learning models with robust uncertainty estimates across a variety of tasks and architectures, while simultaneously producing compressed networks in which most of the parameters and computation have been removed.
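As a rough illustration of the idea (a sketch, not the exact method presented in the talk), multiplicative Gaussian dropout with per-weight noise variances can serve both goals: Monte Carlo sampling of the noise gives a predictive uncertainty estimate, and weights whose signal-to-noise ratio is low can be pruned away. All specifics below (the toy weights, the `alpha` values, the pruning threshold) are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "trained" weight matrix with per-weight noise variances alpha.
# Under multiplicative Gaussian dropout, each weight w is used as
# w * eps with eps ~ N(1, alpha); a large alpha means the weight is
# mostly noise and contributes little signal.
W = rng.normal(size=(4, 3))
alpha = rng.uniform(0.01, 5.0, size=W.shape)  # hypothetical learned variances

def noisy_forward(x, W, alpha, n_samples=100):
    """Monte Carlo forward passes under multiplicative Gaussian noise."""
    outs = []
    for _ in range(n_samples):
        eps = rng.normal(1.0, np.sqrt(alpha))
        outs.append(x @ (W * eps))
    outs = np.stack(outs)
    # Mean prediction and its spread (a crude uncertainty estimate).
    return outs.mean(axis=0), outs.std(axis=0)

def prune(W, alpha, threshold=1.0):
    """Zero out weights whose signal-to-noise ratio |w| / sqrt(alpha) is low."""
    snr = np.abs(W) / np.sqrt(alpha)
    return np.where(snr > threshold, W, 0.0)

x = rng.normal(size=(1, 4))
mean, std = noisy_forward(x, W, alpha)
W_sparse = prune(W, alpha)
sparsity = (W_sparse == 0).mean()  # fraction of weights removed
```

The same learned noise variances thus drive both the uncertainty estimate and the compression, which is the connection the talk's common Bayesian framework exploits.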