Monthly Archives: November 2017

Talk by Thomas Kipf

You are all cordially invited to the AMLab seminar on Tuesday November 14 at 16:00 in C3.163, where Thomas Kipf will give a talk titled “End-to-end learning on graphs with graph convolutional networks”. Afterwards there are the usual drinks and snacks!

Abstract: Neural networks on graphs have gained renewed interest in the machine learning community. Recent results have shown that end-to-end trainable neural network models that operate directly on graphs can challenge well-established classical approaches, such as kernel-based methods or methods that rely on graph embeddings (e.g. DeepWalk). In this talk, I will motivate such an approach from an analogy to traditional convolutional neural networks and introduce our recent variant of graph convolutional networks (GCNs) that achieves promising results on a number of semi-supervised node classification tasks. If time permits, I will further introduce two extensions of this basic framework, namely: graph auto-encoders and relational GCNs. While graph auto-encoders provide a novel way of approaching problems like link prediction or recommendation, relational GCNs allow for efficient modeling of directed relational graphs, such as knowledge bases.
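For readers unfamiliar with GCNs, below is a minimal sketch of the layer-wise propagation rule from Kipf & Welling's GCN paper, which the talk builds on. The toy graph, feature dimensions, and random weights are illustrative assumptions and not material from the talk itself.

```python
import numpy as np

def gcn_layer(A, H, W):
    """One GCN layer: H' = ReLU(D^{-1/2} (A + I) D^{-1/2} H W)."""
    A_hat = A + np.eye(A.shape[0])                    # add self-connections
    d_inv_sqrt = np.diag(A_hat.sum(axis=1) ** -0.5)   # inverse sqrt of degree matrix
    A_norm = d_inv_sqrt @ A_hat @ d_inv_sqrt          # symmetric normalization
    return np.maximum(0, A_norm @ H @ W)              # aggregate, transform, ReLU

# Hypothetical toy example: a 4-node path graph, 3 input features, 2 output features
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
rng = np.random.default_rng(0)
H = rng.normal(size=(4, 3))   # node feature matrix
W = rng.normal(size=(3, 2))   # layer weight matrix
print(gcn_layer(A, H, W).shape)  # (4, 2)
```

Stacking two such layers and training with a cross-entropy loss on the labeled nodes yields the semi-supervised node classification setup mentioned in the abstract.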

Talk by Matthias Reisser

You are all cordially invited to the AMLab seminar on Tuesday November 7 at 16:00 in C3.163, where Matthias Reisser will give a talk titled “Failure Modes of Distributed Variational Inference”. Afterwards there are the usual drinks and snacks!

Abstract: In this talk I want to give a summary of the thoughts and experiments from the last couple of weeks of trying to develop a distributed variational inference algorithm. Although we see theoretical advantages to the proposed model, and cannot immediately see theoretical reasons why it should not work, the experiments demonstrate that learning with the proposed algorithm is unstable and fails catastrophically in the settings we tested. I would like to share our intuitions and would be glad to discuss and collect your ideas.