January 15, 2020 – Sky Lounge, OMP1
3 pm: PhD Colloquium – Axel Böhm:
How machines learn
or minimizing training loss
We will analyze how deep deep learning really is, from the point of view of optimization. I will give a gentle introduction to neural networks, linear regression, stochastic gradient descent (SGD) and its variants – most notably Adam – backpropagation, and other mysterious objects from Machine Learning. After this we’ll make our way to Generative Adversarial Networks (GANs) – not only because they have been called the most interesting idea in Machine Learning in the last decade, but also because they are intriguing from the standpoint of optimization.
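For readers unfamiliar with the loss-minimization view of learning, here is a minimal, illustrative sketch of SGD on a one-dimensional linear regression problem. The data, learning rate, and function names are illustrative choices for this sketch, not material from the talk itself.

```python
# Minimal sketch of stochastic gradient descent (SGD) for linear regression:
# repeatedly pick one sample and take a gradient step on its squared error.
import random

def sgd_step(w, b, x, y, lr=0.1):
    """One SGD update on the single-sample squared loss (w*x + b - y)**2."""
    err = w * x + b - y
    grad_w = 2 * err * x   # derivative of the squared error w.r.t. w
    grad_b = 2 * err       # derivative of the squared error w.r.t. b
    return w - lr * grad_w, b - lr * grad_b

# Recover y = 2x + 1 from noise-free samples via single-sample updates.
data = [(x, 2 * x + 1) for x in [0.0, 0.5, 1.0, 1.5, 2.0]]
w, b = 0.0, 0.0
random.seed(0)
for _ in range(2000):
    x, y = random.choice(data)
    w, b = sgd_step(w, b, x, y)
```

After enough iterations, `(w, b)` approaches the generating parameters `(2, 1)`; Adam differs from this plain update by adaptively rescaling each step using running averages of past gradients.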
Further details can be found here.
December 4, 2019 – Sky Lounge, OMP1
3 pm: PhD Colloquium – Philipp Kniefacz: Sharp Sobolev Inequalities via Projection Averages
October 23, 2019 – Sky Lounge, OMP1
2 pm: VSM Info Day
3 pm: PhD Colloquium – Alexander Pichler: An introduction to virtual element methods for the Helmholtz problem