March 4, 2020 – HS11 (Hauptgebäude TU Wien)
1 pm: PhD Colloquium – Lukas Fertl: An introduction to sufficient dimension reduction, conditional variance estimator
Dimension reduction, especially sufficient dimension reduction (SDR), has gained importance in recent years. Modern SDR started around 1990 with the introduction of sliced inverse regression (SIR), and in this talk an informal introduction to and overview of the topic will be given. Furthermore, my research on conditional variance estimation (CVE) will be presented. CVE is a novel SDR method for regressions satisfying E(Y|X) = E(Y|B'X), where B'X is a lower-dimensional projection of the predictors. Like its main competitor, mean average variance estimation (MAVE), CVE is not based on inverse regression and does not require the restrictive linearity and constant variance conditions of moment-based SDR methods. CVE is data-driven and applies to additive-error regressions with continuous predictors and link function. The effectiveness and accuracy of CVE compared to MAVE and other SDR techniques are demonstrated in simulation studies.
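The model assumption E(Y|X) = E(Y|B'X) from the abstract can be illustrated with synthetic data: the response depends on the high-dimensional predictors only through a lower-dimensional projection. The sketch below (a hypothetical example, not the CVE algorithm itself; the direction B and link function are made up for illustration) generates such a regression:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 500, 10

# Hypothetical true reduction direction B: Y depends on X only
# through the first coordinate, so B'X is one-dimensional.
B = np.zeros((p, 1))
B[0, 0] = 1.0

X = rng.standard_normal((n, p))          # continuous predictors
link = lambda t: t ** 2                   # continuous link function
# Additive-error regression: Y = g(B'X) + error
Y = link(X @ B).ravel() + 0.1 * rng.standard_normal(n)

# The projected predictors B'X carry all regression information,
# reducing the problem from p = 10 dimensions to 1.
proj = X @ B
print(X.shape, proj.shape)
```

An SDR method such as CVE or MAVE would estimate B from the data (X, Y) alone, without knowing the link function.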
January 15, 2020 – Sky Lounge, OMP 1
3 pm: PhD Colloquium – Axel Böhm: How machines learn or minimizing training loss
December 4, 2019 – Sky Lounge, OMP 1
3 pm: PhD Colloquium – Philipp Kniefacz: Sharp Sobolev Inequalities via Projection Averages
October 23, 2019 – Sky Lounge, OMP 1
2 pm: VSM Info Day
3 pm: PhD Colloquium – Alexander Pichler: An introduction to virtual element methods for the Helmholtz problem