9.15 Benny Avelin: Generalisation estimates of deep neural networks

Abstract: Neural networks are a specific class of regression functions arising in statistical learning that has gained widespread attention in real-world applications, for instance in image analysis and control problems. Little is known, however, about this class of regression functions, and classical VC theory fails to explain why they still generalize in the over-parametrized regime. In this talk we will cover known generalization estimates and how we attempt to improve them by considering the deep limit of neural networks, as well as some recent results in this direction.


10.15 Hiba Nassa: Empirically driven orthonormal bases for functional data analysis

Abstract: There is a strong relation between high-dimensional data and functional data. One can convert densely observed high-dimensional data to functional data by defining a set of basis functions and then setting up coefficients that express the functional data as a linear combination of these bases. Typically, the choice of basis is not data-driven, with the notable exception of the number of dimensions, which can often be determined by cross-validation. As a consequence, several standard bases, such as the Fourier and related bases, wavelets, splines, etc., are typically used to transform observed functional data. Through such a prior and rather arbitrary decision on basis selection, the problem is transformed to a finite-dimensional space of basis coefficients and formally loses its infinite-dimensional character.
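As a concrete illustration of the basis-expansion step described above (a generic sketch, not the speaker's method), densely observed data can be projected onto a fixed Fourier basis by least squares; the resulting coefficient vector is the finite-dimensional representation of the curve:

```python
import numpy as np

t = np.linspace(0.0, 1.0, 500)
rng = np.random.default_rng(1)
# One densely observed "high-dimensional" sample: a smooth signal plus noise
y = (np.cos(2 * np.pi * t) + 0.3 * np.sin(6 * np.pi * t)
     + 0.1 * rng.standard_normal(t.size))

def fourier_basis(t, n_pairs):
    """Design matrix: constant column plus n_pairs cosine/sine pairs on [0, 1]."""
    cols = [np.ones_like(t)]
    for j in range(1, n_pairs + 1):
        cols.append(np.sqrt(2) * np.cos(2 * np.pi * j * t))
        cols.append(np.sqrt(2) * np.sin(2 * np.pi * j * t))
    return np.column_stack(cols)

B = fourier_basis(t, n_pairs=4)
coef, *_ = np.linalg.lstsq(B, y, rcond=None)  # basis coefficients
y_hat = B @ coef                              # functional representation
print(coef.size, np.mean((y_hat - y) ** 2))
```

Here the 500 raw observations are replaced by 9 coefficients; the number of basis pairs plays the role of the dimension parameter that the abstract notes is usually chosen by cross-validation.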

We propose a strictly data-driven method of basis selection. Since the method is algorithmic and searches the data for an effective representation of the basis by minimizing the overall mean squared error across functional samples, the functional basis is strictly tied to the functional character of the data and loses the arbitrariness of common approaches. The method itself uses B-splines and O-splines in a machine-learning style of functional data mining to find efficiently placed knots. Due to the machine-learning character of the data processing, the method has the potential to be further improved numerically and extended beyond the considered scope.
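A minimal sketch of the knot-placement idea (a generic illustration using SciPy's least-squares B-spline fit, not the speaker's algorithm): candidate sets of interior knots are scored by the mean squared reconstruction error across a simulated functional sample, so that knot placement becomes a data-driven optimization rather than a fixed prior choice:

```python
import numpy as np
from scipy.interpolate import make_lsq_spline

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 200)
# Simulated functional sample: noisy curves sharing a mean with a sharp bump
curves = [np.sin(2 * np.pi * t) + 0.5 * np.exp(-100 * (t - 0.7) ** 2)
          + 0.05 * rng.standard_normal(t.size) for _ in range(20)]

def mean_mse(interior_knots, degree=3):
    """Mean squared reconstruction error across all curves for one knot set."""
    # make_lsq_spline expects the full clamped knot vector
    kv = np.concatenate(([t[0]] * (degree + 1), interior_knots,
                         [t[-1]] * (degree + 1)))
    errs = []
    for y in curves:
        spl = make_lsq_spline(t, y, kv, k=degree)
        errs.append(np.mean((spl(t) - y) ** 2))
    return float(np.mean(errs))

# Compare a uniform placement with one that spends knots near the bump
uniform = np.linspace(0.1, 0.9, 5)
adapted = np.array([0.25, 0.6, 0.66, 0.72, 0.8])
print(mean_mse(uniform), mean_mse(adapted))
```

Both knot sets have the same number of basis functions, so the comparison isolates knot placement; a search over such placements (greedy or otherwise) is one simple way to make the basis itself data-driven.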


11.15 Liam Solus: Discrete Geometry in Model Discovery

Abstract: In today's world, where data is an abundant resource, there is a strong interest in data-driven algorithms for model discovery that are both efficient and reliable. Of particular interest are such algorithms for learning probabilistic, or even causal, DAG models. Historically, the combinatorics of graphs has played a central role in the development of DAG model learning algorithms. However, ideas from contemporary geometric and algebraic combinatorics were previously uninvolved. In this talk, we will discuss a new method for DAG model discovery. This method arises as a simplex-type algorithm over convex polytopes known as generalized permutohedra, which are central to the field of algebraic and geometric combinatorics. We will see that, when compared with the state-of-the-art, these methods perform competitively, are provably more reliable, and even shed some restrictive parametric assumptions.

Seminarier i statistik. Location: Ångströmlaboratoriet, room 4004. Time: 3 June 2019, 09:15–12:00 (Europe/Stockholm).