Lunchtime Seminar Series
A series of talks presented by staff from the Department of Mathematics and Statistics, on topics of broad general interest. The talks will be aimed at the lower undergraduate level and should be accessible to anyone who has experience with first-year mathematics and statistics and an interest in seeing the wide range of possibilities the study of mathematics and statistics affords.
Contact: Justin Tzou (email@example.com)
Tuesday 8 October 2019:
Hugh Entwistle (Macquarie University)
Title: Mathematical staircases and where they lead us
Abstract: With the rise of computers and numerical approximations, it seems that our old friends, the fractions, have become neglected creatures. Set to the right music, with infinity as soloist, fractions can delight and excite mathematicians and enthusiasts alike. The goal of this talk is to lead you through a journey of how one might begin experimenting with numbers and recursion to arrive at some pleasing and beautiful expressions of your own. We will first discuss the epidemic of false 1 = 2 'proofs', before adding to the fire with one of our own, in which an infinite fraction takes centre stage. We will then play around with creating these staircases of numbers to represent our favourite constants in mathematics, and finally, for a fraction of the talk, appreciate a beautiful connection to circles.
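The "staircases" of the title are continued fractions. As a hedged illustration (a sketch of the general idea, not material from the talk), truncating the continued fraction [1; 2, 2, 2, …] at ever greater depth produces increasingly good rational approximations to √2:

```python
from fractions import Fraction
import math

def sqrt2_convergent(depth):
    """Evaluate the continued fraction 1 + 1/(2 + 1/(2 + ...)),
    truncated at the given depth, working from the bottom up."""
    x = Fraction(2)
    for _ in range(depth - 1):
        x = 2 + 1 / x
    return 1 + 1 / x

# Each extra "stair" shrinks the error dramatically.
for d in (1, 2, 3, 5, 10):
    c = sqrt2_convergent(d)
    print(d, c, float(c) - math.sqrt(2))
```

The first few convergents are 3/2, 7/5, 17/12, 41/29, …, each the best rational approximation to √2 for its size of denominator.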
Session 2, 2019
Tuesday 1 September: Dr Lyndon Koens, "Everyday exotic emulsions"
Emulsions, or more broadly colloids, are mixtures of solids, liquids, and gases. These mixtures can form new states of matter and display unique behaviour. These properties make colloids of great interest to science and industry. In this talk, I will discuss the strange behaviour of some everyday colloids.
Tuesday 3 September: A/Prof Steve Lack, "16th century Italian martial arts"
Come along to hear about some Italian streetfighters, and the battles they waged using equations. We’ll also see a little of how their craft has evolved since then.
Tuesday 6 August: Dr Paul Bryan, "Counting the ways we count"
Humans invented counting so that we could tell who had the most of something (e.g. land, sheep, votes, money, …). Fractions follow fairly naturally when we need to share a cake, but much to the dismay of the Pythagorean cult, simple geometric quantities (e.g. the diagonal of a square) cannot always be described in terms of fractions, so something was missing. Then we realised that we also needed to know who didn't have something (e.g. enough grain for the winter), so we invented negative numbers. Eventually we found zero to be a very useful concept for describing a state of equilibrium (just the right amount of grain for the winter). Then things started to get weird. To find the roots of a cubic equation, we discovered that we needed square roots of negative numbers. To solve the equations of electromagnetism (now governing our lives through electricity supply, smart phones, etc.), much to everyone's astonishment, we needed three different square roots of minus one! And then there is infinity. One infinity is simply not enough, so now we have infinitely many infinities upon infinitely many infinities upon…
Session 1, 2019
Tuesday 21 May: Dr Hassan Doosti, "Introduction to Quantile Regression"
In this talk, we will briefly introduce quantile regression and some of its applications. Whereas ordinary regression models provide a grand summary of the averages of the distributions corresponding to the set of predictors, quantile regression aims at estimating the conditional quantiles of the dependent variable. Robustness against outliers in the regressand, higher efficiency for a wide range of error distributions, and freedom from distributional assumptions are the main advantages of quantile regression. In the following figure, ordinary mean regression (left) does not show significant changes. However, some quantile regressions (right) show clear patterns, particularly for larger quantiles.
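As a hedged aside (my own sketch, not part of the talk): the q-th quantile can be characterised as the minimiser of the "pinball" (check) loss, just as the mean minimises squared error. A small pure-Python example with made-up data shows that minimising this loss at q = 0.5 recovers the median, while q = 0.9 tracks the upper tail:

```python
def pinball_loss(q, prediction, data):
    """Average pinball (check) loss of a constant prediction at quantile q."""
    total = 0.0
    for y in data:
        err = y - prediction
        total += q * err if err >= 0 else (q - 1) * err
    return total / len(data)

def best_constant(q, data):
    """Brute force over the data points: the one with smallest pinball
    loss approximates the q-th sample quantile."""
    return min(data, key=lambda c: pinball_loss(q, c, data))

data = [1, 2, 3, 4, 100]          # the outlier barely moves the median
print(best_constant(0.5, data))   # prints 3, the median
print(best_constant(0.9, data))   # prints 100, the upper tail
```

This robustness to the outlier at 100 is exactly the advantage over mean regression mentioned above.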
Tuesday 7 May: Dr Richard Garner, "Deep Learning"
From beating 9th dan go players, to synthesising hellish dreamscapes, to recommending conspiracy theories on YouTube, is there nothing that modern artificial intelligence can't do? "Deep learning" is the hip modern nomenclature for what used to be called "neural networks"; and while many of the things deep learning does are indeed indistinguishable from magic, its mathematical underpinnings are laughably simple. The goal of this talk is to explain them to you.
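To give a flavour of those simple underpinnings (a minimal sketch of my own, not taken from the talk): at its core, deep learning minimises a loss function by gradient descent. Fitting a single linear "neuron" y = w·x + b to data shows the whole training loop in a few lines:

```python
# Gradient descent on mean-squared error for a one-neuron model y = w*x + b.
# The data follow y = 2x + 1 exactly, so training should recover w ≈ 2, b ≈ 1.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [2 * x + 1 for x in xs]

w, b = 0.0, 0.0          # initial guesses
lr = 0.02                # learning rate (step size)
for _ in range(5000):
    # gradients of the mean-squared error with respect to w and b
    gw = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / len(xs)
    gb = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / len(xs)
    w -= lr * gw         # step downhill
    b -= lr * gb

print(round(w, 3), round(b, 3))  # close to 2.0 and 1.0
```

A deep network is the same loop with a more elaborate model and gradients computed by the chain rule (backpropagation), but the idea is no deeper than this.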
Tuesday 2 April: Hugh Entwistle, "Convergence in the Central Limit Theorem"
The Central Limit Theorem is a famous and beautiful theorem in statistics; however, there are many aspects of it that yearn to be appreciated. Does the theorem work for all sums of random variables? Do these variables need to come from the same distribution? How many variables do we need for the theorem to work? I will first introduce, along with some initial concepts in probability, what the Central Limit Theorem actually says, as well as the role that complex-valued functions play in statistics. Finally, I will lay the matter of convergence partially to rest by introducing the Berry-Esseen inequality. The upper bound in this inequality is then studied in more detail before I provide some applications.
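As a hedged illustration of the convergence question (mine, not the speaker's): even for sums of a modest number of fair dice, the standardised sum is already close to the normal distribution, and the gap shrinks as more dice are added. This is the kind of distance that the Berry-Esseen inequality bounds:

```python
import math

def dice_sum_pmf(n):
    """Exact distribution of the sum of n fair six-sided dice, by convolution."""
    pmf = {0: 1.0}
    for _ in range(n):
        new = {}
        for s, p in pmf.items():
            for face in range(1, 7):
                new[s + face] = new.get(s + face, 0.0) + p / 6
        pmf = new
    return pmf

def max_cdf_gap(n):
    """Largest gap between the CDF of the standardised dice sum and N(0, 1)."""
    mean = 3.5 * n
    sd = math.sqrt(n * 35 / 12)        # the variance of one die is 35/12
    pmf = dice_sum_pmf(n)
    cdf, gap = 0.0, 0.0
    for s in sorted(pmf):
        cdf += pmf[s]
        z = (s - mean) / sd
        normal_cdf = 0.5 * (1 + math.erf(z / math.sqrt(2)))
        gap = max(gap, abs(cdf - normal_cdf))
    return gap

for n in (1, 2, 10):
    print(n, round(max_cdf_gap(n), 4))
```

The printed gaps decrease with n, in line with the Berry-Esseen rate of order 1/√n for identically distributed summands.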
For past seminars, please go to our archive.