Students of SAMBa will be at the forefront of a future generation of statistical applied mathematicians, with careers in both universities and industry.
Our research interests are broad and multidisciplinary. Researchers span a continuum from modelling data with leading statistical methods, to investigating the fundamental movements of particles, to applying mathematics to real-world statistical phenomena.
Training in SAMBa provides students with exceptional skills in formulating statistical applied mathematics problems and with the tools to solve them. Students will have the confidence to talk to people from a wide range of backgrounds and to bring new perspectives to challenges faced across industry and academia.
In order to address the modern challenges of analysing huge data sets and mapping them to real-time predictions, we believe it is essential that future generations of researchers are trained across the continuum of statistical applied mathematics, with confidence in computation, stochastics and a wide range of cross-disciplinary approaches.
There is also a need to work closely with industry and researchers from other disciplines in order to ensure that the benefits gained from this approach are widely shared and implemented.
The range of applications, and their subsequent socio-economic impact, is very broad: insurance risk, medical genetics, energy management, communication networks, pharmaceutical development, safety management of physical systems, ecological and population monitoring, and retail analytics, to name but a few.
Student PhD projects
Interacting particle models and the geometry of their macroscopic description, Marcus Kaiser
Marcus is studying interacting particle processes and their geometric description at the level of non-linear partial differential equations, such as drift-diffusive systems. He is looking at processes that can serve as prototypes for non-equilibrium behaviour, having underlying descriptions as irreversible Markov chains. The link between the microscopic description of the interacting particle processes and a macroscopic model will be established via hydrodynamic limits. A better understanding of the geometric behaviour of macroscopic models would give insight into the way processes converge to equilibrium; in particular, it could yield a better sense of the rate at which they converge. In contrast to the equilibrium (reversible) case, it is not clear whether, and how, concepts such as entropy and free energy can describe such systems.
This is an emerging field at the interface between mathematics and physics, and progress in this direction could have implications for a wide range of non-equilibrium thermodynamic systems, including active matter. The techniques developed and results obtained could also lead to novel methods that improve the rate of convergence of sampling algorithms such as Markov chain Monte Carlo (MCMC), e.g. by introducing optimal non-gradient terms.
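For context, the reversible baseline that such non-gradient terms aim to improve upon is the classical random-walk Metropolis sampler. The following is a minimal, self-contained sketch (the target density and parameter values are illustrative, not taken from the project):

```python
import math
import random

def metropolis_hastings(log_density, x0, n_steps, step=0.5, seed=0):
    """Minimal random-walk Metropolis sampler for a 1D target.

    This is the reversible baseline; irreversible samplers add
    non-gradient drift terms to accelerate convergence to equilibrium.
    """
    rng = random.Random(seed)
    x = x0
    samples = []
    for _ in range(n_steps):
        proposal = x + rng.gauss(0.0, step)
        # Accept with probability min(1, pi(proposal) / pi(x))
        if math.log(rng.random()) < log_density(proposal) - log_density(x):
            x = proposal
        samples.append(x)
    return samples

# Standard normal target: log pi(x) = -x^2 / 2 (up to a constant)
samples = metropolis_hastings(lambda x: -0.5 * x * x, x0=0.0, n_steps=20000)
mean = sum(samples) / len(samples)
```

The chain's empirical mean and variance approach those of the target; the rate of that convergence is exactly what irreversible modifications try to improve.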
Modelling air pollution using data assimilation, Matt Thomas
In order to assess the burden of disease attributable to air pollution, accurate estimates of exposure are required globally. There is a need for comprehensive integration of information from remote sensing, atmospheric models and surface monitoring to facilitate estimation of concentrations in areas throughout the world. Data assimilation is a method of combining model forecast data with observational data in order to understand the state of a system more accurately. Methods vary greatly in complexity, and Matt will explore different methods from both a statistical and a numerical analysis standpoint. Desirable features of a suitable method include flexibility, modularity, the ability to incorporate multiple levels of uncertainty, and techniques that allow the relationships between surface monitoring, remote sensing and atmospheric models to vary spatially and allow information to be 'borrowed' where monitoring data may be sparse. Throughout the project, the efficacy of different methods in this setting will be examined by applying them to data from the Global Burden of Disease project. Of particular interest is their scalability with regard to high-dimensional data.
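The core idea of combining a model forecast with an observation can be illustrated by the simplest data assimilation scheme, a scalar Kalman filter update, which weights each source by its uncertainty (the function and the numbers below are illustrative, not the project's code or data):

```python
def kalman_update(forecast, forecast_var, obs, obs_var):
    """Combine a model forecast and an observation of the same quantity.

    The Kalman gain weights the correction by the relative
    uncertainty of the forecast and the measurement.
    """
    gain = forecast_var / (forecast_var + obs_var)
    analysis = forecast + gain * (obs - forecast)
    analysis_var = (1.0 - gain) * forecast_var
    return analysis, analysis_var

# An atmospheric model predicts 40 ug/m^3 (variance 9);
# a surface monitor reads 50 ug/m^3 (variance 1).
analysis, analysis_var = kalman_update(40.0, 9.0, 50.0, 1.0)
# The analysis (49.0) sits close to the more trusted observation,
# and its variance (0.9) is smaller than either input's.
```

Real assimilation systems apply this logic to high-dimensional state vectors, which is where the scalability questions mentioned above arise.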
Faraday wave-droplet dynamics: a hydrodynamic quantum analogue, Matt Durey
Supervisor: Paul Milewski
It has been observed on a microscopic scale that when a small fluid droplet is dropped onto a vertically vibrating fluid surface, it will 'walk' across the surface of the bath. The behaviour of the droplet-Faraday pilot wave pair is reminiscent of quantum physics: there is a particle-wave duality in which the fluid droplet can undergo processes similar to those of a particle in the quantum world. On an unbounded domain, pairs of droplets can interact, deflecting or capturing each other, depending on various parameters. The quantum single-particle double-slit experiment can be reproduced for fluid droplets, with the interactions between the wave field and the slits producing a diffraction probability distribution for droplet positions. This phenomenon is the basis for two lines of research that Matt will explore: (i) the fluid dynamics of droplet-Faraday pilot wave reflection at planar boundaries; (ii) the long-time stationary behaviour of models for droplet-Faraday pilot wave dynamics in confined domains.
Compensated fragmentation processes, Dorka Fekete
Supervisor: Andreas Kyprianou
Compensated fragmentation processes are a new family of stochastic processes that model the pulsating creation and fragmentation of mass occurring at infinite rates. With just one existing paper in the literature, there are many possible directions and questions that can be asked of this type of process. Dorka is interested in a number of growth and annihilation phenomena, fractal and scaling properties, as well as their analysis and potential analysis as infinitely divisible processes.
Uncertainty Quantification for neutron transport problems, Matt Parkinson
Working in collaboration with Amec Foster Wheeler, Matt's PhD will develop the computation of uncertainty in the flux and fundamental eigenvalue of a simplified 1D monoenergetic neutron transport problem, with cross sections modelled by lognormal random fields sampled via Karhunen-Loève (KL) expansions and the Monte Carlo (MC) method. The work will start with situations where the transport equation can be solved analytically, and will go on to consider numerical solutions by discrete ordinates and then by analogue MC simulation. He will analyse how the MC error and KL truncation affect the results, carry out the associated numerical experiments, and apply multilevel Monte Carlo (MLMC) methods to the problem, while assessing the possibility of applying multilevel techniques to the analogue MC solver for the simplified neutron transport problem.
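The multilevel Monte Carlo idea is generic: write the fine-level expectation as a telescoping sum, so that most samples are taken on cheap coarse levels and only a few on expensive fine ones. A minimal sketch, with a toy integrand standing in for the transport solver (everything below is illustrative):

```python
import random

def mlmc_estimate(sample_level, n_per_level, seed=0):
    """Generic multilevel Monte Carlo estimator.

    Uses the telescoping identity
        E[P_L] = E[P_0] + sum_{l=1}^{L} E[P_l - P_{l-1}],
    estimating each correction term independently, with the same
    random input u coupling the fine and coarse levels.
    """
    rng = random.Random(seed)
    total = 0.0
    for level, n in enumerate(n_per_level):
        acc = 0.0
        for _ in range(n):
            u = rng.random()  # shared randomness couples the two levels
            fine = sample_level(level, u)
            coarse = sample_level(level - 1, u) if level > 0 else 0.0
            acc += fine - coarse
        total += acc / n
    return total

# Toy problem: level l approximates u^2 with a discretisation bias 2^-l,
# so the finest level (L = 3) targets E[u^2] + 1/8 = 1/3 + 1/8.
def sample_level(level, u):
    return u * u + 2.0 ** (-level)

est = mlmc_estimate(sample_level, n_per_level=[4000, 2000, 1000, 500])
```

Because the correction terms have small variance, far fewer samples are needed on the fine levels than a single-level MC estimator would require for the same accuracy.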
Analysis of transition rates for the Dean-Kawasaki model, Federico Cornalba
Nucleation is a physical process important in fields as diverse as physics, chemistry and biology. Broadly speaking, it is the process by which a material forms new thermodynamic phases via self-assembly. The mathematical description of this process comprises several different relevant features. In his PhD, Federico will focus his research on aspects of the Dean-Kawasaki stochastic model, which arises from fluctuating hydrodynamics. He will primarily investigate the model's underlying mathematical geometry and its transition rates in the context of metastability, and will seek a description of the nucleation pathways.
Numerics and analysis of waves in random media, Owen Pembery
Wave propagation problems arise in applications such as seismic imaging, radar and ultrasound scanning. The Helmholtz equation is the simplest model of acoustic wave propagation, and approximating its solutions by the finite element method is a rich research area, as the matrices arising from its standard finite element discretisations are non-positive-definite. Designing fast solvers and providing robust analysis for the Helmholtz equation is therefore challenging. In recent years, research has been carried out on the Helmholtz equation in heterogeneous media, and the emerging next step is to study the Helmholtz equation in random media, where the speed of sound and other material properties are random fields; this is partly due to an increasing interest in Uncertainty Quantification for wave propagation problems.
Owen will be studying problems in wave propagation in random media, seeking to rigorously analyse properties of the solutions of the PDEs involved, and in doing so will answer problems relating to the approximation of solutions to these PDEs by numerical methods.
Higher-order DG methods for atmospheric modelling, Jack Betteridge
Supervisor: Eike Müller
One technique for solving partial differential equations numerically is the Discontinuous Galerkin (DG) method. This method has high spatial locality, which improves parallel scalability and takes greater advantage of modern (many-core) high-performance computing architectures. A hybrid multigrid approach has already been used successfully for elliptic PDEs arising from subsurface flow. Similar methods can also be applied to atmospheric modelling problems, for instance solving the Navier-Stokes equations in a thin spherical shell. Over the course of the project, Jack will look at the computational and algorithmic aspects of implementing a solver for these atmospheric models, and will investigate various preconditioners to speed up the solution.
Modelling and optimised control of macro-parasitic diseases, Beth Boulton
Supervisor: Jane White
Macro-parasites cause a variety of diseases throughout the world, including many neglected tropical diseases. The SIS models so often used to model the spread of bacterial or viral diseases do not capture some of the crucial ways in which macro-parasitic diseases differ. By considering a combination of ODE, probabilistic and hybrid models, Beth will formulate mathematical models that capture the dynamics of host-parasite relationships and macro-parasitic infections, and will then use these to investigate how best to optimise the treatment of macro-parasitic infections in both people and animals.
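For contrast, the classical SIS baseline that such projects move beyond can be written as a single ODE for the infected fraction, dI/dt = beta*I*(1-I) - gamma*I, which ignores parasite burden per host. A minimal forward-Euler integration (parameter values are illustrative only):

```python
def simulate_sis(beta, gamma, i0, t_end, dt=0.01):
    """Integrate the SIS model dI/dt = beta*I*(1-I) - gamma*I by forward Euler.

    For beta > gamma the infected fraction approaches the endemic
    equilibrium 1 - gamma/beta; macro-parasite models must instead
    track the distribution of parasite burden across hosts.
    """
    i = i0
    for _ in range(int(t_end / dt)):
        i += dt * (beta * i * (1.0 - i) - gamma * i)
    return i

# With beta = 0.5, gamma = 0.2 the endemic equilibrium is 1 - 0.4 = 0.6
i_final = simulate_sis(beta=0.5, gamma=0.2, i0=0.01, t_end=200.0)
```

The SIS model's single infected/susceptible dichotomy is precisely what fails for macro-parasites, where morbidity depends on how many parasites each host carries.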
Automatic diagnosis of psoriasis arthritis (xAPAD), Adwaye Rambojun
Patients with Psoriasis Arthritis are graded according to the extent of damage by scoring X-rays. Currently, this is a painstaking and time-consuming process that has to be performed manually. In collaboration with the Royal National Hospital for Rheumatic Diseases in Bath, Adwaye is working on automating this scoring process by exploring machine learning techniques from the computer vision community. He is working towards building a statistical model of a healthy hand that can be compared to a diseased hand, enabling the scoring process to be automated. This would allow scoring to be performed on a large scale, ultimately increasing our understanding of how the disease progresses within patients.
Condensation in reinforced branching processes, Anna Senkevich
Anna is studying the non-extensive condensation phenomenon in reinforced branching processes, considering a system of immortal particles grouped into families according to their fitness parameter. These families produce offspring at a rate proportional to their size and fitness. Each offspring inherits its family's fitness with a set probability, or is assigned a new fitness (drawn from some pre-set distribution) with the complementary probability. The main question is how the system behaves asymptotically, focusing in particular on the behaviour of the largest and fittest families and on the conditions under which a condensate forms.
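A discrete-time caricature of this dynamic can be simulated directly: at each step either a new family is founded with a fresh fitness, or an existing family reproduces with probability proportional to size times fitness. All names and parameter values here are illustrative, not the model's actual rates:

```python
import random

def simulate_families(n_steps, mutation_prob=0.05, seed=1):
    """Size-and-fitness-biased reproduction with occasional new families.

    Families whose fitness is close to the observed maximum tend to
    dominate; this size x fitness reinforcement is the mechanism
    behind condensation.
    """
    rng = random.Random(seed)
    sizes = [1.0]
    fitnesses = [rng.random()]  # fitnesses drawn uniformly for illustration
    for _ in range(n_steps):
        if rng.random() < mutation_prob:
            sizes.append(1.0)
            fitnesses.append(rng.random())  # newcomer founds a new family
        else:
            weights = [s * f for s, f in zip(sizes, fitnesses)]
            parent = rng.choices(range(len(sizes)), weights=weights)[0]
            sizes[parent] += 1.0  # offspring joins the parent family
    return sizes, fitnesses

sizes, fitnesses = simulate_families(5000)
largest = max(range(len(sizes)), key=lambda k: sizes[k])
```

Tracking the size of the largest family as the total population grows gives a numerical picture of the asymptotic questions described above.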
Sampling, model order reduction and multilevel techniques in Uncertainty Quantification problems, Gianluca Detommaso
Gianluca is working in the field of Uncertainty Quantification. This includes projects to find faster ways of sampling from posterior distributions in infinite-dimensional problems, exploiting abstract concepts from information geometry to accelerate the convergence of sampling algorithms without suffering from the curse of dimensionality. These geometrical ideas can be merged with multilevel sampling techniques to produce both a rigorous mathematical theory and fast code applicable to several problems in Uncertainty Quantification. He is also working on extending balanced truncation, introduced for deterministic control problems, to stochastic processes. Balanced truncation performs model order reduction by taking into account how easily a state of the process can be reached and how easily the corresponding state of some linear quantity of interest can be observed. Ideas for applying these techniques to Bayesian sampling are currently being considered. In addition, he is developing an extension of the Multilevel Monte Carlo (MLMC) method, termed Continuous Level Monte Carlo, which replaces the discrete levels of approximation in MLMC with a continuous sequence of levels. By approximating the resulting integral and adding correction terms, he obtains a wider class of methods with good properties, of which MLMC is a special case.
Seamless and overarching approaches for optimising over the phases of drug development, Robbie Peck
This project concerns the optimisation of the drug development process at a program level. This involves considering multiple phases of treatment refinement and dose selection together. While individual phases of drug development have been studied in depth, there has been relatively little work that looks at two or more phases jointly. Robbie's project uses numerical computations and simulations to model different designs. The computational challenges include: trial designs which use a form of gain function, or 'net present value', to optimise decision making across phases; seamless Phase II/III designs that may use data from Phase II in the final analysis, possibly through a combination test; and the realistic incorporation of beliefs about drug safety and tolerability into program-level decision making.
Modelling the surge phenomenon within turbomachinery, Kate Powers
Turbochargers are used in internal combustion engines to obtain greater power output from smaller engines and better fuel efficiency. They work by compressing air. To get the most out of a turbocharger, the air before and after the compressor needs a high pressure ratio at a relatively low mass flow. If the mass flow is too low, the air flow can reverse direction and cause surge. Surge is a difficult phenomenon to model because it exhibits chaotic behaviour. Kate is working jointly with the Mechanical Engineering department with the aim of finding a model that can (i) give a better prediction of the onset of surge and (ii) describe what happens to the air flow during surge. This will involve analysis of experimental data as well as a combination of theory from compressible fluid dynamics, rotating flows, dynamical systems and bifurcations.
Constrained optimal stopping problems, Ben Robinson
Supervisor: Alex Cox
Ben is studying optimal stopping problems with constraints on the expected stopping time. These problems have been considered in a few recent papers and are motivated in part by problems in mathematical finance. One approach to such a problem is to introduce an auxiliary process, which can be chosen under some constraints, in order to reformulate the problem as a stochastic optimal control problem. Ben is looking at the solutions to stochastic control problems of this type and exploring their relationship to Monge-Ampère type partial differential equations. Monge-Ampère type equations arise in several areas of mathematics, including the study of optimal transport. Ben is also investigating connections between stochastic optimal control problems and martingale optimal transport.
Attribution of large scale drivers for environmental change, Aoibheann Brady
Several large flood events have hit the UK in recent years, and there is growing concern among the public and policy makers as to whether the current level of protection of cities and infrastructure is appropriate. In particular, there is concern that climate change and its impacts might increase flood risk: climate change projections seem to indicate that flood risk might increase, but this is not fully validated by observed river flow data, in which there is no strong evidence of increasing trends. Further, because river flow records are short, the testing methods routinely used to assess whether change can be detected in observed data are typically not very powerful (in a statistical sense) and cannot fully differentiate between possible confounders. The project aims to develop methods to detect and attribute changes in flooding and other environmental variables, resulting in methods for the detection of spatially coherent trends in environmental data. It will also investigate methods to assess the main drivers of higher river flows and flooding at a regional or national scale.
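A standard example of the routine trend tests mentioned above is the non-parametric Mann-Kendall test, whose statistic counts concordant minus discordant pairs in a series. A minimal sketch (ignoring ties; the example data are invented):

```python
import itertools
import math

def mann_kendall_z(series):
    """Mann-Kendall trend statistic for a time series (no tie correction).

    S counts concordant minus discordant pairs; under the null of no
    trend, S is approximately normal with the variance below, and the
    continuity-corrected z-score can be compared to N(0, 1) quantiles.
    """
    s = 0
    for x_i, x_j in itertools.combinations(series, 2):
        s += (x_j > x_i) - (x_j < x_i)
    n = len(series)
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        return (s - 1) / math.sqrt(var_s)
    if s < 0:
        return (s + 1) / math.sqrt(var_s)
    return 0.0

# Eight annual peak flows with a visible upward drift give z of about 2.6;
# with records this short, weaker trends are easily missed.
z = mann_kendall_z([3.1, 3.3, 3.0, 3.6, 3.8, 3.7, 4.0, 4.2])
```

The short record length is exactly why such tests lack power: with only a handful of annual observations, even real trends produce unremarkable z-scores unless they are strong.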
Mixing times and general behaviour of random walks on changing environments, Andrea Lelli
Supervisor: Alexandre Stauffer
Random walks in random environments have become a classical model for random motion in random media, and this model has been the source of many mathematical investigations over the years. More recently, researchers have begun to look at random walks in an environment that changes at the same time as the particle moves. It is believed that when the environment is 'well behaved' (e.g. uniformly elliptic) and changes quickly enough, the random walk will behave in a way similar to a random walk on the underlying (non-changing) graph. This has been quantified, especially in the case of the d-dimensional infinite lattice, by the derivation of a law of large numbers and central limit theorems under conditions related to the mixing time of the environment. Andrea is interested in understanding the effect of a slowly changing environment on the behaviour of simple random walks, e.g. the impact of the environment on the recurrence/transience of the random walk, and the mixing time of a random walk on a finite, but changing, graph.
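A toy version of a walk on a changing environment is easy to simulate: here the graph is a cycle whose edges are independently resampled (open or closed) at every step, a deliberately crude caricature for illustration only:

```python
import random

def walk_on_changing_cycle(n_sites, p_open, n_steps, seed=0):
    """Random walk on a cycle whose edges are resampled every step.

    At each step the two edges incident to the walker are independently
    open with probability p_open; the walker moves uniformly along an
    open edge, staying put if both are closed. This mimics an
    environment that changes at the same time as the particle moves.
    """
    rng = random.Random(seed)
    x = 0
    for _ in range(n_steps):
        moves = []
        if rng.random() < p_open:
            moves.append(-1)
        if rng.random() < p_open:
            moves.append(1)
        if moves:
            x = (x + rng.choice(moves)) % n_sites
    return x

# After many steps, endpoints from independent runs spread over the cycle
positions = [walk_on_changing_cycle(10, p_open=0.7, n_steps=200, seed=s)
             for s in range(200)]
```

Comparing the empirical mixing of such walks for fast versus slow environment updates gives a concrete handle on the questions described above.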
Two-species contact processes, Sam Moore
Recent work in the physics literature has explored the 'two-species contact process' as a model of staged infections. The model has a biological interpretation in terms of host-parasite invasions, for example when a growing colony of bacteria is under threat from a developing bacteriophage infection. Past studies have focused mainly on simulations on Z^2. Sam is interested in exploring the possibility of obtaining mathematically rigorous results for models of this type evolving on random graphs. He further aims to use existing branching methods as a novel approach to the problem.
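The single-species contact process underlying these models is simple to state: infected sites recover at rate 1 and infect each neighbour at rate lambda. An event-driven (Gillespie-style) sketch on an arbitrary graph, illustrative rather than the two-species model itself:

```python
import random

def contact_process(neighbours, infected, lam, t_max, seed=0):
    """Simulate the contact process on a graph given as an adjacency dict.

    Infected sites recover at rate 1 and infect each currently healthy
    neighbour at rate lam; events are drawn with an exact Gillespie
    scheme (exponential waiting times, rates as weights).
    """
    rng = random.Random(seed)
    infected = set(infected)
    t = 0.0
    while infected and t < t_max:
        rates, events = [], []
        for site in infected:
            rates.append(1.0)                # recovery of this site
            events.append(("recover", site))
            for nb in neighbours[site]:
                if nb not in infected:
                    rates.append(lam)        # infection attempt on neighbour
                    events.append(("infect", nb))
        total = sum(rates)
        t += rng.expovariate(total)
        kind, site = rng.choices(events, weights=rates)[0]
        if kind == "recover":
            infected.discard(site)
        else:
            infected.add(site)
    return infected

# Cycle graph on 20 sites; a deeply subcritical lam drives extinction
ring = {i: [(i - 1) % 20, (i + 1) % 20] for i in range(20)}
survivors = contact_process(ring, infected={0}, lam=0.1, t_max=1000.0)
```

Replacing the ring with a random graph and adding a second species with its own rates gives the setting of the project; the survival-versus-extinction dichotomy seen here is the basic phenomenon the rigorous analysis targets.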
Mathematical method for brain research, Shaerdan Shataer
Supervisor: Chris Budd
Brain M/EEG source reconstruction is a mathematical inverse problem, and improvements in the underlying mathematical methods are critical to applications in neuroscience and clinical research. A novel line of research on this topic is to combine fMRI with M/EEG to achieve higher spatial and temporal resolution. Shaerdan's focus is on improving existing regularisation methods, and in particular on the challenge of developing a new adapted regularisation best suited to fMRI-guided reconstruction.