
Student projects in the Centre for Doctoral Training in Statistical Applied Mathematics (SAMBa)

Find out about current and past projects from our PhD students.

In SAMBa, you'll undertake rigorous research in a world-class environment, becoming a leading problem solver able to work flexibly across disciplines.

Students starting in 2014

Faraday wave-droplet dynamics: a hydrodynamic quantum analogue, Matt Durey

Supervisor: Paul Milewski

It has been observed on a microscopic scale that when a small fluid droplet is dropped onto a vertically vibrating fluid surface, it will 'walk' across the surface of the bath. The behaviour of the droplet-Faraday pilot wave pair is reminiscent of quantum physics: there is a particle-wave duality in which the fluid droplet can undergo processes similar to those of a particle in the quantum world. On an unbounded domain, pairs of droplets can interact, deflecting or capturing each other depending on various parameters. The quantum single-particle double-slit experiment can be reproduced for fluid droplets, with the interactions between the wave field and the slits producing a diffraction-like probability distribution for droplet positions. This phenomenon is the basis for two lines of research being explored by Matt: (i) the fluid dynamics of droplet-Faraday pilot wave reflection at planar boundaries; (ii) the long-time stationary behaviour of models for droplet-Faraday pilot wave dynamics in confined domains.

SDEs for embedded successful genealogies, Dorka Fekete

Supervisor: Andreas Kyprianou

Dorka is using the mathematical medium of stochastic differential equations (SDEs) to describe the fitness of certain sub-populations in an asexual high-density stochastic population model known as a continuous-state branching process. In particular, she is looking at ways to describe genealogies that propagate prolific traits in surviving populations, where ‘survival’ can be interpreted in different ways. For example, it can mean survival beyond a certain time-horizon, but it can also mean survival according to some spatial criteria.

Interacting particle models and the geometry of their macroscopic description, Marcus Kaiser

Supervisors: Johannes Zimmer and Rob Jack

Marcus is studying the geometric properties of interacting particle systems and their hydrodynamic scaling limits described by non-linear partial differential equations, such as drift-diffusive systems. He is looking at processes that can serve as prototypes for non-equilibrium behaviour, having underlying descriptions as irreversible Markov chains. A better understanding of the geometric behaviour and the links between the microscopic and macroscopic models yields new insights, such as the way processes converge to equilibrium. See http://people.bath.ac.uk/mk806/ for more details.

Uncertainty Quantification for neutron transport problems, Matt Parkinson

Supervisors: Ivan Graham, Rob Scheichl and Paul Smith

Working in collaboration with Wood plc, Matt is developing methods to compute the uncertainty in the flux and the fundamental eigenvalue of a simplified 1D monoenergetic neutron transport problem, with cross sections modelled by lognormal random fields sampled via Karhunen-Loève (KL) expansions and the Monte Carlo (MC) method. The methods start with situations where the transport equation can be solved analytically and go on to consider numerical solutions by discrete ordinates and then by analogue MC simulation. He is analysing how the MC error and the KL truncation affect the results, carrying out the associated numerical experiments, and applying multilevel Monte Carlo (MLMC) methods to the problem, while assessing the possibility of applying multilevel techniques to the analogue MC solver for the simplified neutron transport problem.
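
As a rough illustration of this sampling pipeline (not Matt's actual code or model), the sketch below draws lognormal cross-section realisations from a truncated KL expansion with made-up eigenpairs and estimates a toy quantity of interest by plain Monte Carlo.

```python
import numpy as np

# Hypothetical sketch: sample a 1D lognormal cross-section via a truncated
# Karhunen-Loeve (KL) expansion and estimate a toy quantity of interest by
# plain Monte Carlo. Eigenpairs and the functional are stand-ins only.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 200)   # spatial grid on [0, 1]
dx = x[1] - x[0]
K = 20                           # KL truncation level

ks = np.arange(1, K + 1)
lam = 1.0 / ks**2                                      # assumed eigenvalue decay
phi = np.sqrt(2.0) * np.sin(np.pi * np.outer(ks, x))   # assumed eigenfunctions

def sample_cross_section():
    """One realisation sigma(x) = exp(Z(x)) of the lognormal field."""
    xi = rng.standard_normal(K)          # iid N(0,1) KL coefficients
    return np.exp((np.sqrt(lam) * xi) @ phi)

def quantity_of_interest(sigma):
    """Toy attenuation functional exp(-integral of sigma)."""
    return np.exp(-np.sum(sigma) * dx)

N = 5000
samples = [quantity_of_interest(sample_cross_section()) for _ in range(N)]
print(f"MC estimate: {np.mean(samples):.4f} +/- {np.std(samples)/np.sqrt(N):.4f}")
```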

Modelling air pollution using data assimilation, Matt Thomas

Supervisors: Gavin Shaddick and Melina Freitag

In order to assess the burden of disease which may be attributable to air pollution, accurate estimates of exposure are required globally. There is a need for comprehensive integration of information from remote sensing, atmospheric models and surface monitoring to facilitate estimation of concentrations in areas throughout the world. Data assimilation is a method of combining model forecast data with observational data in order to understand the state of a system more accurately. Methods vary greatly in complexity, and Matt is exploring different methods from both a statistical and a numerical analysis standpoint. Elements of a suitable method include flexibility, modularity, the ability to incorporate multiple levels of uncertainty, and techniques that allow relationships between surface monitoring, remote sensing and atmospheric models to vary spatially and allow information to be 'borrowed' where monitoring data may be sparse. Throughout the project, the efficacy of different methods in this setting is being examined by applying them to data from the Global Burden of Disease project. Of particular interest is their scalability with regard to high-dimensional data.

Students starting in 2015

High-order DG methods for atmospheric modelling, Jack Betteridge

Supervisors: Eike Müller and Ivan Graham

One technique for solving partial differential equations numerically is the Discontinuous Galerkin (DG) method. This method has high spatial locality, which improves parallel scalability and can take greater advantage of modern many-core high-performance computing architectures. A hybrid multigrid approach has already been used successfully for elliptic PDEs arising from subsurface flow. Similar methods can also be applied to atmospheric modelling problems, for instance solving the Navier-Stokes equations in a thin spherical shell. Over the course of the project, Jack is looking at the computational and algorithmic aspects of implementing a solver for these atmospheric models, and at various preconditioners to speed up the solution.

Modelling and optimised control of macro-parasitic diseases, Beth Boulton

Supervisor: Jane White

Macro-parasites cause a variety of diseases throughout the world, including many neglected tropical diseases. When considering mathematical models of macro-parasitic diseases, the SIS models so often used when modelling the spread of bacterial or viral diseases do not capture some of the crucial ways in which macro-parasitic diseases differ. By considering a combination of ODE, probabilistic and hybrid models, Beth will attempt to formulate mathematical models which capture the dynamics of host-parasite relationships and macro-parasitic infections, and then make use of these to research how best to optimise the treatment of macro-parasitic infections in both people and animals.
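
For context, here is a minimal sketch of the classical SIS ODE baseline that the project argues is inadequate for macro-parasites; the parameter values and initial condition are illustrative only.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Minimal sketch of the classical SIS compartmental model (S + I = 1).
# beta and gamma are assumed transmission and recovery rates.
beta, gamma = 0.3, 0.1

def sis(t, y):
    S, I = y
    return [-beta * S * I + gamma * I, beta * S * I - gamma * I]

sol = solve_ivp(sis, (0.0, 200.0), [0.99, 0.01])
print("endemic level I(200) ~", sol.y[1, -1])  # tends to 1 - gamma/beta
```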

Attribution of large scale drivers for environmental change, Aoibheann Brady

Supervisors: Ilaria Prosdocimi and Julian Faraway

Several large flood events have hit the UK in recent years, and there is growing concern among the public and policy makers about whether the current level of protection of cities and infrastructure is appropriate. In particular, there is a concern that climate change and its impacts might result in increased flood risk: climate change projections seem to indicate that flooding risk might increase, but this is not fully validated by the observed river flow data, for which there is no strong evidence of increasing trends. Further, due to the short period of river flow records, the testing methods routinely used to assess whether change can be detected in observed data are typically not very powerful (in a statistical sense) and cannot fully differentiate between possible confounders. Aoibheann is aiming to develop methods to detect and attribute changes in flooding and other environmental variables. This will result in methods for the detection of spatially coherent trends in environmental data. The project is also investigating methods to assess the main drivers of higher river flows and flooding at a regional or national scale.

Analysis of transition rates for the Dean-Kawasaki model, Federico Cornalba

Supervisors: Johannes Zimmer and Tony Shardlow

Nucleation is a physical process important in fields as diverse as physics, chemistry and biology. Broadly speaking, nucleation is the process by which a material undergoes the formation of new thermodynamic phases via self-assembly. The mathematical description of this process comprises several relevant features. In his PhD, Federico is focusing his research on aspects of the Dean-Kawasaki stochastic model, arising from fluctuating hydrodynamics theory. Federico is primarily investigating the model's underlying mathematical geometry and its transition rates in the context of metastability, and will seek a description of the nucleation pathways.

Accelerating Bayesian sampling, Gianluca Detommaso

Supervisor: Rob Scheichl

Gianluca's research aims to bring together techniques from statistics, numerical analysis and applied mathematics to accelerate Bayesian sampling. In particular, he deals with computationally expensive high-dimensional problems, trying to reduce the cost per iteration and to design algorithms that scale well in high dimensions. Gianluca is interested in developing interactions among different research fields, bringing together knowledge and experimenting with new ideas. He also tries out new potential sampling accelerations, or applies his machinery to other topics. His current research involves multilevel methods, MCMC algorithms, transport maps and Bayesian inverse problems.

Mixing times and general behaviour of random walks on changing environments, Andrea Lelli

Supervisor: Alexandre Stauffer

Random walks in random environments have become a classical model for random motion in random media, and this model has been the source of many mathematical investigations over the years. More recently, attention has turned to random walks in an environment which changes while the particle is moving. It is believed that when the environment is ‘well behaved’ (e.g. uniformly elliptic) and changes quickly enough, the random walk will behave in a way that is similar to a random walk on the underlying (non-changing) graph. This has been quantified, especially in the case of the d-dimensional infinite lattice, by the derivation of a law of large numbers and central limit theorems under some conditions related to the mixing time of the environment. Andrea is interested in understanding the effect of a slowly changing environment on the behaviour of simple random walks, e.g. the impact of the environment on the recurrence/transience property of the random walk, and the mixing time of the random walk inside a finite, but changing, graph.

Two-species contact processes, Sam Moore

Supervisors: Tim Rogers and Peter Mörters

Recent work in the physics literature has explored the ‘two-species contact process’ as a model of staged infections. The work has a biological interpretation in terms of host-parasite invasions, for example when a growing colony of bacteria is under threat from a developing bacteriophage infection. Past studies have focused mainly on simulations on Z^2. Sam is interested in exploring the possibility of obtaining mathematically rigorous results for models of this type evolving on random graphs. He also aims to make use of existing branching methods as a novel approach to the problem.

Seamless and overarching approaches for optimising over the phases of drug development, Robbie Peck

Supervisors: Chris Jennison and Alun Bedding

This project, in collaboration with Roche, concerns the optimisation of the drug development process at a programme level. This involves considering multiple phases of treatment refinement and dose selection together. While individual phases of drug development have been studied in depth, there has been relatively little work that looks at two or more phases jointly. Robbie's project uses numerical computation and simulation to model different designs, which may pose computational challenges. These include trial designs that use a form of gain function, or “net present value”, to optimise decision making across phases; seamless Phase II/III designs that may use data from Phase II in the final analysis, possibly through a combination test; and the realistic incorporation of beliefs about drug safety and tolerability into programme-level decision making.

Numerics and analysis of waves in random media, Owen Pembery

Supervisors: Euan Spence and Ivan Graham

Wave propagation problems arise in applications such as seismic imaging, radar and ultrasound scanning. The Helmholtz equation is the simplest model of acoustic wave propagation - solutions of the Helmholtz equation correspond to acoustic waves with a single frequency. Researchers have been studying the Helmholtz equation, and developing numerical methods to solve it, for many years. However, most of the research effort until now has been concerned with sound waves propagating through a homogeneous medium where the speed of sound is constant. Owen is studying the Helmholtz equation where the medium is heterogeneous or random. He is developing numerical methods for uncertainty quantification for it and proving rigorous mathematical results about solutions. These results will allow him to study the convergence behaviour of these numerical methods, and may suggest new numerical methods as well.
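
For reference, the single-frequency reduction mentioned above takes a standard form: seeking time-harmonic solutions of the wave equation, with sound speed c(x) and angular frequency ω, yields the heterogeneous Helmholtz equation.

```latex
% Seeking time-harmonic solutions u(x,t) = \mathrm{Re}\big( \hat{u}(x)\, e^{-i\omega t} \big)
% of the wave equation u_{tt} = c(x)^2 \Delta u yields the (heterogeneous)
% Helmholtz equation with wavenumber k(x) = \omega / c(x):
\Delta \hat{u}(x) + \frac{\omega^2}{c(x)^2}\, \hat{u}(x) = 0 .
```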

Modelling the surge phenomenon within turbomachinery, Kate Powers

Supervisors: Paul Milewski, Chris Brace, Colin Copeland and Chris Budd

Turbochargers are used in internal combustion engines to obtain a better power output from smaller engines and better fuel efficiency. Turbochargers work by compressing air. To get the most out of a turbocharger, the compressor needs to deliver a high pressure ratio at a relatively low mass flow. If the mass flow is too low, the air flow can reverse direction and cause surge. Surge is a difficult phenomenon to model because it exhibits chaotic behaviour. Kate is working jointly with the Mechanical Engineering department with the aim of finding a model that can (i) give a better prediction of the onset of surge and (ii) describe what happens to the air flow during surge. This will involve analysis of experimental data as well as a combination of theory from compressible fluid dynamics, rotating flows, dynamical systems and bifurcations.

Automatic diagnosis of psoriasis arthritis (xAPAD), Adwaye Rambojun

Supervisors: Neill Campbell, Tony Shardlow, Gavin Shaddick and Will Tillett

Patients with Psoriasis Arthritis are graded according to the extent of damage by scoring X-rays. Currently, this is a painstaking and time-consuming process that has to be performed manually. In collaboration with the Royal National Hospital for Rheumatic Diseases in Bath, Adwaye is working on automating this scoring process by exploring machine learning techniques from the computer vision community. He is working towards building a statistical model of a healthy hand that can be compared to a diseased hand, enabling the scoring process to be automated. This would enable scoring to be performed on a large scale, ultimately increasing the understanding of how the disease progresses within patients.

Topics in optimal stopping and optimal transport, Ben Robinson

Supervisor: Alex Cox

Ben is studying various stochastic optimisation problems and the connections between them. Recent work on optimal stopping problems has investigated imposing a constraint on the expected value of the stopping time in these problems to obtain so-called constrained optimal stopping problems. Ben plans to build on this work, making use of a connection to stochastic optimal control problems. This approach requires developing an understanding of the modern theory of stochastic optimal control, including the theory of weak solutions to partial differential equations in the viscosity sense. Certain problems of this type can be represented in terms of Monge-Ampère equations, a highly non-linear class of PDEs, which arise in the classical Monge-Kantorovich optimal transport problem. Ben is interested in this problem, as well as the recent variation, martingale optimal transport, in which additional constraints are imposed. Methods of martingale optimal transport have also been used in the Skorokhod embedding problem, a classical problem in probability theory. Each of these classes of problems has a financial motivation. Ben is particularly interested in how these problems are related.
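
For orientation, the Monge-Ampère connection mentioned above takes the following standard form: in the Monge-Kantorovich problem with quadratic cost, transporting a density f to a density g via the gradient of a convex potential u leads to the equation below.

```latex
% Monge-Ampere equation of optimal transport with quadratic cost:
% the optimal map is T = \nabla u with u convex, and mass balance gives
\det\!\big( D^2 u(x) \big) = \frac{f(x)}{g(\nabla u(x))} .
```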

Condensation in reinforced branching processes with fitness, Anna Senkevich

Supervisors: Peter Mörters and Cécile Mailler

Anna is studying a stochastic model for the evolution of a structured population of particles equipped with fitness values. Each particle reproduces independently, at a rate given by its fitness, and its offspring either inherits the parent's fitness with some probability, or receives a new fitness value drawn from some probability distribution, independently of everything else. Particles of the same fitness are referred to as families. This is a stochastic version of Kingman’s model for a population undergoing selection and mutation. However, this framework also covers a dynamic random graph model, the preferential attachment tree with fitness of Bianconi and Barabási, which is suitable for describing growth characteristics of real-life networks such as social networks. There are two growth scenarios for the system: growth driven by bulk behaviour, and growth driven by extremal behaviour (the condensation case). Furthermore, there are two types of condensation: non-extensive, when no individual family makes an asymptotically positive contribution to the population, and macroscopic, when the proportion of individuals in the largest family is asymptotically positive. The behaviour of the system is largely determined by properties of the chosen probability distribution. So far, a broad class of bounded fitness distributions with polynomial behaviour at the tail has been analysed. In this project, Anna is focusing on the asymptotic behaviour of maximal families for bounded fitness distributions with a faster decay at the maximal fitness value. She will establish which of the above scenarios prevails by drawing links with extreme value theory.
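
As a toy illustration of the graph version of this model (not Anna's analysis), the sketch below grows a Bianconi-Barabási preferential attachment tree in which each new vertex attaches to an existing vertex with probability proportional to degree times fitness; the uniform fitness distribution is an illustrative choice only.

```python
import random

# Hypothetical sketch of a Bianconi-Barabasi preferential attachment tree
# with fitness. The root's degree is initialised to 1 so it has nonzero
# attachment weight; fitness values are iid Uniform(0,1) for illustration.
def bb_tree(n, fitness=random.random):
    fit = [fitness()]          # fitness of vertex 0
    deg = [1]                  # degree bookkeeping (root treated as degree 1)
    edges = []
    for v in range(1, n):
        weights = [d * f for d, f in zip(deg, fit)]
        target = random.choices(range(v), weights=weights)[0]
        edges.append((v, target))
        deg[target] += 1
        deg.append(1)
        fit.append(fitness())
    return edges, fit, deg

edges, fit, deg = bb_tree(10_000)
hub = deg.index(max(deg))
print("max degree:", max(deg), "fitness of hub:", round(fit[hub], 3))
```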

Distributed optimisation of LTE systems, Amy Middleton

Supervisors: Antal Járai, Jon Dawes and Keith Briggs

The aim of Amy's project is to look more fundamentally at the mathematics of self-optimising networks; in particular to set up and analyse precise dynamical models in order to gain information about fundamental limits of what can be achieved when system optimisation has to be performed with incomplete information. Working in collaboration with BT, Amy's project will develop ways in which existing theory in diverse fields such as information theory, discrete-time dynamical systems, stochastic processes, optimisation, and others can be brought together to solve complex mathematical problems.

Inverse problems for brain imaging, Shaerdan Shataer

Supervisor: Chris Budd

Imaging is a fast-growing area, driven by its importance in real-life applications as well as its mathematical challenges. In the field of brain research, imaging brain activity serves as part of the ambition to understand some fundamental questions about cognition and perception. Mathematically, the problem can be viewed as a two-level inverse problem: first, recovering the source intensity image from scalp measurements; second, inferring the cause of the source activity from the image recovered in the first part. Shaerdan is aiming to locate the active sources of brainwaves, given EEG measurements on the surface of the scalp.

Bayesian statistical modelling for quantitative risk analysis, Sebastian Stolze

Supervisors: Finn Lindgren, Evangelos Evangelou and David Worthington

Sebastian is studying extensions and innovative uses of Bayesian Networks (BNs) as a tool for Quantitative Risk Analysis (QRA). QRA is especially relevant for the oil and gas industry, where analysis is usually carried out probabilistically in order to assess the likelihood and impact of safety issues. A common framework for representing results from such analyses is Event Trees (ETs), which lack many of the properties needed for dynamic risk assessment. Working in collaboration with DNV GL, the focus of this project is to study how ETs can be cast into BNs in a practical way, using information measures that allow for simplifications of BNs. Furthermore, particular time-continuous extensions of BN modelling are considered that allow time-to-event variables to be examined in more detail.

Students starting in 2016

Spatial confounding, Emiko Dupont

Supervisor: Nicole Augustin

Spatial confounding is a problem that often occurs in environmental, ecological and epidemiological applications of spatial statistics. Models for spatial data usually include a fixed effect for the explanatory variable of interest as well as a random effect capturing spatial correlation in the data. Although the inclusion of a spatial random effect generally improves the goodness of fit of the model, it can also introduce bias in the estimated fixed effect due to collinearity of the fixed and random effects, which could lead to incorrect statistical inference. This is called spatial confounding and is a general problem that is not restricted to any specific type of statistical model. Emiko’s project is about gaining a better understanding of spatial confounding, using both real and simulated data to investigate when the problem occurs and what can be done to avoid it. She is considering both parametric and non-parametric spatial models.

Discordant voting on evolving scale-free networks, John Fernley

Supervisors: Marcel Ortgiese and Peter Mörters

Similarly to the contact process, voting models describe the competing spread of two ‘opinions’ on a graph of interacting ‘voters’. Cooper et al., in their 2016 paper “Discordant voting processes on finite graphs”, explored the expected consensus time for a variety of voting models on extremal graphs. These discordant voting models can be seen as a bridge between the classical voter model and the graph fission evolving voter model of Durrett. John is interested in finding a universal description of the model's lifetime on scale-free heterogeneous networks, in particular with Chung-Lu type edge models. These models can then be made to evolve in time by vertex updating, and his next objective is to show that this speeds up consensus.

Methods for preferentially sampled spatial data, Elizabeth Gray

Supervisor: Evangelos Evangelou

In general, geostatistical methods deal with data under the assumption that the quantity being measured is independent of the locations at which measurements are being taken. However, this is often not the case. Preferential sampling refers to the situation in which there is some stochastic dependence between the quantity being measured and the process used to select the sampling locations, involving an investigator’s ‘design utility’. Ignoring such a dependence can lead to biased and inaccurate estimates. Elizabeth’s PhD involves investigating and developing methods for modelling such data.

Bayesian inference for point processes, Nadeen Khaleel

Supervisor: Theresa Smith

Point patterns, specifically spatial and spatio-temporal point patterns, occur frequently in the environmental sciences and epidemiology. These phenomena can be modelled using point processes, from which it is possible to learn about the spatial relationships that give rise to the observed pattern, as well as the stochastic dependence between points in the pattern. In particular, Cox processes (or “doubly stochastic” processes) are practical models when the point pattern exhibits clustering due to stochastic environmental heterogeneity. Nadeen is working on computational methods for a particular type of Cox process, the log-Gaussian Cox process, where she is exploring the development of efficient MCMC techniques for fitting large-scale spatio-temporal point patterns and comparing the effects of predictors in different regions.
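
To make the "doubly stochastic" construction concrete, the sketch below simulates a log-Gaussian Cox process on a 1D grid: the log-intensity is a Gaussian field, and counts per cell are conditionally Poisson. The covariance, grid and parameters are stand-ins, not those used in the project.

```python
import numpy as np

# Illustrative simulation of a log-Gaussian Cox process on a 1D grid.
rng = np.random.default_rng(1)
n = 200
h = 1.0 / n                                # cell width on [0, 1]
x = (np.arange(n) + 0.5) * h               # cell centres

# Gaussian field with an assumed squared-exponential covariance;
# a small nugget keeps the Cholesky factorisation numerically stable.
C = np.exp(-((x[:, None] - x[None, :]) ** 2) / (2 * 0.1**2))
L = np.linalg.cholesky(C + 1e-8 * np.eye(n))
Z = L @ rng.standard_normal(n)

intensity = np.exp(1.0 + Z)                # log-intensity = mean + field
counts = rng.poisson(intensity * h)        # conditionally Poisson counts
print("total points:", counts.sum())
```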

Averaging for fast-slow systems, Matthias Klar

Supervisors: Johannes Zimmer and Karsten Matthies

Matthias is studying systems with multiple time scales, so-called fast-slow systems. One aim is to derive effective large-scale descriptions of such systems by 'averaging out' the fast scale. Thermodynamic systems are prototypical examples of systems with such a separation of time scales, and the aim of this project is to advance averaging methods for thermodynamic models.

Detection of underwater acoustic events in a large dataset with machine learning, Amélie Klein

Supervisors: Philippe Blondel and Kari Heine

Acoustic remote sensing listens to ambient noise underwater and uses it to recognise the sources of the sounds (e.g. marine life, human activities, weather). Passive sensors acquire data at very high rates (up to a million samples per second) for long periods (up to several years). In this project, Amélie is working on automating the processing and exploration of these large datasets using machine learning techniques and high-performance computing systems. The project aims to detect long-term trends, such as increases in shipping or seasonal variations in marine life, and transient events, such as loud sounds associated with seismic prospecting, vocalisations by animals (e.g. whales or dolphins), or small-scale weather events. The key research questions lie in processing and analysing the vast amounts of continuous data, and in deciding the best time scales at which to examine specific processes.

Measure-valued martingales and applications, Dan Ng

Supervisors: Alex Cox and Johannes Zimmer

Measure-valued martingales are stochastic processes in the space of probability measures which have certain nice martingale properties. They have applications in mathematical finance such as the model-independent pricing and hedging of options. There are natural links to optimal transport and construction of gradient flows for measure-valued processes. They also set up a framework to interpret classical inequalities such as the Log Sobolev Inequality. The aim of Daniel's PhD project is to establish some basic properties of such processes, and to consider variational methods for their construction.

Large scale differential geometric MCMC, Tom Pennington

Supervisors: Karim Anaya-Izquierdo and Rob Scheichl

Uncertainty Quantification (UQ) concerns both the propagation of uncertainty through a physical model, known as the forward problem, and the inverse problem of inferring uncertain model parameters from noisy measurements. Markov Chain Monte Carlo (MCMC) methods are the most widely used tools for computing expectations in UQ and in large statistical models in general. Conventional approaches to MCMC are often inefficient and must compute many samples to achieve high accuracy. Geometric ideas can be used to improve the methods' statistical performance; two prominent algorithms in this line of thinking are Riemann Manifold Hamiltonian Monte Carlo (RMHMC) and the Riemann Manifold Metropolis-Adjusted Langevin Algorithm (RMMALA). Tom is interested in extending these ideas to exploit more general concepts from differential geometry, with a focus on developing methods suited to problems from UQ.
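
As a concrete baseline, the sketch below implements plain (Euclidean) MALA, the special case that the manifold variants above generalise by replacing the identity metric with a position-dependent one. The Gaussian target and step size are illustrative.

```python
import numpy as np

# Minimal sketch of the Metropolis-adjusted Langevin algorithm (MALA).
rng = np.random.default_rng(2)

def log_pi(x):                 # standard Gaussian target (illustrative)
    return -0.5 * np.dot(x, x)

def grad_log_pi(x):
    return -x

def mala(x0, eps=0.5, n_steps=10_000):
    x, chain = np.asarray(x0, float), []
    for _ in range(n_steps):
        # Langevin proposal: drift along the gradient, plus Gaussian noise
        mean_fwd = x + 0.5 * eps**2 * grad_log_pi(x)
        y = mean_fwd + eps * rng.standard_normal(x.shape)
        mean_bwd = y + 0.5 * eps**2 * grad_log_pi(y)
        # Metropolis-Hastings correction (log proposal densities, constants cancel)
        log_q_fwd = -np.sum((y - mean_fwd) ** 2) / (2 * eps**2)
        log_q_bwd = -np.sum((x - mean_bwd) ** 2) / (2 * eps**2)
        if np.log(rng.uniform()) < log_pi(y) - log_pi(x) + log_q_bwd - log_q_fwd:
            x = y
        chain.append(x.copy())
    return np.array(chain)

chain = mala(np.zeros(2))
print("posterior mean estimate:", chain.mean(axis=0))
```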

Optimising First in Human trials, Lizzi Pitt

Supervisors: Chris Jennison and Chris Harbron

Lizzi's project, in collaboration with Roche, involves developing the statistical methodology used to design and make decisions in Phase I/First in Human clinical trials. This is the first stage of testing a potential new treatment in humans, after extensive laboratory testing. The primary aim is to establish safety and tolerability in order to define the range of doses to be tested in Phase II. Clinical trials are expensive and time-consuming, so research into optimising this process aims to reduce the number of people required, the duration and the cost. Lizzi is looking to develop existing model-based Bayesian dose-finding methodology, such as the Continual Reassessment Method, with this in mind. She is investigating properties of trial designs through simulation to ensure a design is both statistically robust and fit for practical use, and thus appealing to clinicians. Traditionally, at this stage there is no evaluation of whether or not the treatment works; Lizzi's research is therefore incorporating the analysis of an early signal of efficacy into the trial design. Furthermore, the majority of existing research in this area focuses on oncology, so Lizzi is centring her work on a different therapeutic area.

Fast iterative regularisation methods, Malena Sabate Landman

Supervisor: Silvia Gazzola

Malena’s project is based on the study of fast iterative regularisation methods, with a particular focus on Krylov subspace methods and novel ways for determining regularisation operators and regularisation parameters. These tools are widely used in solving inverse problems, which are challenging as they can be large scale and severely ill-posed. As an example, Malena is exploring different imaging applications, such as tomography or deblurring and denoising of images.

Spatial branching processes, Tsogzolmaa Saizmaa

Supervisor: Andreas Kyprianou

Tsoogii’s project belongs to the field of spatial branching processes, focusing on the exit measure induced by the limit of branching mechanisms of isotropic stable Lévy processes. Specifically, the spatial arrangement of mass of a d-dimensional isotropic super-stable process as it first exits an increasing sequence of balls is being studied. The location of mass in the exit measure is being explored via the overshoot of an embedded isotropic stable branching process, and its radii-dependent branching mechanism will be characterised. Convergence of this space-time stochastic process is explored as time goes to infinity.

Hybrid models in biology, Cameron Smith

Supervisor: Kit Yates

Spatial hybrid models are emerging methods used to simulate biological, chemical and physical phenomena on multiple scales. These methods take different models of the same system, at varying spatial resolutions, and employ them concurrently in different regions of the spatial domain. The main purpose of such hybrid models is to exploit the efficiency of coarser methods whilst maintaining accuracy by using finer methods where necessary. Cameron is developing various spatial hybrid models for biological processes in order to gain insight into how the underlying systems behave. Focusing initially on reaction-diffusion systems, which can be used to model many biological processes, from cell migration to intracellular calcium dynamics, he is incorporating biological realism into such methods.

Raising the roof: extension of the Met Office's Unified Model into the mesosphere and lower thermosphere, Matthew Griffith

Supervisors: Chris Budd, Nick Mitchell, David Jackson and John Thuburn

Forecasting weather in the lower thermosphere (85–120 km) is of particular interest due to its impact on spacecraft re-entry and radio communications. To this end, Matthew is extending the current 85 km upper boundary of the Met Office's Unified Model (UM) to a height of around 120 km. He is thus raising the roof on current numerical weather prediction and paving the way for the development of a coupled whole-atmosphere model. In particular, the work focuses on including the correct physical processes in the high atmosphere. This includes accurately depicting the reversal of the mesospheric zonal jets, forced by gravity waves (GWs). This requires tuning of the GW forcing schemes, which is performed by comparison with radar and satellite data collected by the Department of Electronic & Electrical Engineering.

Monte Carlo methods for the neutron transport equation via branching processes, Emma Horton

Supervisors: Andreas Kyprianou and Paul Smith

The neutron transport equation (NTE) is a balance equation that describes the flux of neutrons in inhomogeneous fissile media such as nuclear reactors. Working in collaboration with Wood plc, Emma is modelling nuclear fission reactions via the probabilistic theory of Markov branching processes in order to both unify existing theory and develop new theoretical and numerical techniques that allow her to study these processes in full generality. In particular, Emma aims to prove the existence of the leading eigenvalue and its corresponding eigenfunction, allowing her to study the limiting behaviour of the system of particles in different regimes. The methods developed in this project will also allow for more efficient simulations of these processes, which will provide a greater depth of understanding of such systems for the purpose of safety and optimal reactor design.

Optimisation of wireless router location, Hayley Wragg

Supervisors: Chris Budd, Robert Watson and Keith Briggs

Recent developments in high-frequency antennas for wireless communication could enable users to have stronger connections. However, the high frequencies used by these new technologies do not travel through objects as well as lower frequencies do. In the past, propagation models for indoor wireless communications have not been needed, and those in use often rely on measurements that are specific to one environment. Hayley is developing a mathematical model, based on Maxwell's equations, to predict signal propagation strength, which can then be used to optimise the source location. This model will account for variation in the environment and will therefore be relevant beyond one specific location, unlike most current models.

Students starting in 2017

Systemic sclerosis: including prevalent and incident exposures in order to evaluate effects on cancer risk, Eleanor Barry

Supervisors: Anita McGrogan and Jonathan Bartlett

Systemic sclerosis (SSc), or scleroderma, is a long-term condition that causes thickening and hardening of the skin due to a build-up of collagen. SSc can also affect internal organs such as the kidneys, heart, lungs and gastrointestinal tract. It is believed that there is a possible link between SSc and other serious health conditions, and Eleanor's PhD explores the association between SSc and the occurrence of serious outcomes compared to people who do not have SSc. Working with the Department of Pharmacy and Pharmacology, she is focusing on statistical techniques used to minimise errors when estimating the effects of SSc on the occurrence of cancer.

Stochastic analysis, rough paths, and conservation laws, Stefano Bruno

Supervisors: Hendrik Weber and Tony Shardlow

Stefano's project aims to further the stochastic analysis of the stochastic PDE known as Dean's equation. This is an example of a stochastic conservation law, and it is particularly challenging because of a square-root term in the noise coefficient, which is non-Lipschitz and requires non-negative arguments. The divergence operator is also applied to the noise, leading to poor regularity and making classical solution methods difficult to apply. Stefano will look at recent developments in the theory of stochastic conservation laws, using the kinetic formulation and ideas from rough-path theory, with a view to applying these ideas to Dean's equation.

Stochastic differential equations and machine learning, Teo Deveney

Supervisors: Tony Shardlow and Eike Müller

Statistical machine learning and neural network methodologies have seen significant development in recent years with the advent of faster computation and the discovery of efficient optimisation algorithms. Methods based on such techniques have provided state-of-the-art results in many high-dimensional data tasks, such as image and speech recognition, artificial intelligence and, more recently, applied mathematics problems. This project is leveraging developments in machine learning to improve methodologies for stochastic differential equations, with particular attention paid to applications in contaminant dispersal. Teo is investigating how deep learning and Bayesian methods can be used to solve a range of problems in this area, such as inferring appropriate PDE and SDE models from contaminant dispersal data, and efficiently approximating solutions to the high-dimensional Fokker-Planck equations associated with current models.

Some mathematical and numerical problems in seismic imaging, Shaunagh Downing

Supervisors: Ivan Graham, Euan Spence and Evren Yarman

Shaunagh's project, in collaboration with industrial partner Schlumberger, concerns the numerical analysis of wave propagation problems and applications to marine seismic exploration. As part of the seismic exploration process, acoustic waves are emitted from a source into the earth. These waves are then reflected from the subsurface and measured by sensors. The relationship between the earth's subsurface and the measurements is mathematically modelled by partial differential equations (PDEs). Given the measurements, the properties of the subsurface can be inferred from the numerical solution of these PDEs to obtain a detailed image of the subsurface. This is then used to select and drill exploration and production wells. In seismic exploration, a problem of great practical interest is that of optimal sensor placement, and this project explores how, given prior information about the likely make-up of the subsurface (in the form of a class of generic models), the location of the sensors can be optimised to retrieve sufficient information about the subsurface.

Multi-particle diffusion limited aggregation, Tom Finn

Supervisor: Alexandre Stauffer

Multi-particle diffusion limited aggregation (MDLA) was formulated as a tractable model for dendritic growth. Unfortunately, its geometric and dynamic properties have evaded a rigorous mathematical treatment for decades, and understanding the behaviour of MDLA remains an open challenge. For example, under certain parameters MDLA may exhibit a limiting shape at macroscopic scales, while at mesoscopic and microscopic scales it has a complex, fractal-like structure. A competition model called 'first passage percolation in a hostile environment' (FPPHE) has been successfully coupled with MDLA to show that a phase of linear growth exists. Tom's project investigates these links further and attempts to prove stronger results for FPPHE, such as the existence of a 'coexistence' phase between the competing growth processes. The project also aims to better understand variants of MDLA, such as a Poissonized version in which there is initially a Poisson cloud of particles and each particle performs a random walk until aggregated. In one dimension the critical value of the initial density for linear growth is 1, but in higher dimensions it is conjectured to be 0, and this project aims to prove this and related results.
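
The sketch below is a heavily simplified, discrete-time caricature of the one-dimensional Poissonized model (not the continuous-time process studied in the project): the aggregate is a half-line tracked by its right edge, free particles start as a Poisson cloud and take simple random walk steps, and any particle caught at or behind the front sticks, advancing the front by one site per absorbed particle. All dynamics choices here are simplifying assumptions.

```python
import numpy as np

# Toy discrete-time 1D Poissonized MDLA with supercritical density mu > 1.
rng = np.random.default_rng(3)
mu, width, T = 1.5, 5000, 2000

# Poisson(mu) particles per site on 1..width, stored as positions.
pos = np.repeat(np.arange(1, width + 1), rng.poisson(mu, size=width))
front = 0                                      # right edge of the aggregate

for _ in range(T):
    pos = pos + rng.choice([-1, 1], size=pos.size)   # one SRW step each
    while True:                                       # absorb, then re-check:
        hit = pos <= front                            # growth can swallow
        if not hit.any():                             # further particles
            break
        front += int(hit.sum())
        pos = pos[~hit]

print("front position:", front, "(roughly linear in T for mu > 1)")
```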

Asymptotic and numerical analysis of wave propagation in photonic fibres with a thin-structure cladding, Will Graham

Supervisors: Kirill Cherednichenko and David Bird

Optical fibres are widely used in telecommunications systems across the world. Photonic crystal fibres are a relatively new development, with the potential to outperform conventional optical fibres and to offer new functionality. This is because photonic crystal fibres have microstructure on the same length-scale as the wavelength of the light passing through them, which allows the light to be controlled in more ways. Will is analysing periodic “thin-structure” problems that describe the propagation of light through photonic crystal fibres: understanding the spectra of these problems and their effective “limit problems” can better inform the design and use of such fibres.

Convergence of the three-dimensional Ising-Kac model to Φ^4_3, Paolo Grazieschi

Supervisor: Hendrik Weber

The Ising model is a classical particle system in statistical physics, where interaction among particles happens at the nearest-neighbour level. If this interaction becomes “mesoscopic”, for example by introducing a radius of interaction which is longer than the microscopic scale and smaller than the macroscopic one, it is possible to prove convergence of the rescaled system to the Φ^4 stochastic partial differential equation on the two-dimensional torus. The three-dimensional problem poses new challenges, due to the higher irregularity of the noise and the resulting difficulty in defining the limit equation itself. As such, this problem requires the use of powerful recent techniques such as the theory of Regularity Structures. In his PhD, Paolo is focusing on building a framework which makes it possible to treat the discrete particle system on the three-dimensional torus and to prove its convergence to the Φ^4_3 stochastic partial differential equation.

Singular stochastic partial differential equations, Trishen Gunaratnam

Supervisor: Hendrik Weber

Trishen’s research is in the field of singular stochastic partial differential equations. The equations that he is interested in have connections with Euclidean quantum field theory and statistical physics. There has been substantial progress and exciting activity in this field in recent years.

Echo State Networks and their application to dynamical systems, Allen Hart

Supervisors: James Hook and Jonathan Dawes

Allen is studying how well a particular recurrent neural network architecture called the Echo State Network (ESN) can approximate dynamical systems, predicting their future behaviour as well as inferring their topological features. Allen hopes to use ideas from Takens' Embedding Theorem to prove that an ESN trained on a time series of low-dimensional observations of a high-dimensional dynamical system can learn the topology of the high-dimensional system. Having learned the topology to some level of precision, ideas from the Universal Approximation Theorem could be deployed to prove that a sufficiently large ESN trained on sufficient data can predict the future dynamics of a system arbitrarily well. Numerical experiments will also provide intuition about how well practical ESNs perform on example dynamical systems such as the Lorenz or Mackey-Glass systems.
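
To make the architecture concrete, here is a minimal ESN sketch: a fixed random reservoir driven by the input, with only the linear readout trained by ridge regression. The reservoir size, scalings, ridge parameter and the one-step-ahead sine prediction task are all illustrative choices, not those used in Allen's work.

```python
import numpy as np

# Minimal echo state network: fixed random recurrent weights, trained readout.
rng = np.random.default_rng(4)
n_res, n_in = 300, 1

# Fixed random weights; spectral radius scaled below 1 (echo state property).
W = rng.standard_normal((n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))

def run_reservoir(u):
    """Collect reservoir states for an input sequence u of shape [T, n_in]."""
    x = np.zeros(n_res)
    states = []
    for ut in u:
        x = np.tanh(W @ x + W_in @ ut)
        states.append(x.copy())
    return np.array(states)

# Toy task: one-step-ahead prediction of a noisy sine wave.
t = np.linspace(0, 60, 3000)
u = np.sin(t)[:, None] + 0.01 * rng.standard_normal((3000, 1))
X, y = run_reservoir(u)[:-1], u[1:, 0]

washout, lam = 100, 1e-6                # discard transient; ridge parameter
A, b = X[washout:], y[washout:]
w_out = np.linalg.solve(A.T @ A + lam * np.eye(n_res), A.T @ b)
print("training RMSE:", np.sqrt(np.mean((A @ w_out - b) ** 2)))
```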

Numerical and analytical approaches using complex ray theory and exponential asymptotics in 3D wave-structure interactions, Yyanis Johnson-Llambias

Supervisor: Philippe Trinh

Despite significant advances in computational hardware and numerical algorithms, the simulation of fully nonlinear three-dimensional free-surface flows around blunt-bodied objects remains particularly limited. On account of the processing power required, most modern desktop (and in some cases high-performance) computations still require the use of simplifying geometrical assumptions and coarse meshes of the order of a hundred points per spatial dimension. In contrast, numerical simulations of comparable two-dimensional flows can routinely be done with O(1000) grid points in the spatial dimension. There continues to be a need for analytical theories that can provide explicit asymptotic descriptions of the flow properties, particularly for use in efficient hybrid numerical-analytical approaches. Recently, there has been success in developing new asymptotic techniques for studying linear wave-structure flows in three dimensions. These techniques are based on exponential asymptotics applied to low-speed hydrodynamic flows. Yyanis's research develops new analytical and numerical techniques in the area of complex ray theory and asymptotic analysis, to extend these ideas to nonlinear problems.

Market microstructure, flash crashes and market manipulation, Kevin Olding

Supervisor: Alex Cox

The aim of market microstructure modelling is to construct models which capture the ecosystem of participants in financial markets involved in high-frequency trading, such as informed investors, market makers and uninformed or ‘noise’ traders. Such models should be internally consistent, in that all market participants act optimally to solve stochastic optimisation problems, but may also contain features which provide opportunities for a single large trader to manipulate the market. Automatic or algorithmic trades may also inadvertently converge on strategies which have a similar impact. Whilst generating short-term profits, such a trader or algorithm could cause instability in the market, leading to a loss of liquidity or a ‘mini flash crash’. Kevin is looking to construct simple models which accurately reflect the ways in which liquidity is provided to, and prices are set in, financial markets, and to understand the circumstances that might lead short-term trading algorithms to disrupt ordinary market conditions.

Complexity-based selection of large-scale network models, Lizhi Zhang

Supervisor: Tiago Peixoto

The large-scale structure of real-world network systems cannot be directly obtained by inspection, and instead requires robust methods of description and extraction. One common approach is to identify modules or "communities" via the statistical inference of generative models. Despite significant recent work in this direction, most existing methods rely on simplistic assumptions that disregard dynamical aspects of network generation and do not contain domain-specific information about the most likely mixing patterns. Lizhi is developing general tools applicable when the network grows over time (e.g. a citation network, or the world-wide web), or when it contains heterogeneous assortative/disassortative mixing patterns (e.g. social networks).

Spatial fragmentations, Alice Callegaro

Supervisors: Matt Roberts and Marcel Ortgiese

Fragmentation, the breaking up of large structures into smaller pieces, occurs naturally in many situations, from earthquakes to hard drives. The mathematical definition of a fragmentation process involves an object that breaks up at random into smaller pieces, which then break up themselves, and so on; but with the rule that the way in which a piece breaks up must depend only on its size. This condition is a huge simplification which allows rigorous study, but prevents traditional mathematical models from accurately representing the vast array of real-life possibilities. In her PhD, Alice is focusing on spatial fragmentations, in which the speed at which pieces fragment depends on their shape in a non-trivial way.

Modern Statistical techniques for assessing and predicting herbicide performance, Arron Gosnell

Supervisor: Evangelos Evangelou and Kostas Papachristos

Typically, thousands of potential herbicides undergo a sequence of screening tests (assay tests) in the lab. At each stage, ineffective compounds are discarded and those remaining are assessed against a more complex set of criteria, with the final few undergoing rigorous field trials. Evidently, the data from the early trials exhibit high uncertainty and subjectivity. In most applications, a herbicide is assessed against a range of criteria, so a method is required to combine multiple criteria, weighted by their significance, into a score for each herbicide. Arron's research involves creating a model to predict a herbicide's performance on each test using information such as dosage, plant species, and the chemical's structure, which can be represented as a graph. Modern regression methods such as support vector regression, neural networks, and Gaussian process regression are employed to exploit the relationships between plant species and families of chemicals in order to improve predictive performance.
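
As a small illustration of one of the methods named above, the sketch below performs Gaussian process regression with a squared-exponential kernel on made-up dose-response data; the kernel hyperparameters, noise level and data are hypothetical.

```python
import numpy as np

# Gaussian process regression sketch: posterior mean and variance at test
# inputs under a squared-exponential kernel with iid observation noise.
rng = np.random.default_rng(5)

def k(a, b, length=0.5, var=1.0):
    """Squared-exponential kernel matrix between 1D input vectors a and b."""
    return var * np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / length**2)

# Hypothetical assay data: response vs. log-dose.
X = np.linspace(-2, 2, 15)
y = np.tanh(X) + 0.1 * rng.standard_normal(15)

noise = 0.1**2
Xs = np.linspace(-2.5, 2.5, 100)                 # test log-doses
K = k(X, X) + noise * np.eye(15)
mean = k(Xs, X) @ np.linalg.solve(K, y)          # posterior predictive mean
cov = k(Xs, Xs) - k(Xs, X) @ np.linalg.solve(K, k(X, Xs))
sd = np.sqrt(np.diag(cov))
print("prediction at log-dose ~0:", mean[50].round(3), "+/-", (2 * sd[50]).round(3))
```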

Estimating the Frequency of Extreme Events in the Presence of Non-Systematic Records, Tom Smith

Supervisors: Simon Shaw, Thomas Kjeldsen, Ilaria Prosdocimi, and Sean Longfield

Extreme flood events can be devastating, so having good estimates of how often floods of a given size might occur at a specified location is of clear importance. However, the systematically-collected river flow time series from which these estimates may be derived are short, being typically just 40-50 years long in the UK. Consequently, the flood frequency estimates have large uncertainties. The systematic record may be extended by utilising non-systematic records such as newspaper reports, photographs, and flood marks carved into buildings. Working with the Environment Agency, Tom is developing methodology to allow these non-systematic records to be routinely used in flood frequency analyses, with a particular focus on the importance of accounting for the many sources of uncertainty that such an analysis involves. He is also investigating the utility of non-systematic records in 'regional' flood frequency analysis, wherein river flow series from hydrologically similar catchments are combined in order to reduce uncertainty. The methodology developed during this research will be applicable to other natural hazards.

Phase 3 clinical trial statistics, Abigail Verschueren

Supervisors: Chris Jennison and Lisa Hampson (Novartis)

Clinical trials are composed of four phases, each of which has a different primary aim. This project focuses on Phase 3, by which point the drug has been deemed safe and the dosage decided, so the focus is on efficacy and futility. The development of pharmaceuticals and medicines across all phases relies heavily on statistical methodology and accuracy, with Phase 3 summarised by a single hypothesis test for the difference in size of treatment effects. Patient safety and well-being are central to the design process. Abigail's project considers group sequential trials, a mechanism that introduces interim analyses and allows a trial to be stopped early for either efficacy or futility. The aspiration is that, overall, fewer patients receive the less effective drug. For the analysis of a clinical trial, a primary endpoint must be specified: this is the measurement of interest that is affected by the drug, and this project focuses on survival, or time-to-event, as the primary endpoint. There has also been copious recent research on "biomarkers", underlying processes in the body that may be predictive or informative of the primary endpoint. Working with Novartis, Abigail is researching a joint model for the two processes and investigating the gain to be made when biomarkers are included in a group sequential trial due to the increase in information.

Students starting in 2018

Adaptive Semi-implicit Semi-Lagrangian (SISL) method for the Shallow Water System, Simone Appella

Supervisor: Chris Budd

Numerical weather prediction is an essential component of weather forecasting and climate modelling. It is based on the design of accurate and efficient numerical schemes to simulate the motion of the ocean and atmosphere. In this context, explicit numerical methods have to satisfy the CFL condition in order to be stable, which imposes a strict time-step restriction. To overcome this limit, the Met Office is currently adopting the Semi-Implicit, Semi-Lagrangian (SISL) method, which permits the use of larger time steps without stability issues. However, many relevant global meteorological phenomena (storms, tsunamis) occur on scales smaller than 25 km, which cannot be efficiently resolved by SISL on a uniform grid. A natural fix is to cluster the mesh points in the vicinity of small features evolving in time. Such adaptive methods, though, are inefficient to use because they are either unstable or require small time steps. This issue can be avoided by coupling them with a SISL method. Simone will investigate the adaptive SISL scheme applied to the shallow water system, which models a shallow atmosphere. He will begin by examining the accuracy and stability of this method in the 1D case; this will then be extended to two and three dimensions based on an optimal-transport moving mesh strategy.
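
To illustrate the semi-Lagrangian idea on its own (without the semi-implicit treatment or adaptivity of the project), the sketch below advects a profile with u_t + a u_x = 0 on a periodic 1D grid by tracing characteristics back to departure points and interpolating; note the scheme runs stably at a Courant number of 4. The grid, speed and time step are illustrative.

```python
import numpy as np

# 1D semi-Lagrangian advection step: interpolate u at departure points.
n, a, dt = 200, 1.0, 0.02                 # grid size, wind speed, time step
x = np.linspace(0.0, 1.0, n, endpoint=False)
u = np.exp(-200 * (x - 0.3) ** 2)         # initial Gaussian profile

def semi_lagrangian_step(u, a, dt):
    x_dep = (x - a * dt) % 1.0            # departure points (periodic domain)
    idx = np.floor(x_dep * n).astype(int) # left grid neighbour of each point
    w = x_dep * n - idx                   # linear interpolation weight
    return (1 - w) * u[idx] + w * u[(idx + 1) % n]

for _ in range(50):                        # Courant number a*dt/dx = 4
    u = semi_lagrangian_step(u, a, dt)
print("peak", u.max().round(3), "at x ~", x[u.argmax()].round(3))
```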

The effects of climate variation on cocoa farming in Ghana, Oluwatosin Babasola

Supervisor: Chris Budd

Cocoa farming in Ghana is affected by a number of different factors including climate variation. These lead to uncertainties in planning for the future, both in the short term (seeding and harvesting) and in the long term (allocation of land use). The planning process can be helped by the construction of appropriate mathematical models and linking these to data on climate variation. Tosin's project involves constructing such models and comparing them with data on cocoa production and local climate variation. The initial work will look at delay differential models for crop production linked to sun and rainfall prediction.

Accessibility percolation in dynamic fitness landscapes, Thomas Bartos

Supervisors: Marcel Ortgiese and Tiffany Taylor

The fitness landscape concept originates in evolutionary biology as a metaphor for the adaptive pathways taken by populations, where fitness is thought of as a height above a multidimensional genotypic space. Such landscapes can be modelled by representing the genotype space as a finite graph and the fitness function as a random vector associated with the vertices of the graph. Researchers have recently made progress in understanding the statistical properties of the number of so-called ‘accessible paths’ - sequences of adjacent vertices of increasing fitness. Thomas is studying how the accessibility properties of a graph are affected by a time-varying fitness function. The first case he is looking at is where the fitness values are resampled according to a Markov chain; to analyse this he is adapting methods from first-passage percolation and dynamic percolation theory. Other situations to be considered include continuously changing landscapes and pairs of interacting landscapes. The aim is also to use the theoretical results obtained to explain the observed patterns of evolution of flagella in a recent experimental study of particular bacterial strains under varying selective pressures.
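
For a feel for accessible paths in the static setting (the time-varying, Markov-resampled case studied in the project is not sketched here), the code below assigns iid uniform fitness to the vertices of the hypercube {0,1}^n and counts shortest paths from the all-zero to the all-one vertex along which fitness strictly increases. The small n and trial count are illustrative.

```python
import itertools
import numpy as np

# Monte Carlo over static random fitness landscapes on the hypercube.
rng = np.random.default_rng(6)
n, trials = 6, 1000

def accessible_paths(fit):
    """Count increasing shortest paths by enumerating coordinate orders."""
    count = 0
    for order in itertools.permutations(range(n)):
        v, prev, ok = 0, fit[0], True     # vertices encoded as bitmasks
        for i in order:
            v |= 1 << i                   # flip coordinate i to one
            if fit[v] <= prev:
                ok = False
                break
            prev = fit[v]
        count += ok
    return count

has_path = 0
for _ in range(trials):
    fit = rng.random(2**n)                # iid Uniform(0,1) fitness values
    has_path += accessible_paths(fit) > 0
print("P(at least one accessible path) ~", has_path / trials)
```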

Monte Carlo Methods for Particle Systems and the Neutron Transport Equation, Tom Davis

Supervisors: Alex Cox and Andreas Kyprianou

The flux of neutrons in a nuclear reactor has traditionally been modelled and understood using deterministic numerical methods. Recent probabilistic treatments have justified the use of stochastic processes to model neutron flux; Monte Carlo methods can then be used to simulate these processes. However, different Monte Carlo estimators suffer from various pitfalls: some techniques have a high per-iteration cost and are very difficult to parallelise, while other estimators have lower cost per individual simulation but an unacceptably high variance. Tom’s research project will investigate ways in which the neutron particle system may be simulated, using Monte Carlo algorithms which are both efficient and amenable to parallelisation. He also hopes to establish convergence, and other theoretical properties, of the resulting algorithms.

Generative Models Applied to Inverse Problems, Margaret Duff

Supervisors: Matthias Ehrhardt and Neill Campbell

An inverse problem is the process of calculating, from a set of observations (the data), the causal factors that produced them (the model parameters). Interesting inverse problems are nearly always ill-posed, meaning that small errors in the data may lead to large errors in the model parameters, and that several possible model parameter values may be consistent with the observations. Addressing this ill-posedness is critical in applications where decision making is based on the recovered model parameters, such as medical imaging. Medical images remain the gold standard for diagnosing many conditions. However, the analysis of medical images raises fundamental issues with the standard "deep learning" approach of training a multi-layer neural network on hundreds of thousands of images. Such algorithms cannot accurately quantify their uncertainty, nor describe the reasoning that led to a given classification of an image. Generative models provide a promising avenue for solving these problems: they implicitly model high-dimensional distributions of data from noisy indirect observations, from which new samples can be generated and estimators calculated. However, in pushing the boundaries of computer science, fundamental mathematics has somewhat been left behind, and this threatens the ability to exploit the many applications of these methods. Margaret’s project aims to fill some of these mathematical gaps.

Stochastic Optimal Control for problems arising in data science, Marco Murtinu

Supervisors: Alex Cox and Kari Heine

Stochastic optimal control is a subfield of control theory that deals with uncertainty, either in the observations or in the noise that drives the system. Its aim is to design a path for the controlled variables that performs the desired task at minimum cost. Marco is studying problems that involve a data collection process and a decision based on the information gathered, such as when to stop or what information to collect next. These problems are related to sequential detection problems and to uncertain stochastic control problems, in which the law of the stochastic process driving the system is not known. Marco plans to investigate how these problems are related and to explore possible applications in data science, where it is fundamental to understand when, during a sampling procedure, enough samples have been collected to make a decision.
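
One classical example of such a stopping problem is Wald's sequential probability ratio test, sketched below as a hedged illustration (it is not claimed to be Marco's setting): observations arrive one at a time, and sampling stops as soon as the accumulated evidence favours one hypothesis strongly enough.

```python
# Wald's SPRT for N(0,1) versus N(1,1) observations.
import math
import random

def sprt(stream, mu0=0.0, mu1=1.0, alpha=0.05, beta=0.05):
    upper = math.log((1 - beta) / alpha)    # cross above: accept H1
    lower = math.log(beta / (1 - alpha))    # cross below: accept H0
    llr = 0.0
    n = 0
    for n, x in enumerate(stream, start=1):
        # Log-likelihood ratio increment for unit-variance Gaussians.
        llr += (mu1 - mu0) * (x - (mu0 + mu1) / 2)
        if llr >= upper:
            return "H1", n
        if llr <= lower:
            return "H0", n
    return "undecided", n

rng = random.Random(2)
data = (rng.gauss(1.0, 1.0) for _ in range(10_000))   # truth: H1
print(sprt(data))   # typically decides H1 after a handful of samples
```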

Statistical applied mathematics for air quality prediction, Laura Oporto

Supervisors: Paul Milewski, Theresa Smith and Gerhard Schagerl

Laura's project involves determining the impact of traffic on local air quality and distinguishing between different sources of pollution. Her project is in collaboration with AVL. Local air quality management can be complex because of the variety of emission sources, such as transport, buildings and industry. Weather conditions, local topography and emissions from distant places may also influence pollutant transport and amplify pollution in certain locations. Her project has three main parts: a deterministic approach, using fluid and chemical modelling to predict pollutant concentrations; a statistical model to analyse data from pollution and traffic monitors; and methods combining the two to calibrate parameters, improve the accuracy of the model, and extend prediction horizons.
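
As a minimal sketch of the deterministic ingredient only (illustrative parameters, not AVL's or Laura's model), the following solves a 1D advection-diffusion equation for a pollutant concentration downwind of a steady roadside source.

```python
# Explicit finite-difference 1D advection-diffusion with a point source.
import numpy as np

nx, L = 200, 1000.0            # 1 km domain
dx = L / nx
u, D = 1.0, 5.0                # wind speed (m/s), eddy diffusivity (m^2/s)
dt = 0.2                       # satisfies the advective and diffusive limits
c = np.zeros(nx)
source = np.zeros(nx)
source[nx // 4] = 1.0          # steady roadside emission

for _ in range(5000):
    dcdx = (c - np.roll(c, 1)) / dx                          # upwind advection
    d2cdx2 = (np.roll(c, -1) - 2 * c + np.roll(c, 1)) / dx**2
    c = c + dt * (-u * dcdx + D * d2cdx2 + source)
    c[0] = c[-1] = 0.0                                       # open boundaries

print("peak concentration:", c.max(), "at x =", c.argmax() * dx, "m")
```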

Faraday wave-droplet dynamics: Stochastic analysis of the droplet trajectory, Eileen Russell

Supervisors: Paul Milewski, Tim Rogers and Carlos Galeano Rios

For a suitably vibrating bath of fluid, a small droplet of the same fluid will “walk" across the surface due to propulsive interactions with the waves generated by previous droplet-bath impacts. When the bath vibrates vertically at a sufficiently large (yet sub-critical) amplitude, the droplet bounces periodically. The bouncing droplet does not make contact with the bath; instead it is propelled back into the air by the cushioning effect of a lubrication layer of air, visible only on a microscopic scale, trapped between the bath and the droplet. As the amplitude increases, the bouncing destabilises and the droplet receives a “kick" in the horizontal direction, causing it to walk along the surface. Increasing the forcing vibration increases the Faraday waves' decay time, giving the droplet a longer path “memory" of previous impacts. In an infinite domain, the walking droplet continues in a straight line at constant speed. If the droplet is confined to a corral, a wavelike statistical pattern emerges from its complex trajectory. Eileen will derive and analyse this pattern using a combination of fluid mechanics and stochastics.
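
Reduced 'path-memory' models of this system are common in the literature; the toy below is one hedged, one-dimensional caricature (not Eileen's model): each bounce deposits a Bessel-function wave, old waves decay geometrically at a rate set by the memory parameter, and the next horizontal kick is proportional to minus the local wave slope.

```python
# Discrete-time 1D path-memory caricature of a walking droplet.
import numpy as np
from scipy.special import j1

k = 1.0          # Faraday wavenumber (illustrative units)
memory = 0.95    # per-bounce wave decay: closer to 1 = longer path memory
kick = 0.2       # strength of the slope-proportional horizontal kick

impacts, amplitudes = [], []
x, positions = 0.0, [0.0]
rng = np.random.default_rng(3)
for _ in range(500):
    # The wave field is a sum of decayed J0 bumps; its slope under the
    # droplet uses d/dx J0(k(x - xi)) = -k * J1(k(x - xi)).
    slope = sum(a * (-k) * j1(k * (x - xi))
                for xi, a in zip(impacts, amplitudes))
    x += -kick * slope + 1e-3 * rng.standard_normal()   # kick + tiny noise
    impacts.append(x)
    amplitudes = [a * memory for a in amplitudes] + [1.0]
    positions.append(x)

speeds = np.abs(np.diff(positions[250:]))
print("mean speed after the transient:", speeds.mean())
```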

N-particle branching random walk, Zsófia Talyigás

Supervisors: Sarah Penington and Matthew Roberts

Branching random walks are well-studied models in probability theory. In particular, the N-particle branching random walk (N-BRW), a branching random walk with selection, was first studied in the physics literature as a stochastic model for front propagation. In the N-BRW, at each time step, N particles have locations on the real line. Each of the N particles has two offspring, which have a random displacement from the location of their parent according to some fixed jump distribution. Then among the 2N offspring particles, only the N rightmost particles survive to form the next generation. The main purpose of Zsófia’s project is to explore the long-term behaviour of this process with different jump distributions. There has been substantial recent progress in this area, but many interesting questions remain, including the behaviour of the system when the jump distribution has stretched exponential tails.
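
The N-BRW dynamics described above translate directly into a few lines of code; in this sketch the jump distribution is taken to be standard exponential purely for illustration.

```python
# Direct simulation of the N-particle branching random walk.
import numpy as np

def nbrw_step(positions, rng):
    # Each particle has two offspring, displaced independently from its parent.
    parents = np.repeat(positions, 2)
    offspring = parents + rng.exponential(scale=1.0, size=parents.size)
    # Selection: only the N rightmost of the 2N offspring survive.
    offspring.sort()
    return offspring[-positions.size:]

rng = np.random.default_rng(4)
N = 1000
positions = np.zeros(N)
front = []
for t in range(200):
    positions = nbrw_step(positions, rng)
    front.append(positions.max())

# After a transient, the particle cloud travels at a roughly constant speed.
print("empirical front speed:", (front[-1] - front[100]) / (len(front) - 101))
```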

Prediction of Biological Meta-Data using DNA Sequences, Jordan Taylor

Supervisors: Sandipan Roy, Matt Nunes and Lauren Cowley

A human base-paired DNA sequence is approximately 3 billion letters in length and consists of the letters “G”, “A”, “T”, and “C”. Given the multitude of possible combinations, it is of interest to ask whether, from a much smaller set of sub-sequences, one can infer associated meta-data, such as the country in which an individual caught a particular bacterial infection. The bacteria of interest initially include Shiga Toxigenic Escherichia coli and Salmonella. Although machine learning methods have recently been shown to provide state-of-the-art results in many high-dimensional domains, many of these methods are seen as “black boxes” with little interpretability. Jordan is investigating explainable machine learning and statistical methods in the context of both prediction and ranking of the significance of the sub-sequences.
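
One standard, interpretable baseline for this kind of task (an assumption for illustration, not necessarily Jordan's method) is to represent each genome by its k-mer counts and fit an L1-penalised logistic regression, whose non-zero coefficients both predict and rank the informative sub-sequences. The sketch below, assuming scikit-learn is available, plants a motif in synthetic genomes and recovers it.

```python
# k-mer features + sparse logistic regression on synthetic genomes.
import itertools
import numpy as np
from sklearn.linear_model import LogisticRegression

K = 3
KMERS = ["".join(p) for p in itertools.product("GATC", repeat=K)]
INDEX = {kmer: i for i, kmer in enumerate(KMERS)}

def kmer_counts(seq):
    counts = np.zeros(len(KMERS))
    for i in range(len(seq) - K + 1):
        counts[INDEX[seq[i:i + K]]] += 1
    return counts / max(len(seq) - K + 1, 1)   # normalised frequencies

rng = np.random.default_rng(5)
def random_genome(n, enriched=None):
    seq = "".join(rng.choice(list("GATC"), size=n))
    return seq + (enriched * 20 if enriched else "")   # planted signal

# Label-1 genomes carry extra copies of the motif "GAT".
X = np.array([kmer_counts(random_genome(2000, "GAT" if label else None))
              for label in (0, 1) * 100])
y = np.array([0, 1] * 100)

model = LogisticRegression(penalty="l1", solver="liblinear", C=1.0).fit(X, y)
top = np.argsort(-np.abs(model.coef_[0]))[:3]
print("most informative k-mers:", [KMERS[i] for i in top])
```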

Antagonistic coevolution in multi-species interactions, Jason Wood

Supervisors: Ben Ashby and Nicholas Priest

Jason's initial aim is to create models for how the coevolutionary dynamics of hosts and parasites are changed by the addition of a hyperparasite. Host-parasite-hyperparasite systems are often found in nature (e.g. viruses which infect bacteria, or hyperparasitic fungi) and can have medical or industrial applications, for instance phage therapy. This work will then be extended to focus on vector-borne diseases.
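
A minimal caricature of such a system (purely illustrative, not Jason's model) is a Lotka-Volterra-style set of ODEs for a host H, a parasite P attacking it, and a hyperparasite Q attacking the parasite:

```python
# Toy host-parasite-hyperparasite ODE system, integrated with scipy.
import numpy as np
from scipy.integrate import solve_ivp

def rhs(t, state, r=1.0, K=10.0, a=0.5, b=0.4, c=0.3, d=0.25, m=0.2, n=0.2):
    H, P, Q = state
    dH = r * H * (1 - H / K) - a * H * P      # logistic host, losses to parasitism
    dP = b * a * H * P - c * P * Q - m * P    # parasite gains, hyperparasitism, death
    dQ = d * c * P * Q - n * Q                # hyperparasite gains, death
    return [dH, dP, dQ]

sol = solve_ivp(rhs, (0, 200), [5.0, 1.0, 0.5], rtol=1e-8)
H, P, Q = sol.y[:, -1]
print(f"long-run densities: host={H:.2f}, parasite={P:.2f}, hyperparasite={Q:.2f}")
```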

Hybrid methods for modelling the cell-division cycle, Josh Young

Supervisor: Kit Yates

Biological systems exhibit a tremendously wide variety of behaviours at many different spatial scales. While in theory the behaviour of a system at any scale can be viewed as emergent from the behaviour of its smallest components, deriving these scale relationships is analytically intractable except in very carefully constructed examples, and numerical methods based purely on a system's microscopic behaviour quickly become cost-prohibitive as the number of atomic components increases. To achieve computational feasibility, different modelling regimes are used at different scales, at the cost of information loss in the coarser representations. Josh's work is concerned with hybrid methods, which combine multiple regimes to balance the advantages and disadvantages of each. In particular, he aims to construct a hybrid method based on a model of the cell-division cycle as a reaction-diffusion system, building upon previous hybrid methods such as the pseudo-compartment method.
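
A stripped-down example of the hybrid idea (much simpler than the pseudo-compartment method itself, and purely illustrative) is sketched below for pure diffusion on a row of compartments: the left half is treated as a deterministic mean-field model, the right half as a stochastic particle model using binomial tau-leaping, with mass exchanged across the interface.

```python
# Hybrid deterministic/stochastic diffusion on a 1D row of compartments.
import numpy as np

M, I = 40, 20                    # total compartments; interface at index I
d, dt = 0.5, 0.02                # inter-compartment jump rate, time step
u = np.zeros(I); u[0] = 500.0    # continuous mass, deterministic region
n = np.zeros(M - I, dtype=int)   # particle counts, stochastic region
carry = 0.0                      # accumulates deterministic flux into particles
rng = np.random.default_rng(6)

for _ in range(5000):
    # --- stochastic region: binomial tau-leap of left/right jumps ---
    movers = rng.binomial(n, 2 * d * dt)
    left = rng.binomial(movers, 0.5)
    right = movers - left
    n -= movers
    n[1:] += right[:-1]          # right jumps go one compartment up
    n[-1] += right[-1]           # reflecting right-hand wall
    n[:-1] += left[1:]           # left jumps go one compartment down
    u[-1] += left[0]             # left jumps from n[0] cross the interface
    # --- deterministic region: explicit Euler diffusion ---
    flux_out = dt * d * u[-1]    # mass leaving towards the stochastic side
    lap = np.zeros(I)
    lap[1:] += u[:-1]; lap[:-1] += u[1:]
    lap -= 2 * u; lap[0] += u[0]   # reflecting left wall
    lap[-1] += u[-1]               # no deterministic neighbour on the right
    u += dt * d * lap
    u[-1] -= flux_out
    carry += flux_out
    whole = int(carry)           # convert accumulated flux into whole particles
    carry -= whole
    n[0] += whole

print("mass left/right of interface:", u.sum() + carry, n.sum())
```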

On-line drill system parameter estimation and hazardous event detection, Dan Burrows

Supervisors: Kari Heine, Mark Opmeer and Inês Cecilio

Dan's research, in collaboration with Schlumberger, develops statistical methods for the automatic detection of hazardous events in oil and gas drilling operations. Initially, two particular hazardous events are considered. The first, washout, is the appearance of a hole in the drill pipe, which may compromise the safety and efficiency of the operation as well as the integrity of the equipment. The second, mud loss, is the loss of drilling fluid from the drill system through leakage into the surrounding rock formation. As the project progresses, more complex scenarios will be considered, involving multiphase flow, influx of gas from the formation, accumulation of rock cuttings around the drill pipe, wear of the drill bit, plugged bit nozzles, or degradation of the motor. The initial one-dimensional model could also be extended to two or three dimensions for increased accuracy.
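
A standard change-detection scheme for this kind of streaming signal is the one-sided CUSUM statistic; the sketch below applies it to a synthetic return-flow signal whose mean drops, a crude stand-in for mud loss. It is not claimed to be Dan's method, and all parameters are illustrative.

```python
# One-sided CUSUM detector for a downward shift in a streaming signal.
import numpy as np

def cusum_alarm(signal, target_mean, drift=0.5, threshold=8.0):
    s = 0.0
    for t, x in enumerate(signal):
        # Accumulate evidence that the mean has dropped below target_mean.
        s = max(0.0, s + (target_mean - x) - drift)
        if s > threshold:
            return t
    return None

rng = np.random.default_rng(7)
flow = np.concatenate([rng.normal(10.0, 1.0, 300),    # normal operation
                       rng.normal(7.0, 1.0, 200)])    # 'mud loss' from t=300
print("alarm raised at t =", cusum_alarm(flow, target_mean=10.0))
```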

Bayesian inference for low-resolution Nuclear Magnetic Resonance in porous media, Michele Firmo

Supervisors: Silvia Gazzola, Tony Shardlow and Edmund Fordham

Nuclear Magnetic Resonance is used to infer properties of porous media, such as rocks, through which oil can be extracted. Michele's research project aims to surpass the current standard inference methodology by providing uncertainty estimates alongside state estimates in an efficient manner and to develop the technique for shales. Working with Schlumberger, this will be achieved through reformulating the problem in a Bayesian framework and applying tools from numerical linear algebra.
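
In the simplest linear-Gaussian version of this idea (an assumption for exposition, not the project's algorithm), the relaxation data are a discretised Laplace-type transform of the T2 relaxation-time distribution, and a Gaussian prior gives closed-form posterior means and pointwise uncertainties:

```python
# Linear-Gaussian Bayesian inversion of toy NMR relaxation data.
import numpy as np

t = np.linspace(0.01, 3.0, 100)                 # echo times
T2 = np.logspace(-2, 1, 60)                     # relaxation-time grid
A = np.exp(-t[:, None] / T2[None, :])           # forward map g = A f

f_true = np.exp(-0.5 * ((np.log10(T2) + 0.5) / 0.2) ** 2)   # one T2 peak
rng = np.random.default_rng(8)
sigma = 0.01
y = A @ f_true + sigma * rng.standard_normal(t.size)

tau = 1.0                                       # prior standard deviation on f
# Posterior for y = A f + noise with prior f ~ N(0, tau^2 I):
precision = A.T @ A / sigma**2 + np.eye(T2.size) / tau**2
cov = np.linalg.inv(precision)
f_mean = cov @ A.T @ y / sigma**2
f_std = np.sqrt(np.diag(cov))                   # pointwise uncertainty

print("posterior mean peak at T2 ≈", T2[f_mean.argmax()])
print("typical pointwise posterior std:", f_std.mean())
```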

Mathematical modelling of formulation composition trade-offs for pesticides, Jenny Delos Reyes

Supervisors: Jane White, Begona Delgado-Charro and Josh Fernandes

Creating validated mathematical models that can inform the process of risk assessment during pesticide product development is an industry-wide aspiration. It is particularly challenging given the wide range of formulations that may be used to produce new pesticides and the complexity of developing products that have good foliar uptake but poor dermal absorption. Working with Syngenta, Jenny is developing and analysing a series of spatially explicit mathematical models for membrane penetration parameterised using existing data sets. The impact of formulation products is explored in relation to their physicochemical properties in an attempt to categorise formulation impact across the two membranes. The models will subsequently be combined and analysed within a novel optimisation framework which should highlight the key parameter groupings responsible for good foliar uptake and poor dermal absorption based on existing data sets.

Students starting in 2019

The role of precursors of active regions in space weather forecasting: reliably predicting CMEs and SEPs before their occurrence with the help of machine learning, Tina Zhou

Supervisors: Chris Budd, Apala Majumdar, Silvia Gazzola and Tom Fincham Haines.

The Met Office produces real-time operational space weather forecasts: severe space weather has appeared on the UK National Risk Register since 2011. Space weather can have major impacts on UK and international critical infrastructure (e.g. the electricity grid, satellites, aviation, Global Navigation Satellite System (GNSS) positioning, navigation and timing, and radio communications) and on human health. The Met Office is constantly striving to produce improved space weather forecasts to meet their customers’ needs, but this can be very challenging. The difficulty stems from the uncertainty in solar activity and in the arrival time of an observed event, which may reach Earth within minutes or take up to a few days. The focus of this project is to develop improved space weather forecasts based on the analysis of solar and near-Earth space observation data. New data from the Parker Solar Probe will be available for research and are expected to yield many new results. Part of the approach will use machine learning to train suitable models for better forecasts. There is also the possibility of combining methodologies from applied probability and statistics (for example, time series analysis, Markov processes and Bayesian simulation) with mathematical modelling methods from numerical and stochastic PDEs. Most of these aspects are new and build on several recent technological breakthroughs. The aim of the project is to establish new methods for space weather forecasting and to draw new conclusions from the observations these technologies provide.

Design principles for active solids, Guido Baardink

Supervisor: Anton Souslov

Metamaterials are designed materials whose microscopic geometry is carefully engineered to achieve macroscopic properties not readily found in nature. Guido's research focuses on mechanical metamaterials, modelled as large ball-spring networks, and aims to find general design principles for modifying the geometry and topology of the network to attain a variety of unusual mechanical responses. In particular, he is interested in how introducing active mechanical components (e.g., motors that inject energy) into the network can lead to novel materials for applications in such areas as shock absorption and mechanical actuation.
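
A natural passive starting point for such models (illustrative only, before any active components are added) is the dynamical matrix of a one-dimensional ball-spring chain, whose normal-mode frequencies follow from numerical linear algebra and match the textbook formula:

```python
# Normal modes of a 1D ball-spring chain with fixed ends.
import numpy as np

N, k, m = 20, 1.0, 1.0                 # number of masses, spring constant, mass
D = np.zeros((N, N))                   # dynamical matrix
for i in range(N):
    D[i, i] = 2 * k / m
    if i > 0:
        D[i, i - 1] = -k / m
    if i < N - 1:
        D[i, i + 1] = -k / m

omega2, modes = np.linalg.eigh(D)      # eigenvalues are squared frequencies
print("lowest normal-mode frequencies:", np.sqrt(omega2[:3]))
# Exact values for comparison: 2*sqrt(k/m)*sin(pi*j/(2*(N+1))), j = 1, 2, 3.
print("theory:", [2 * np.sin(np.pi * j / (2 * (N + 1))) for j in (1, 2, 3)])
```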