Dissertation projects for the MSc by Research in Astronomy and Astrophysics
Below are some of the MSc(R) projects for the coming year. However, we encourage prospective MSc(R) students to speak with as many of our academic staff as possible, as early as possible, to find out what we do and whether other MSc(R) projects are available - this list is unlikely to be comprehensive, and project descriptions can quickly become out of date.
Prospective students should contact potential supervisors when they apply for admission. Usually, projects are allocated when offers are made, but they can be changed within the first weeks of the programme. Please contact the MSc(R) course director, Laura Wolz, if you have any questions.
Supervisor: Michael Wilensky, Phil Bull
In this era of precision cosmology, multiple independent measurements of the same physical quantity can sometimes throw up surprises -- particularly when the measurements disagree with one another at high significance. An example of such a "cosmic tension" involves the Hubble parameter, measured directly by observations at low redshift, and inferred via model fitting to high-redshift data from the Cosmic Microwave Background (e.g. the paper "Hubble constant hunter's guide"). A large and statistically significant discrepancy is observed between the high- and low-redshift measurements, which differ by several km/s/Mpc, with error bars of approximately 1 km/s/Mpc.
Understanding the cause of this tension, and others like it, is a difficult problem, as there are many possible explanations, very few of which seem to offer a compelling solution. One problem is to understand which measurements are the most 'trustworthy', and which are more likely to be affected by systematic errors that bias their reported values. The aim of this project is to apply a so-called 'jackknife' technique to measurements from a variety of sources and methods to try to understand which subsets are statistically consistent with one another, with the goal of identifying which data are more or less reliable, and whether a tension really exists at all. The technique itself uses a Bayesian statistical framework to systematically study different subsets ('splits') of the data, allowing us to compare their properties in a well-posed, statistically rigorous way.
This project will primarily be carried out using a purpose-designed Python library, which will only need minor modifications. The main programming component will involve writing code to generate simple simulations with suitable statistical properties, which can be compared with results from the Python library that applies the Bayesian jackknife technique to input data. You will need to write scripts/Python notebooks to generate the simulated data, run the Bayesian jackknife test on both real and simulated data, and visualise the information it outputs in various ways. You will also need to collate data from various sources and perform a literature review to try to identify possible solutions to the tension that match what you observe in the test outputs. In addition to Hubble parameter measurements, other astrophysical quantities can be studied in the same manner, including measurements of other cosmological parameters.
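As a flavour of the kind of simulation and consistency test involved, the sketch below (purely hypothetical numbers, and a much cruder frequentist check than the Bayesian jackknife library itself) draws two sets of H0-like measurements around different central values and asks how discrepant the two splits are:

```python
import numpy as np

rng = np.random.default_rng(42)

# Purely hypothetical numbers: simulate two sets of H0-like measurements
# (km/s/Mpc) drawn around different central values, mimicking a "tension".
low_z = rng.normal(73.0, 1.0, size=6)    # e.g. distance-ladder style values
high_z = rng.normal(67.4, 0.6, size=4)   # e.g. CMB-inferred style values

def weighted_mean(x, sigma):
    """Inverse-variance weighted mean and its uncertainty."""
    w = 1.0 / sigma**2
    return np.sum(w * x) / np.sum(w), 1.0 / np.sqrt(np.sum(w))

m1, e1 = weighted_mean(low_z, np.full_like(low_z, 1.0))
m2, e2 = weighted_mean(high_z, np.full_like(high_z, 0.6))

# A crude consistency check between the two "splits": the number of
# sigma by which their weighted means differ.
n_sigma = abs(m1 - m2) / np.hypot(e1, e2)
print(f"split means: {m1:.2f} vs {m2:.2f} km/s/Mpc ({n_sigma:.1f} sigma apart)")
```

The Bayesian jackknife replaces this single two-way comparison with a systematic, probabilistic scan over many possible splits of the data.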
Supervisor: Anna Scaife
Standard neural networks optimise point-wise estimates of the weights of each of their neurons. In contrast, Bayesian neural networks assign a probability distribution to each weight instead of a single value or point estimate. These probability distributions represent the uncertainty in the weights and are propagated through the network to provide an uncertainty on the final predictions. This has been reported to reduce over-fitting to specific training data sets and to allow such networks to train more general models from fewer data samples. However, previous applications of fully Bayesian neural networks to radio galaxy classification have shown that such models suffer from a "cold posterior" problem, where the complexity cost in the loss function needs to be down-weighted in order for the model to perform well. One possible reason for this issue is that the likelihood has been mis-specified in the model due to the way in which the training data are labelled. Such likelihood misspecification can be a consequence of "consensus" labelling, where the labels on the training data have been assigned as average or modal values from a set of individual expert labels.
In this project the student will create a tool to relabel the MiraBest radio galaxy data set using individual information from a range of experts, and use those labels to examine whether the cold posterior problem previously observed when training with this data set can be explained or mitigated by retaining the full set of individual labels, rather than using the consensus values to label the data. This project will require good computing skills; previous experience with Python/PyTorch is desirable.
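The core idea of weight distributions can be illustrated with a toy example (not the project's actual PyTorch model): a single "Bayesian" weight described by a Gaussian rather than a point estimate, whose uncertainty propagates into the predictions by sampling.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy illustration: a single "Bayesian" weight described by a Gaussian
# N(mu, sigma^2) rather than a point estimate. Prediction uncertainty
# follows from sampling the weight distribution.
mu, sigma = 2.0, 0.3     # assumed variational parameters for one weight
x = 1.5                  # a single input feature

weight_draws = mu + sigma * rng.standard_normal(10_000)
preds = weight_draws * x          # one forward pass per weight draw

print(f"mean prediction : {preds.mean():.2f}")
print(f"predictive std  : {preds.std():.2f}")  # approximately sigma * |x|
```

In a full Bayesian neural network every weight carries such a distribution, and the "complexity cost" mentioned above is the KL-divergence term in the variational loss that regularises these distributions towards the prior.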
Supervisor: Phil Bull
Cosmic Dawn - the time when the first stars and galaxies switched on, a few hundred million years after the Big Bang -- remains shrouded in mystery. This is mainly due to the difficulty of making observations of this period, as bright sources are rare and very far away. A promising observational probe that can access this part of cosmic history is the 21cm line from neutral hydrogen, which is strongly redshifted to radio frequencies of around 70 - 100 MHz by the time it reaches us, and is relatively faint. Unfortunately, it is also heavily obscured by the bright "foreground" radio emission from our own Galaxy and others nearby. By observing at these frequencies with a very precisely calibrated radio spectrometer, we can hope to disentangle the foregrounds from the 21cm signal, and observe how its intensity, averaged over the sky, varies with redshift. In turn, this variation should allow us to infer which astrophysical processes were responsible for heating the intergalactic medium and contributing to the formation of the first galaxies ("Constraining the unexplored period between reionization and the dark ages with observations of the global 21 cm signal").
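For reference, the observed frequency of the redshifted 21cm line is nu_obs = 1420.406 MHz / (1 + z), so the 70 - 100 MHz band corresponds to redshifts of roughly 13 to 19:

```python
# The 21cm line is emitted at 1420.406 MHz; cosmological expansion stretches
# it to nu_obs = 1420.406 / (1 + z) MHz, so each observing frequency maps to
# a unique redshift.
NU_REST_MHZ = 1420.406

def redshift_at(nu_obs_mhz: float) -> float:
    return NU_REST_MHZ / nu_obs_mhz - 1.0

for nu in (70.0, 100.0):
    print(f"{nu:5.1f} MHz corresponds to z = {redshift_at(nu):.1f}")
```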
We are currently in the process of developing an experiment called RHINO that can make measurements of this "21cm global signal". It is deliberately designed to have very different instrumental properties to other experiments that target the same signal (e.g. "An absorption profile centred at 78 megahertz in the sky-averaged spectrum", "On the detection of a cosmic dawn signal in the radio background", "The REACH radiometer for detecting the 21-cm hydrogen signal from redshift 7.5 to 28"). As such, we need to make new simulations of the radio antenna and its performance, test different calibration and foreground removal strategies, monitor radio frequency interference, and build a prototype antenna and receiver system.
This project will give you the option of working on simulations, theoretical predictions, and/or hardware design and testing, according to your interests. You should ideally have some familiarity with Python or Matlab.
Supervisors: Albert Zijlstra, Iain McDonald
The Gaia satellite is in the process of mapping the precise positions and motions of nearly every star in the Milky Way. It measures the parallax of each star and uses this to obtain an accurate distance. The Gaia DR3 catalogue (2022) contains nearly 2 billion stars. An example of its use is the 'Our solar neighbourhood' video.
This project aims to create accurate 3-d HR diagrams (the third dimension being location in the Milky Way) by deriving the temperature and luminosity of each star in this catalogue. This requires combining the available photometry for each star from other catalogues, and fitting model atmospheres to the spectral energy distribution.
The results can be used to identify stars of particular interest, for instance stars with circumstellar dust from their formation or their final evolution, or for the study of interstellar extinction. Depending on the interests of the student, the focus can be on software development or on the analysis.
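A toy version of the temperature/luminosity step might look like the following (all numbers invented, and a blackbody standing in for the model atmospheres the real pipeline would fit): the parallax gives the distance, the integrated flux then gives the luminosity, and a flux ratio in two bands constrains the effective temperature.

```python
import numpy as np

# Toy pipeline step: Gaia-style parallax + integrated flux -> luminosity,
# and a two-band flux ratio -> blackbody temperature. Illustrative only.
H, C, KB = 6.626e-34, 2.998e8, 1.381e-23   # SI physical constants
PC = 3.086e16                               # metres per parsec
L_SUN = 3.828e26                            # solar luminosity, W

def planck(wavelength_m, T):
    """Planck spectral radiance (up to a constant factor)."""
    x = H * C / (wavelength_m * KB * T)
    return 1.0 / (wavelength_m**5 * np.expm1(x))

parallax_mas = 100.0                   # hypothetical nearby star
d = 1000.0 / parallax_mas * PC         # distance in metres (here 10 pc)
f_bol = 2.0e-9                         # apparent bolometric flux, W/m^2
L = 4 * np.pi * d**2 * f_bol
print(f"L = {L / L_SUN:.2f} L_sun")

# Temperature from a blue/red flux ratio; the "observed" ratio is faked
# from a 5800 K blackbody, then recovered by a brute-force grid search.
ratio_obs = planck(450e-9, 5800.0) / planck(650e-9, 5800.0)
grid = np.linspace(3000.0, 10000.0, 7001)
T_best = grid[np.argmin(np.abs(planck(450e-9, grid) / planck(650e-9, grid)
                               - ratio_obs))]
print(f"T_eff ~ {T_best:.0f} K")
```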
Supervisor: Chris Conselice
This project will focus on discovering the earliest galaxies that formed in the universe and measuring their basic properties with the James Webb Space Telescope (JWST). Astronomers have long sought to find the earliest galaxies and to use them to determine how galaxies first formed. We can currently see galaxies back to when the universe was around 500 Million years old. We are, however, still learning about these galaxies and their basic properties such as their masses, sizes, and luminosities.
This project will use new and simulated data to search for and locate massive galaxies in the first billion years of the universe's history using JWST imaging. The project will involve using this imaging to discover these systems and then to make catalogues of objects. From this it will be possible to measure the masses of these galaxies and their number densities, putting constraints on the formation of galaxies and their role in the evolution of the universe.
Once we have identified the candidate distant galaxies using the Webb imaging, the student will characterise these systems in terms of the number of galaxies, their sizes, star formation rates, and masses. From these data, we can compare with theory to determine the processes driving the earliest galaxy formation.
This project is computational, and will preferably be carried out in Python.
Supervisor: Neal Jackson
Gravitational lenses are systems in which a background object is multiply imaged by the action of the gravitational field of a foreground galaxy. Gravitational lensing is important because it allows us magnified views of distant objects in the Universe, and also because it allows us to investigate mass distributions in galaxies independent of their light emission.
We are currently involved in a number of projects with major radio facilities (e-MERLIN and LOFAR) and planning for future surveys, including Euclid. Accordingly, there are a number of areas in which students could become involved:
- We are conducting LOFAR observations of a number of gravitational lenses to explore the properties of lensing galaxies. Because lenses give us multiple lines of sight through the galaxy, this allows us to deduce the effect of the lensing galaxy on the radio signal that propagates through it.
- We are currently conducting a survey called LBCS which provides calibrators for the analysis of the long baselines of LOFAR, which use the international stations outside the Netherlands. There are opportunities to get involved with the development of interferometric pipelines for the reduction of LOFAR-LB data.
- We have a number of projects involving radio observations of both radio-loud and radio-quiet lenses in order to study both lensing galaxies and the lensed sources (the latter are visible thanks to the magnification of the lensing galaxy).
- We are involved in a science working group of Euclid which is investigating the use of Euclid-discovered gravitational lenses for the study of galaxy evolution. The student will assist with the simulation of the scientific output from such a survey.
Intensity mapping of neutral hydrogen gas (HI) is a new type of cosmological survey for observing the matter distribution. It uses the integrated 21cm emission of the HI gas at radio wavelengths to map the largest scales of the matter distribution with radio telescopes. The resulting HI intensity maps trace the underlying structure and can therefore be used to test cosmology, for example by measuring the expansion rate of the Universe via Baryon Acoustic Oscillation (BAO) detection.
The South African Square Kilometre Array pathfinder MeerKAT is an interferometric array consisting of 64 dishes, which can be used in so-called single dish mode for HI intensity mapping observations. There are many observational challenges to overcome in order to detect the HI signal, such as subtracting astrophysical foregrounds and terrestrial radio frequency interference (RFI). In this project, we will examine the RFI present in the pilot HI intensity mapping data from MeerKAT in order to understand its statistical properties. We will then model the RFI in telescope simulations and test different methods to subtract them.
We will investigate the impact of the RFI properties and removal methods on the resulting HI intensity maps and the cosmological constraints.
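As a schematic of one RFI-characterisation step (a toy flagger, not the project's actual pipeline), narrow-band interference can be identified as channels that deviate from a robust median/MAD estimate of the bandpass by many sigma:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy example: flag channels whose power deviates from a smooth bandpass by
# many robust standard deviations, as narrow-band RFI typically does.
n_chan = 1024
spectrum = 10.0 + 0.5 * rng.standard_normal(n_chan)  # noise-like bandpass
rfi_channels = [100, 101, 512, 700]                  # injected interference
spectrum[rfi_channels] += 50.0

med = np.median(spectrum)
mad = np.median(np.abs(spectrum - med))
robust_sigma = 1.4826 * mad       # MAD -> Gaussian-equivalent sigma
flags = np.abs(spectrum - med) > 5 * robust_sigma

print(f"flagged {flags.sum()} channels: {np.flatnonzero(flags)}")
```

The median/MAD estimator is used here because, unlike the mean and standard deviation, it is barely perturbed by the very outliers one is trying to flag.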
Some programming experience in Python or similar is essential; a background in radio astronomy is desirable.
Supervisor: Rowan Smith
Over the last decade our understanding of the interstellar medium of the Milky Way has been transformed by the discovery that it takes the form of a continuous hierarchy of filaments, from scales of kilo-parsecs down to sub-pc scales. This profoundly shapes how gas is transformed into stars. Now the PHANGS-ALMA survey allows us to study the molecular gas of external galaxies at 1” resolution, meaning that for the first time we can investigate the behaviour of filamentary gas in other galactic systems.
In this project you will analyse data from the PHANGS-ALMA survey of nearby galaxies to identify filamentary structures, characterise their lengths and masses, and compare them to the empirical scaling laws identified in the Milky Way by a major review co-authored by the supervisor. This data set has not yet been analysed in this way, and you will work closely with other members of the international PHANGS collaboration. The analysis will use the FILFINDER technique to identify the filaments, and subsequent original analysis will be carried out using Python Jupyter notebooks. The final results will be compared to simulated data from the supervisor's CLOUDFACTORY simulations, so the project is suitable for those interested in both theoretical and observational approaches to astrophysics.
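Schematically, the length and mass measurements behind such scaling laws reduce to bookkeeping on a filament "skeleton" mask and a surface-density map (invented numbers below; the real analysis uses FILFINDER on PHANGS-ALMA maps):

```python
import numpy as np

# Toy measurement: given a filament skeleton mask and a surface-density map,
# estimate length, mass, and the mass-per-unit-length used in Milky Way
# scaling relations. All values invented for illustration.
pix_pc = 0.5                        # parsec per pixel at the galaxy distance
ny, nx = 64, 64
coldens = np.full((ny, nx), 5.0)    # Msun/pc^2 diffuse background
skeleton = np.zeros((ny, nx), dtype=bool)
skeleton[32, 10:50] = True          # a straight 40-pixel filament spine
coldens[30:35, 10:50] = 50.0        # enhanced surface density on the filament

length_pc = skeleton.sum() * pix_pc
mass_msun = coldens[30:35, 10:50].sum() * pix_pc**2   # sum over filament area
print(f"length = {length_pc:.1f} pc, "
      f"M/L = {mass_msun / length_pc:.0f} Msun/pc")
```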
Looking for the techno-signatures of Advanced Technical Civilisations via anomalies in astronomical data
Supervisor: Michael Garrett, Andrew Siemion (UCB)
This project will search for anomalies in publicly available astronomical data that may be generated by the activities of an energy-intensive extraterrestrial civilisation. These so-called "techno-signatures" may take various forms, but we will focus on unusual excess emission in the infrared domain due to the generation of waste heat, in addition to other outliers, e.g. discrepancies between spectroscopic and astrometric distances, excess stellar emission in the radio domain, etc.
It may be possible to introduce some Machine Learning and AI aspects into the project. We will choose a well-defined topic with the aim of publishing the results in a refereed journal (see: Extending the Breakthrough Listen nearby star survey to other stellar objects in the field for a recent example).
Prior knowledge: it would be useful if the student has some background in astronomy and astrophysics but this is not essential. The student should be prepared to use software tools to interrogate large astronomical data sets, and to write python scripts to analyse the data.
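As a flavour of the kind of script involved (simulated catalogue values, not real data), anomaly candidates can be flagged as extreme outliers in an infrared-excess measure relative to the well-behaved population:

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy anomaly search: flag sources whose infrared "excess" (observed minus
# model-predicted flux, in magnitudes) is an extreme outlier relative to the
# population. Catalogue values here are simulated.
excess = rng.normal(0.0, 0.05, size=5000)  # ordinary stars
excess[1234] = 0.8                         # one injected waste-heat-like outlier

med = np.median(excess)
mad = np.median(np.abs(excess - med))
z = (excess - med) / (1.4826 * mad)        # robust z-score per source
candidates = np.flatnonzero(np.abs(z) > 6)
print(f"anomaly candidates: {candidates}")
```

Real searches would of course have to separate genuine anomalies from mundane astrophysical explanations (dusty stars, blends, bad photometry) before anything could be claimed.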
Supervisor: Eamonn Kerins
Two commonly employed modes used in the Search for Extra-terrestrial Intelligence (SETI) are survey-mode SETI, where large areas of sky are swept for possible signals, and targeted SETI, where specific planetary systems are monitored. A potential "smart strategy" for targeted SETI is to target transiting planets from where Earth can also be seen as a transiting planet.
If an intelligent observer is present on the other planet then both they and we can know that we can see each other's potentially habitable planet.
This mutually detectable situation is a game theory "focal point" that could incentivize one civilization to send a signal to the other. In this project you will compute the frequency and characteristics of transits from potentially habitable planets close to the Earth's ecliptic plane. Your calculation will be based on the best currently available information you can find on the demographic distribution and characteristics of exoplanets. Your results will form the basis for designing a transit detection survey that would deliver an exoplanet sample optimized for targeted-SETI monitoring. This project is computational and so is ideal for someone who enjoys writing code in Python or other common programming languages.
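The geometry underlying this strategy is easy to sketch (illustrative Python): the ratio R_sun/1 AU sets both the half-width of the band around the ecliptic from which Earth is seen to transit, and the probability that a randomly oriented planet transits its own star.

```python
import numpy as np

# Mutual-detectability geometry: a distant observer sees Earth transit the
# Sun only if they lie within ~arcsin(R_sun / 1 AU) of the ecliptic plane;
# conversely, a randomly oriented planet at semi-major axis a transits its
# star of radius r_star with probability ~r_star / a.
R_SUN_AU = 0.00465   # solar radius in astronomical units

band_deg = np.degrees(np.arcsin(R_SUN_AU))
print(f"Earth transit zone half-width ~ {band_deg:.3f} deg")

def transit_prob(r_star_au: float, a_au: float) -> float:
    """Geometric transit probability for a randomly oriented orbit."""
    return r_star_au / a_au

print(f"Earth-analogue transit probability: {transit_prob(R_SUN_AU, 1.0):.4f}")
```

This is why targeted-SETI samples drawn from the ecliptic band are so small and precious: only about one star in two hundred sees any given planet transit at all.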
Supervisor: Paddy Leahy
POSSUM is one of several surveys being conducted with the ASKAP telescope, which is revolutionary in that it is the first aperture synthesis telescope equipped with phased array feeds (“PAFs”), essentially the radio equivalent of a CCD chip. These give it a very large field of view for a high-frequency telescope, about 36 square degrees - roughly 150 times more sky area than is visible to the VLA at any one time. This makes it ideal for large, deep radio surveys, and POSSUM will cover 75% of the sky, in full polarization and over a 300 MHz bandwidth.

POSSUM is an international project and is currently in its pilot phase. Its aim is to measure Faraday rotation, which depends on the magnetic field between the radio source and the Earth. There are many specific science goals, but this MSc project is to study the magnetic field immediately surrounding the radio lobes of radio galaxies. These lobes are bubbles of relativistic plasma embedded in the 100-million-kelvin gas that surrounds galaxies, especially in clusters of galaxies. We can isolate the magnetic fields in this gas from the rest of the line of sight because the field direction and strength, and hence the amount of Faraday rotation, vary with position across the lobes. The best targets are the radio lobes with the largest apparent size, > 5 arcmin, for which ASKAP gives detailed maps. Typically there are only one or two objects this large in each ASKAP field. In 2021-22 POSSUM is carrying out “Phase II pilot” observations of about 10 fields, which should yield 10 to 15 objects to analyse for the project. Routine data analysis is done automatically for the whole survey, so you will start with calibrated images. Your job is:
- To find a well-defined group of targets to analyse by inspecting the images: these objects are so big that they don’t show up in automatic catalogues; they tend to be registered as several different sources and are most easily recognised by eye. The images are BIG, so this will take a week or two, plus probably a week or so of training.
- Do some literature research on each object: most will be known AGN, with basic data catalogued such as the magnitude of the host galaxy and, in many cases, a redshift and therefore a distance. Typically not much is known about the radio source itself, although there are some exceptions that have been the subject of earlier studies.
- For each target, extract a cut-out from the data “cube” of maps at 288 different frequencies, including the target and some empty sky surrounding it to analyse the noise properties, and run the POSSUM RMTools code to make a Faraday analysis of the cutout. This yields maps showing the Faraday Rotation Measure (RM) in each pixel, along with estimates of the intrinsic polarization direction, fractional polarization, and information about the spread of RM within the pixel (“Faraday dispersion”).
- The hard bit is then to try to make sense of the results. You will look for patterns, e.g. a marked asymmetry between the RMs of the two lobes (the Laing-Garrington effect), which is likely due to the lobe on the far side of the galaxy having more gas in front of it; there are also claims of systematic ripples in RM across each lobe, although these are not always seen. We can do some pencil-and-paper theoretical analysis to assess proposed explanations, and compare with numerical simulations such as those by Hardcastle and Krause. The overall aim is to learn more about the magnetic fields in the intergalactic gas and how they interact with the jets from the AGN.
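The core relation behind the per-pixel Faraday analysis is that the polarization angle rotates as chi = chi0 + RM * lambda^2. A toy recovery of RM from noisy angles across a multi-channel band (invented band and numbers, and a plain straight-line fit rather than the RM synthesis that RMTools actually performs) looks like this:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy rotation-measure recovery: Faraday rotation makes the polarization
# angle vary as chi = chi0 + RM * lambda^2. (Real angles wrap modulo pi,
# which RM synthesis handles; this sketch ignores that complication.)
freqs_hz = np.linspace(800e6, 1088e6, 288)   # illustrative 288-channel band
lam2 = (2.998e8 / freqs_hz) ** 2             # lambda^2 in m^2
RM_true, chi0 = 35.0, 0.4                    # rad/m^2 and rad (invented)
chi = chi0 + RM_true * lam2 + 0.01 * rng.standard_normal(lam2.size)

RM_fit, chi0_fit = np.polyfit(lam2, chi, 1)  # straight-line fit in lambda^2
print(f"recovered RM = {RM_fit:.1f} rad/m^2")
```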
Supervisors: Lucio Piccirillo, Mark McCulloch
In the almost 100 years since the beginning of radio astronomy, the field has experienced a huge technological surge in its capability to detect weaker and weaker radio signals, with higher spatial resolution and broader spectral coverage. Technology has improved especially in imaging capability, with big interferometers - like ALMA - or large imaging arrays at the focal plane of single telescopes. Modern radio astronomy has improved mostly through larger and larger instruments (FAST as a single dish and SKA as an interferometer, to cite two examples). A corresponding improvement in the noise of the radio receiver has not happened, mostly because amplification systems are not fully optimised for radio astronomy.
It is well known that any coherent amplifier has an unavoidable intrinsic noise dictated by quantum mechanics: the noise in any coherent amplifier cannot be less than the so-called “quantum noise”, which is of the order of hν/2k. Current best amplifiers are at the level of 5-10 times this noise. In this project, we attack the problem of further reducing the noise in High Electron Mobility Transistors (HEMTs), the most widely used amplifier technology in radio astronomy, by researching the causes of the excess noise, with the aim of approaching the quantum noise limit.
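For reference, the quantum limit T_q = hν/(2k) evaluates to tens of millikelvin at L-band and a few kelvin at millimetre wavelengths:

```python
# The quantum limit quoted above: the minimum noise temperature of a coherent
# amplifier is T_q = h * nu / (2 * k_B).
H = 6.626e-34    # Planck constant, J s
KB = 1.381e-23   # Boltzmann constant, J/K

def t_quantum(nu_hz: float) -> float:
    """Quantum-limited noise temperature in kelvin at frequency nu_hz."""
    return H * nu_hz / (2 * KB)

for nu_ghz in (1.4, 30.0, 100.0):
    print(f"{nu_ghz:6.1f} GHz: T_q = {t_quantum(nu_ghz * 1e9) * 1000:.1f} mK")
```

Current HEMT amplifiers sitting at 5-10 times these values leave a factor of several still to be won.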
The student will be integrated into a research laboratory, where they will work with other students and staff, and will be involved mostly in experimental physics.
The student will learn a set of experimental skills about the fundamentals of radio astronomy electronics, low temperature physics, vacuum techniques and quantum noise in non-linear systems.