I am currently an EPSRC Doctoral Prize Fellow at the University of Bristol. I specialise in machine learning techniques applied to astronomical data, with a focus on improving active learning performance and exploring the utility of weak supervision. Through this work I have been involved in, and consulted on, the morphology classification pipeline of ESA's recently launched Euclid telescope.

Current Research

I am based in the Data-Intensive Astronomical Analysis research group where I am working on:

  • Improving astronomical classification using generative models.
  • Parameter estimation for simulations using simulation-based inference (SBI).
  • Calibration of simulations within digital twins.
  • Creating interactive software for researchers to make use of cutting-edge machine learning techniques.
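As a toy illustration of the SBI idea above — inferring simulation parameters without an explicit likelihood — here is a rejection-ABC sketch. It is purely illustrative: the dataset is synthetic, all names are made up, and real SBI work typically uses more sophisticated (often neural) density-estimation methods rather than rejection sampling.

```python
# Toy simulation-based inference via rejection ABC: estimate a Gaussian
# mean from simulator runs alone, comparing summary statistics.
import numpy as np

rng = np.random.default_rng(1)
observed = rng.normal(loc=2.0, scale=1.0, size=100)  # "data" with true mean 2.0

def simulator(theta, size=100):
    # Forward model: Gaussian with unknown mean theta, known scale.
    return rng.normal(loc=theta, scale=1.0, size=size)

accepted = []
for theta in rng.uniform(-5, 5, size=20000):  # draws from a flat prior
    sim = simulator(theta)
    if abs(sim.mean() - observed.mean()) < 0.1:  # summary-statistic distance
        accepted.append(theta)

posterior_mean = float(np.mean(accepted))
print(f"{len(accepted)} accepted draws, posterior mean ~ {posterior_mean:.2f}")
```

The accepted parameter draws approximate the posterior; with enough simulations the posterior mean lands close to the true value of 2.0.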

PhD Research

My PhD thesis, Improving The Practicality of Active Learning Pipelines in Real-World Problem Settings: A Case Study in The Classification of Astronomical Data, covered the following topics:

  • Providing a how-to guide for applying active learning to real-world data, aimed at experts from any scientific domain.
  • Creating novel query strategies to improve accuracy and reduce labelling costs for active learning.
  • Combining the use of weak supervision methods with active learning to improve performance on datasets where labels are scarce, noisy, or difficult to obtain.
  • Using active learning for galaxy morphology classification with noisy image data and unreliable labels.
  • Source classification (separating stars, galaxies, AGN, and QSOs) using active learning and outlier detection methods.
  • Creating interactive software for researchers to make use of cutting-edge machine learning techniques.
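To make the query-strategy idea above concrete, here is a minimal sketch of pool-based active learning with least-confidence uncertainty sampling — the simplest member of the query-strategy family. The synthetic dataset and all names are illustrative and not taken from the thesis code.

```python
# Pool-based active learning: start from a small labelled seed set, then
# repeatedly query the pool point the model is least confident about.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=500, n_features=10, random_state=0)

labelled = list(rng.choice(len(X), size=10, replace=False))  # seed labels
pool = [i for i in range(len(X)) if i not in labelled]       # unlabelled pool

model = LogisticRegression(max_iter=1000)
for _ in range(20):  # each round: fit, then query one point
    model.fit(X[labelled], y[labelled])
    probs = model.predict_proba(X[pool])
    query = pool[int(np.argmin(probs.max(axis=1)))]  # lowest top-class prob
    labelled.append(query)  # the "oracle" reveals y[query]
    pool.remove(query)

accuracy = model.score(X, y)
print(f"labelled {len(labelled)} points, accuracy {accuracy:.2f}")
```

Novel query strategies replace the `argmin` line — the rest of the loop (fit, score the pool, query, update) stays the same, which is what makes the pipeline reusable across problems.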

Supervisory Team

Sotiria Fotopoulou, Oliver Ray, and Malcolm Bremer

Recent Publications

  • Towards Foundational Models for Dynamical System Reconstruction: Hierarchical Meta-Learning via Mixture of Experts

    R. Desmond Nzoyem, G. Stevens, A. Sahota, D. A. W. Barton, T. Deakin

    ICLR 2025 - First Workshop on Scalable Optimization for Efficient and Adaptive Foundation Models
    As foundational models reshape scientific discovery, a bottleneck persists in dynamical system reconstruction (DSR): the ability to learn across system hierarchies. Many meta-learning approaches have been applied successfully to single systems, but falter when confronted with sparse, loosely related datasets requiring multiple hierarchies to be learned. Mixture of Experts (MoE) offers a natural paradigm to address these challenges. Despite their potential, we demonstrate that naive MoEs are inadequate for the nuanced demands of hierarchical DSR, largely due to their gradient descent-based gating update mechanism which leads to slow updates and conflicted routing during training. To overcome this limitation, we introduce MixER: Mixture of Expert Reconstructors, a novel sparse top-1 MoE layer employing a custom gating update algorithm based on K-means and least squares. Extensive experiments validate MixER's capabilities, demonstrating efficient training and scalability to systems of up to ten parametric ordinary differential equations. However, our layer underperforms state-of-the-art meta-learners in high-data regimes, particularly when each expert is constrained to process only a fraction of a dataset composed of highly related data points. Further analysis with synthetic and neuroscientific time series suggests that the quality of the contextual representations generated by MixER is closely linked to the presence of hierarchical structure in the data.
    @inproceedings{nzoyemmixer, title={Towards Foundational Models for Dynamical System Reconstruction: Hierarchical Meta-Learning via Mixture of Experts}, author={{Desmond Nzoyem}, R. and {Stevens}, G. and {Sahota}, A. and {Barton}, D.AW and {Deakin}, T.}, booktitle={First Workshop on Scalable Optimization for Efficient and Adaptive Foundation Models, ICLR 2025} }
  • Euclid Quick Data Release (Q1). Exploring galaxy properties with a multi-modal foundation model

    M. Siudek, M. Huertas-Company, M. Smith, G. Martinez-Solaeche, F. Lanusse, S. Ho, E. Angeloudi, P. A. C. Cunha, H. Domínguez Sánchez, M. Dunn, Y. Fu, P. Iglesias-Navarro, J. Junais, J. H. Knapen, B. Laloux, M. Mezcua, W. Roster, G. Stevens, J. Vega-Ferrero, Euclid Consortium

    arXiv preprint
    Modern astronomical surveys, such as the Euclid mission, produce high-dimensional, multi-modal data sets that include imaging and spectroscopic information for millions of galaxies. These data serve as an ideal benchmark for large, pre-trained multi-modal models, which can leverage vast amounts of unlabelled data. In this work, we present the first exploration of Euclid data with AstroPT, an autoregressive multi-modal foundation model trained on approximately 300,000 optical and infrared Euclid images and spectral energy distributions (SEDs) from the first Euclid Quick Data Release. We compare self-supervised pre-training with baseline fully supervised training across several tasks: galaxy morphology classification; redshift estimation; similarity searches; and outlier detection. Our results show that: (a) AstroPT embeddings are highly informative, correlating with morphology and effectively isolating outliers; (b) including infrared data helps to isolate stars, but degrades the identification of edge-on galaxies, which are better captured by optical images; (c) simple fine-tuning of these embeddings for photometric redshift and stellar mass estimation outperforms a fully supervised approach, even when using only 1% of the training labels; and (d) incorporating SED data into AstroPT via a straightforward multi-modal token-chaining method improves photo-z predictions, and allows us to identify potentially more interesting anomalies (such as ringed or interacting galaxies) compared to a model pre-trained solely on imaging data.
    @misc{Siudek2025EuclidFoundation, author = {{Siudek}, M. and {Huertas-Company}, M. and {Smith}, M. and {Martinez-Solaeche}, G. and {Lanusse}, F. and {Ho}, S. and {Angeloudi}, E. and {Cunha}, P.~A.~C. and {Domínguez Sánchez}, H. and {Dunn}, M. and {Fu}, Y. and {Iglesias-Navarro}, P. and {Junais}, J. and {Knapen}, J.~H. and {Laloux}, B. and {Mezcua}, M. and {Roster}, W. and {Stevens}, G. and {Vega-Ferrero}, J. and the {Euclid Collaboration}.}, title = "{Euclid Quick Data Release (Q1) Exploring galaxy properties with a multi-modal foundation model}", year = {2025}, eprint = {2503.15312} }
  • Euclid Quick Data Release (Q1). The active galaxies of Euclid

    T. Matamoro Zatarain, S. Fotopoulou, F. Ricci, M. Bolzonella, F. La Franca, A. Viitanen, G. Zamorani, M. B. Taylor, M. Mezcua, B. Laloux, A. Bongiorno, K. Jahnke, G. Stevens, R. A. Shaw, L. Bisigello, W. Roster, Y. Fu, B. Margalef-Bentabol, A. La Marca, F. Tarsitano, A. Feltre, J. Calhau, X. Lopez Lopez, M. Scialpi, M. Salvato, V. Allevato, M. Siudek, C. Saulder, D. Vergani, M. N. Bremer, L. Wang, M. Giulietti, D.M. Alexander, D. Sluse, F. Shankar, L. Spinoglio, D. Scott, R. Shirley, H. Landt, M. Selwood, Y. Toba, P. Dayal, Euclid Collaboration

    arXiv preprint

    We present a catalogue of candidate active galactic nuclei (AGN) in the Euclid Quick Release (Q1) fields. For each Euclid source we collect multi-wavelength photometric and spectroscopic information from the Galaxy Evolution Explorer (GALEX), Gaia, the Dark Energy Survey (DES), the Wide-field Infrared Survey Explorer (WISE), Spitzer, the Dark Energy Spectroscopic Instrument (DESI), and the Sloan Digital Sky Survey (SDSS), including spectroscopic redshifts from public compilations. We investigate the AGN content of the Q1 fields by applying selection criteria using Euclid colours and WISE-AllWISE cuts, finding 292,222 and 65,131 candidates respectively. We also create a high-purity QSO catalogue based on Gaia DR3 information containing 1971 candidates. Furthermore, we utilise the collected spectroscopic information from DESI to perform broad-line and narrow-line AGN selections, leading to a total of 4392 AGN candidates in the Q1 field. We investigate and refine the Q1 probabilistic random forest QSO population, selecting a total of 180,666 candidates. Additionally, we perform SED fitting on a subset of sources with available zspec, and by utilising the derived AGN fraction, we identify a total of 7766 AGN candidates. We discuss the purity and completeness of the selections and define two new colour selection criteria (JH_IEY and IEH_gz) to improve on purity, finding 313,714 and 267,513 candidates respectively in the Q1 data. We find a total of 229,779 AGN candidates, equivalent to an AGN surface density of 3641 deg-2 for 18<IE≤24.5, and a subsample of 30,422 candidates corresponding to an AGN surface density of 482 deg-2 when limiting the depth to 18<IE≤22. The surface density of AGN recovered from this work is in line with predictions based on AGN X-ray luminosity functions.

    @misc{2025MatamoroZatarainActive, title = {Euclid Quick Data Release (Q1). The active galaxies of Euclid}, author = {{Matamoro Zatarain}, T. and {Fotopoulou}, S. and {Ricci}, F. and {Bolzonella}, M. and {La Franca}, F. and {Viitanen}, A. and {Zamorani}, G. and {Taylor}, M.B. and {Mezcua}, M. and {Laloux}, B. and {Bongiorno}, A. and {Jahnke}, K. and {Stevens}, G. and {Shaw}, R.~A. and {Bisigello}, L. and {Roster}, W. and {Fu}, Y. and {Margalef-Bentabol}, B. and {La Marca}, A. and {Tarsitano}, F. and {Feltre}, A. and {Calhau}, J. and {Lopez Lopez}, X. and {Scialpi}, M. and {Salvato}, M. and {Allevato}, V. and {Siudek}, M. and {Saulder}, C. and {Vergani}, D. and {Bremer}, M.~N. and {Wang}, L. and {Giulietti}, M. and {Alexander}, D.~M. and {Sluse}, D. and {Shankar}, F. and {Spinoglio}, L. and {Scott}, D. and {Shirley}, R. and {Landt}, H. and {Selwood}, M. and {Toba}, Y. and {Dayal}, P. and the {Euclid Collaboration}}, year = {2025}, eprint = {2503.15320} }
Recent Talks

    The Hidden Difficulties of Machine Learning

    Presented at Access To Bristol, 2021

    On-campus presentation to 30 local sixth-form students intending to study Engineering at university. This presentation immediately followed the AI & ML: Cutting Through The Hype talk and showed how ML tasks are often less straightforward than they seem. The talk is highly interactive, with the aim that students discover the problems for themselves and see why certain solutions may not be sufficient.

    AI & ML: Cutting Through The Hype

    Presented at Sutton Trust Summer School, 2021

    Webinar presented to 60 sixth-form students intending to study Engineering at university. The presentation starts with an introduction to what Computer Science is (and is not) like at university, followed by a (very brief) overview of the foundations of what Machine Learning and AI really are. Unfortunately, the adoption of these tools has led to widespread exaggeration and overuse of buzzwords throughout the industry, making it seem as though companies are doing complicated, ground-breaking things when, most of the time, they are doing little more than the maths students use in their A-Level studies. I also discuss the Dot-Com Boom and the AI Winter as examples of how overhyping can damage research progress and the economy.