Research
I'm currently exploring the abstractions of algebraic effects, extensible data, and row polymorphism in probabilistic programming language design. The intention is to encode probabilistic models as first-class citizens, and hence make them modular and composable, while remaining general-purpose (i.e. suitable for all forms of simulation and inference). I'm also investigating how complex inference algorithms can be modularly implemented using effect handlers as program transformations on models.
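A minimal Haskell sketch of the idea (all names here are hypothetical illustrations, not my actual libraries): a model is a first-class syntax tree of `Sample` and `Observe` operations, and a handler interprets it after the fact; here, a toy handler that returns each "distribution's" mean and accumulates a squared-error score for observations.

```haskell
import Control.Monad (ap)

-- A model is a free-monad-style tree of probabilistic operations.
-- (Toy version: a "distribution" is represented only by its mean.)
data Model a
  = Pure a
  | Sample Double (Double -> Model a)   -- draw a value, continue with it
  | Observe Double Double (Model a)     -- observed value, predicted mean

instance Functor Model where
  fmap f (Pure a)           = Pure (f a)
  fmap f (Sample m k)       = Sample m (fmap f . k)
  fmap f (Observe y m rest) = Observe y m (fmap f rest)

instance Applicative Model where
  pure  = Pure
  (<*>) = ap

instance Monad Model where
  Pure a           >>= f = f a
  Sample m k       >>= f = Sample m ((>>= f) . k)
  Observe y m rest >>= f = Observe y m (rest >>= f)

-- An effect handler as a program transformation: interpret Sample by
-- returning the mean, and Observe by accumulating a squared-error score.
handle :: Model a -> (a, Double)
handle (Pure a)           = (a, 0)
handle (Sample m k)       = handle (k m)
handle (Observe y m rest) = let (a, s) = handle rest in (a, s + (y - m) ^ 2)

-- A tiny model: sample a parameter, condition on one observation.
toyModel :: Model Double
toyModel = do
  mu <- Sample 0 Pure
  Observe 1.5 mu (Pure ())
  pure mu
```

`handle toyModel` evaluates to `(0.0, 2.25)`: the parameter defaults to its mean 0, and the observation 1.5 contributes a score of (1.5 − 0)². Swapping `handle` for a different handler (one threading a random seed, or one recording a trace) changes the inference strategy without touching the model.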
My previous work explored a structured (categorical) approach to implementing neural networks with recursion schemes, promoting compositionality in new ways. In particular, I showed how neural networks can be represented as fixed points of recursive data structures, and forward and backward propagation as catamorphisms (folds) and anamorphisms (unfolds) over those structures.
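As a sketch of that representation (names hypothetical, and with scalar weights rather than the matrices a real network would carry): a network is the fixed point of a base functor of layers, and the forward pass falls out as a catamorphism.

```haskell
{-# LANGUAGE DeriveFunctor #-}

-- Fixed point of a functor: a network is Fix LayerF.
newtype Fix f = Fix (f (Fix f))

-- Base functor: a layer holds a weight and a bias (toy scalar version),
-- followed by the rest of the network; Output terminates it.
data LayerF r = Layer Double Double r | Output
  deriving Functor

type Network = Fix LayerF

-- Catamorphism (fold): collapse the recursive structure with an algebra.
cata :: Functor f => (f a -> a) -> Fix f -> a
cata alg (Fix f) = alg (fmap (cata alg) f)

-- Forward propagation as a fold: each layer applies an affine map,
-- threading the activation through the remaining network.
forward :: Network -> Double -> Double
forward = cata alg
  where
    alg Output        = id
    alg (Layer w b k) = \x -> k (w * x + b)

-- A two-layer network.
net :: Network
net = Fix (Layer 2 1 (Fix (Layer 3 0 (Fix Output))))
```

`forward net 1` computes 3 × (2 × 1 + 1) + 0 = 9.0; backward propagation is dually phrased as an unfold producing per-layer gradients.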
Jan, 2022 Linked visualisations via Galois dependencies
R. Perera, M. Nguyen, T. Petricek, M. Weng
Aug, 2021 Composable, Modular Probabilistic Models [Poster]
M. Nguyen, R. Perera, M. Weng
ICFP ’21, Student Research Competition
Jun, 2019 Modelling Neural Networks with Recursion Schemes [Poster]
Master's Dissertation, University of Bristol
2021 Composable, Modular Probabilistic Models
IFL ’21, ICFP ’21 (Poster)
Teaching
2019 - 2022
I give talks and seminars to the Programming Languages Research Group and to undergraduates at the University of Bristol.
2020 - 2021
I acted as the main supervisor for a 4th-year student's master's dissertation.
2017 - 2020
I was a teaching assistant for the Functional Programming, Language Engineering, and Advanced Topics in Programming Languages units at the University of Bristol.
Awards
2021 ACM Student Research Competition, 1st Place (ICFP '21)
2019 Bloomberg Award - Best Machine Learning Paper, University of Bristol
2018 Graphcore Award - Best Group Project, University of Bristol
2017 Netcraft Award - Top Ten Achieving CS Students, University of Bristol