Richard Menzies

BSc(Hons)
MBCS
MIEEE


Biography

I am a PhD student in computer science at the University of Glasgow. My area of interest is Neural Topology Optimisation. This is a field similar to Neural Architecture Search (NAS), except that the focus is placed on optimising the topology of nodes within a neural network rather than on a constrained search over convolutional or recurrent neural network architectures. Alternatively, it can be seen as optimising the topology of an MLP without the constraint of layers.
In my research, I explore new ideas relating to neural topology, as well as methods of neural network training beyond standard gradient descent, evolutionary, and reinforcement learning approaches, methods which could be applied in the context of Neural Topology Optimisation.
I currently sit within the CVAS research group in the Department of Computer Science at the University of Glasgow, supervised by Dr. Paul Siebert and Dr. John Williamson.
In my spare time I enjoy playing chess and studying Metaphysics and Quantum Mechanics.

History

PhD Computer Science | Neural Topology Optimisation
University of Glasgow
2022 - current
Research Internship
University of Glasgow
2022
First Class BSc(Hons) Computer Science
Awarded for Top-5 in year
University of Glasgow
2018 - 2022

Current Posts

Doctoral Consortium Chair
British Machine Vision Conference (BMVC)
2024
Student Member
University of Glasgow Strategic Advisory Board
2024/2025
Convenor
Computer Vision and Autonomous Systems (CVAS) Group
2023 - 2024

Recent Papers

Soft Pruning and Latent Space Dimensionality Reduction
30/06/2024 | WCCI 2024 (IJCNN)
Neural network pruning is commonly used to reduce the size of a neural network, shrinking its memory footprint while maintaining an acceptable loss. However, the only approach currently explored for removing a parameter from a neural network is to remove it suddenly, irrespective of the pruning method, be it one-shot, iterative, or sparsity-induced. We hypothesize that this sudden removal causes the loss of the information contained within the removed parameters, information which could be useful when retraining the neural network after pruning. To resolve this, we propose Soft Pruning, a method of slowly decaying parameters out of a neural network. We compare this to one-shot pruning on the vision-based tasks of classification, autoencoding, and latent space dimensionality reduction. In every experiment, Soft Pruning matches or outperforms one-shot pruning; in classification, Soft Pruning enables pruning to significantly greater extents than one-shot pruning, retaining over 60% accuracy where one-shot pruning becomes equivalent to random guessing. In autoencoding, Soft Pruning achieves up to 17% lower loss after pruning. Finally, applied to latent space dimensionality reduction, Soft Pruning is shown to achieve more than 60% lower loss compared to one-shot pruning.
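The core idea of the abstract, decaying selected parameters toward zero over several steps rather than zeroing them in one shot, can be illustrated with a minimal sketch. This is a toy illustration assuming a simple multiplicative decay schedule and a magnitude-based selection criterion; the paper's actual schedule and criteria may differ.

```python
import numpy as np

def soft_prune_step(weights, prune_mask, decay=0.9):
    """One soft-pruning step: parameters selected for removal are scaled
    toward zero, instead of being zeroed outright (one-shot pruning)."""
    out = weights.copy()
    out[prune_mask] *= decay
    return out

# Toy example: decay the smallest-magnitude half of a weight vector.
w = np.array([0.9, -0.05, 0.4, 0.02])
mask = np.abs(w) < np.median(np.abs(w))  # hypothetical selection criterion

for _ in range(50):
    w = soft_prune_step(w, mask, decay=0.9)
    # In practice, retraining steps would be interleaved here, letting the
    # network adapt while the pruned parameters still carry information.

# Selected weights are now negligibly small; the rest are untouched.
```

In a real training loop, the gradual decay would be interleaved with gradient updates, so the remaining parameters can absorb the information held by the decaying ones before they vanish.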

Tutoring

Data Fundamentals (H) 2024/2025
Network Systems (H) 2022/2023
Artificial Intelligence (H) 2022/2023

Volunteering

Treasurer
Glasgow Polytechnic Chess Club
2024 - Present
President
Glasgow University Table Tennis Club
2022
Founding Secretary
Glasgow University Table Tennis Club
2020 - 2022
Lead Organiser
7 Minutes of Science
2021
Secretary
Glasgow University Physics Society
2020 - 2022
Ordinary Committee Member
Glasgow University Physics Society
2019
Founding President
PMHS Chess Club
2017

Open Source Contributions

Inkscape

Godot