Exploring Reformulations of Neural Network Training


Friday 5 December 2025, 14:00 to 14:30

Speaker: Dr Xiaoyu Wang
Synopsis: 

Deep neural networks are most commonly trained via back-propagation and gradient-based optimisation. While this paradigm has been highly successful, it remains sensitive to gradient instabilities, non-smooth activations, and an inherently sequential structure that limits parallelisation. This talk presents recent advances that view neural network training from an alternative perspective. By reformulating learning as a higher-dimensional constrained optimisation problem, this approach improves the conditioning of the training process and opens new avenues for distributed updates. I will describe how this framework accommodates a range of architectures and algorithmic strategies, and demonstrate its applications to inverse problems.
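The abstract stays deliberately high-level. As a rough illustration of the general idea, not the speaker's specific method, the sketch below lifts a two-layer network by introducing an auxiliary variable for the hidden activations and relaxing the consistency constraint with a quadratic penalty. Training then splits into three independent least-squares block updates, each solvable in closed form and amenable to distributed execution. All dimensions, the penalty weight `rho`, and the use of a linear (identity) activation are illustrative assumptions; practical lifted formulations also accommodate nonsmooth activations such as ReLU.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny synthetic regression task: Y = A_true X + noise.
n, d, h, m = 200, 5, 8, 3          # samples, input, hidden, output dims
X = rng.normal(size=(d, n))
A_true = rng.normal(size=(m, d))
Y = A_true @ X + 0.01 * rng.normal(size=(m, n))

# Lifted variable Z stands in for the hidden activations W1 @ X.
# Penalised objective:  ||W2 Z - Y||^2 + rho * ||Z - W1 X||^2
W1 = 0.1 * rng.normal(size=(h, d))
W2 = 0.1 * rng.normal(size=(m, h))
Z = W1 @ X
rho, eps = 1.0, 1e-8

for _ in range(50):
    # Z-update: solve (W2^T W2 + rho I) Z = W2^T Y + rho W1 X.
    Z = np.linalg.solve(W2.T @ W2 + rho * np.eye(h),
                        W2.T @ Y + rho * (W1 @ X))
    # W2-update: least squares fit Y ~ W2 Z.
    W2 = Y @ Z.T @ np.linalg.inv(Z @ Z.T + eps * np.eye(h))
    # W1-update: least squares fit Z ~ W1 X.
    W1 = Z @ X.T @ np.linalg.inv(X @ X.T + eps * np.eye(d))

# Evaluate the composed network on the training data.
err = np.linalg.norm(W2 @ (W1 @ X) - Y) / np.linalg.norm(Y)
print(f"relative fit error: {err:.3f}")
```

Each block subproblem is convex even though the joint problem is not, which is what improves conditioning relative to end-to-end gradient descent; the three updates touch disjoint variables given the others, so they can be assigned to different workers.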

Biography: 

Xiaoyu (Victor) Wang is currently a postdoctoral research associate in the Institute of Sensors, Signals and Systems at Heriot-Watt University, where he is part of the Bayesian Imaging, Sensing and Computation group led by Prof. Yoann Altmann. From 2023 to 2025, he was a postdoctoral researcher in the School of Mathematical and Computer Sciences at Heriot-Watt, working with Prof. Audrey Repetti on unfolded neural networks and plug-and-play algorithms. He received his PhD from the University of Cambridge in 2023. His research focuses on developing and integrating ideas from applied mathematics, statistics, and machine learning to design scalable and reliable methods for imaging and signal recovery.
