About Me
I'm a PhD student at the University of Zurich, advised by Prof. Anastasia Koloskova, working toward a scientific understanding of neural networks at the intersection of theory and engineering practice.
I recently completed my Master's in Electrical Engineering and Information Technology at ETH Zurich, in the Machine Learning and Signal Processing track. I collaborated extensively with Prof. Hofmann's Data Analytics Lab, working on Transformer model merging and on scale and training regimes in continual learning.
I come from Ticino, in the South of Switzerland. In my free time I enjoy cooking and baking, brewing specialty coffee, and running. I am also a huge fan of pro cycling and an avid listener of the LRCP cycling podcast.
News
- 09/2025: I have officially started my PhD at the Department of Mathematical Modeling and Machine Learning (DM3L) at the University of Zurich! Reach out if you want to collaborate on anything exciting 🤩.
- 06/2025: I am excited to share our newest paper on The Importance of Being Lazy in Continual Learning, which I will present in Vancouver 🇨🇦 at ICML in July!
- 06/2025: I am extremely happy to share that I will be joining the lab of Anastasia Koloskova in Zurich 🇨🇭 as a PhD student in September 🤩🎉!
- 03/2025: I have officially completed my Master of Science in EEIT at ETH Zurich 🎓!
- 12/2024: I will be in Vancouver 🇨🇦 for NeurIPS this week, presenting our newest workshop paper!
- 05/2024: I am excited to be in Vienna 🇦🇹 for ICLR this week to present our work on Transformer Fusion!
Selected Publications
- The Importance of Being Lazy: Scaling Limits of Continual Learning
- Exploring the Limits of Feature Learning in Continual Learning
- Transformer Fusion with Optimal Transport