Luca Masserano

Hello! I am Luca, a PhD student in the Joint PhD Program in Statistics and Machine Learning at Carnegie Mellon University, where I am fortunate to be advised by Ann B. Lee and Barnabás Póczos.

I am broadly interested in statistics and machine learning, with a current focus on robust and reliable simulation-based inference. I am also curious (and keen to learn more) about uncertainty quantification, forecasting and optimization. At CMU, I am part of the Statistical Methods for the Physical Sciences (STAMPS) group. My research has also been supported by the National Science Foundation (grant #2020295).

I spent this summer as a Machine Learning Scientist Intern at AWS AI Labs in Berlin (Germany), working with Syama Sundar Rangapuram and Lorenzo Stella on integrating mixed-integer programs as differentiable blocks of end-to-end deep learning pipelines.

Before joining CMU, I obtained an M.Sc. in Data Science at Bocconi University in Milan (Italy), where I was advised by Igor Pruenster and Antonio Lijoi.

Email  /  CV  /  Scholar  /  Github: Personal - Group  /  LinkedIn

profile photo
News
  • 06/2023 - This summer I will again join the wonderful ML Forecasting team in Berlin (Germany), working on integrating combinatorial solvers into differentiable pipelines!

  • 01/2023 - The paper “Simulation-Based Inference with WALDO: Confidence Regions by Leveraging Prediction Algorithms or Posterior Estimators for Inverse Problems” has been selected as a winner of the 2023 ASA Physical and Engineering Sciences Student Paper Competition!

  • 01/2023 - Excited to announce that the paper “Simulation-Based Inference with WALDO: Confidence Regions by Leveraging Prediction Algorithms or Posterior Estimators for Inverse Problems” has been accepted for presentation at AISTATS 2023. Thanks to all my collaborators for the hard work!

  • 12/2022 - Looking forward to presenting two papers at NeurIPS 2022:
    • “Adaptive Sampling for Probabilistic Forecasting under Distribution Shift” at the “Distribution Shifts: Connecting Methods and Applications” workshop;
    • “Likelihood-Free Frequentist Inference for Calorimetric Muon Energy Measurement in High-Energy Physics” at the “Machine Learning for the Physical Sciences” workshop.
Publications and Preprints
End-to-end Learning of Mixed-Integer Programs via Stochastic Perturbations

Luca Masserano, Syama Sundar Rangapuram, Lorenzo Stella, Konstantinos Benidis, Ugo Rosolia, Michael Bohlke-Schneider
In preparation, 2023

We developed theory and methodology to embed arbitrary mixed-integer programs as differentiable blocks of deep learning pipelines via stochastic perturbations of the optimization inputs. We also proposed to exploit influence functions to perform sensitivity analysis on the combinatorial solvers and drive perturbations in the optimal direction.
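To give a flavor of the perturbation idea, here is a minimal sketch in which a toy top-k selection rule stands in for a mixed-integer program; the solver, constants, and function names are illustrative, not taken from the paper. Averaging the solver's output over random perturbations of its inputs turns a piecewise-constant map into a smooth one that can sit inside a differentiable pipeline.

```python
import numpy as np

def topk_solver(costs, k=2):
    """Toy 'combinatorial solver': pick the k items with highest cost.
    Returns a hard 0/1 indicator vector (the discrete solution)."""
    sol = np.zeros_like(costs)
    sol[np.argsort(costs)[-k:]] = 1.0
    return sol

def perturbed_solution(costs, sigma=0.5, n_samples=1000, seed=0):
    """Monte Carlo estimate of E[solver(costs + sigma * Z)], Z ~ N(0, I).
    Averaging over Gaussian perturbations smooths the piecewise-constant
    solver, so the output varies continuously with the costs."""
    rng = np.random.default_rng(seed)
    noise = rng.standard_normal((n_samples, costs.size))
    sols = np.array([topk_solver(costs + sigma * z) for z in noise])
    return sols.mean(axis=0)

costs = np.array([3.0, 1.0, 2.9, -1.0])
smooth = perturbed_solution(costs)
# entries lie in [0, 1] and sum to k = 2; unlike the hard solver output,
# they move smoothly as the costs change
```

The same averaging trick yields unbiased stochastic gradients of the expected solver output, which is what lets the block be trained end to end.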

Simulation-Based Inference with WALDO: Confidence Regions by Leveraging Prediction Algorithms or Posterior Estimators for Inverse Problems

Luca Masserano, Tommaso Dorigo, Rafael Izbicki, Mikael Kuusela, Ann B. Lee
AISTATS, 2023
Winner of the 2023 American Statistical Association SPES Student Paper Competition
[ paper / code / docs ]

WALDO exploits arbitrary prediction algorithms and posterior estimators to construct reliable confidence sets for parameters of interest in simulation-based inference, i.e., when the likelihood is intractable but one can sample from the underlying model. Confidence sets from WALDO are guaranteed to be valid at the correct coverage level without being overly conservative. In addition, one can still exploit prior knowledge to achieve tighter constraints.
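A rough, self-contained illustration of the Wald-style construction: a conjugate Gaussian toy stands in for a learned posterior estimator, the test statistic compares the posterior mean to each candidate parameter, and critical values are calibrated by simulation at each grid point. All numbers and names here are mine, chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

def posterior_mean_var(x):
    """Conjugate Gaussian toy: x ~ N(theta, 1), prior theta ~ N(0, 1).
    Stand-in for a learned posterior estimator / prediction algorithm."""
    return x / 2.0, 0.5

def waldo_stat(x, theta0):
    """Wald-style statistic: squared distance of the posterior mean from
    the candidate theta0, scaled by the posterior variance."""
    m, v = posterior_mean_var(x)
    return (m - theta0) ** 2 / v

theta_grid = np.linspace(-3, 3, 121)
# critical values: 95% quantile of the statistic simulated under each theta0
crit = np.array([
    np.quantile(waldo_stat(rng.normal(t, 1.0, 2000), t), 0.95)
    for t in theta_grid
])

x_obs = 1.3
conf_set = theta_grid[waldo_stat(x_obs, theta_grid) <= crit]
# conf_set: the grid points where the statistic falls below its calibrated
# critical value, i.e. a 95% confidence set despite the informative prior
```

Because the critical values are computed under each candidate parameter, the resulting set keeps frequentist coverage even when the prior is informative.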

Adaptive Sampling for Probabilistic Forecasting under Distribution Shift

Luca Masserano, Syama Sundar Rangapuram, Shubham Kapoor, Rajbir Singh Nirwan, Youngsuk Park, Michael Bohlke-Schneider
NeurIPS Distribution Shifts Workshop (DistShift), 2022
[ paper ]

We present an adaptive sampling strategy that selects the parts of the time series history that are relevant for forecasting. We achieve this by learning a discrete distribution over relevant time steps via Bayesian optimization. We instantiate this idea with a two-step method that first pre-trains with uniform sampling and then trains a lightweight adaptive architecture with adaptive sampling.
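The core idea of drawing training windows from a non-uniform distribution over the history can be sketched as follows. This toy heuristic weights each window by how well it matches the most recent data; the weighting rule and all constants are illustrative, not the learned distribution from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def window_weights(series, window=10, temp=1.0):
    """Score each historical training window by how closely its mean
    matches the most recent window, then softmax the scores into a
    categorical distribution over window start indices."""
    recent = series[-window:].mean()
    starts = np.arange(len(series) - 2 * window)
    scores = -np.abs(
        np.array([series[s:s + window].mean() for s in starts]) - recent
    ) / temp
    p = np.exp(scores - scores.max())  # stable softmax
    return starts, p / p.sum()

# series with a level shift halfway through its history
series = np.concatenate([rng.normal(0, 1, 200), rng.normal(5, 1, 200)])
starts, probs = window_weights(series)
# most of the probability mass lands on post-shift windows, so training
# batches are drawn from the regime that matches the forecast horizon
post_shift_mass = probs[starts >= 200].sum()
```

Sampling training windows from such a distribution, instead of uniformly, is what lets the forecaster down-weight stale pre-shift history.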

Likelihood-Free Frequentist Inference: Confidence Sets with Correct Conditional Coverage

Niccolò Dalmasso*, Luca Masserano*, David Zhao, Rafael Izbicki, Ann B. Lee
Under review
[ paper / code / docs / supplementary material ], *equal contribution

In this work, we propose a unified and modular inference framework that bridges classical statistics and modern machine learning in SBI/LFI, providing (i) a practical approach to the Neyman construction of confidence sets with frequentist finite-sample coverage for any value of the unknown parameters; and (ii) interpretable diagnostics that estimate empirical coverage across the entire parameter space.

Experience
Amazon - AWS AI LABS
Machine Learning Scientist Intern
Manager: Lorenzo Stella, Mentor: Syama Sundar Rangapuram
June-August 2023, Berlin (Germany)

Developed theory and methodology to embed arbitrary mixed-integer programs as differentiable blocks of deep learning pipelines via stochastic perturbations of the optimization inputs. Proposed to exploit influence functions to perform sensitivity analysis on the combinatorial solvers and drive perturbations in the optimal direction.

Amazon - AWS AI LABS
Machine Learning Scientist Intern
Manager: Michael Bohlke-Schneider, Mentor: Syama Sundar Rangapuram
June-August 2022, Berlin (Germany)

Built a time series forecasting method that is robust under distribution shifts. I proposed a novel adaptive sampling approach and delivered an implementation that (i) avoids noisy data regions, (ii) focuses on relevant shifted regions in the past, and (iii) shows promising first results on real-world datasets with known distribution shifts.

BlackRock - Financial Modeling Group (FMG)
Quantitative Analyst Intern
Manager: Joo Chew Ang
July-September 2019, London (UK)

Designed and developed a new research platform for inspecting the downstream effect of any modification in a suite of equity risk models. The platform streamlined the research process by reducing the time between idea generation and implementation. I also worked with software engineers to ensure that production code remained consistent with the quantitative models' logic.

SmartFAB
Data Scientist Intern
Mentors: Carlo Baldassi, Carlo Lucibello
March-May 2019, Milan (Italy)

Applied various statistical models to improve real-time detection of damaged integrated circuits produced at a semiconductor plant in southern Italy.


Created from Jonathan T. Barron's template