We formulate the world in optimization terms (objectives, constraints, and tradeoffs), then hand those formulations to learning systems to make decisions. Too often, learning chases benchmarks while overlooking the formulation. I build optimization dynamics into learning, so that structure and guarantees guide what models learn and how they behave.

I am Samar (pronounced like Summer, /ˈsʌmə(ɹ)/), a PhD candidate at the University of Pennsylvania, where I work under the supervision of Prof. Alejandro Ribeiro.

I am on the academic job market.

__

News
Jan. 2026: Two new preprints are out: Unrolled neural networks for constrained optimization and A constrained optimization perspective of unrolled transformers.
Jan. 2026: Our paper, Stochastic unrolled neural networks, has been accepted to the Conference on Parsimony and Learning (CPAL 2026).
Jan. 2026: Two papers, Unrolled graph neural networks for constrained optimization and Graph signal generative diffusion models, were accepted to ICASSP 2026.
Dec. 2025: Presented our papers Unrolled neural networks for constrained optimization and A constrained optimization perspective of unrolled transformers at the NeurIPS workshop on Constrained Optimization for Machine Learning.
Dec. 2025: Presented our paper GNN-parametrized diffusion policies for wireless resource allocation at the NeurIPS workshop on New Perspectives in Advancing Graph Machine Learning.
Sep. 2025: Three papers were accepted to NeurIPS workshops. Details to follow.
Sep. 2025: Our paper Generative diffusion models for resource allocation in wireless networks was accepted to the IEEE International Workshop on Computational Advances in Multi-Sensor Adaptive Processing (CAMSAP) as an oral presentation.
Apr. 2025: Gave a talk on our work Robust stochastically-descending unrolled networks at ICASSP 2025.