4th Structured Probabilistic Inference & Generative Modeling
Abstract
Probabilistic approaches have been one of the core engines of machine learning for decades: they provide a language for uncertainty, latent structure, and decision-making under incomplete information or noisy observations. In parallel, generative modeling has long been an important branch of this toolkit, spanning today's large language models and diffusion models. While the empirical success of these models has largely been driven by scaling and benchmark-oriented engineering efforts, probabilistic principles have not faded into irrelevance; if anything, they have become increasingly vital for applying models to more complex tasks in the era of foundation models and real-world deployment. The mission of this workshop is to create a forum for research driven not solely by prevailing trends, but by well-reasoned scientific beliefs and long-term vision. We aim to bring together researchers working on structured probabilistic inference, generative modeling, and their intersections with modern foundation models. We particularly encourage contributions that explore emerging, unconventional, or underexplored directions that may shape the future of the field. By fostering dialogue across communities, including theoretical probabilistic modeling, generative modeling, information theory, and large-scale foundation model research, we hope to identify enduring principles, rediscover overlooked ideas, and inspire new frameworks that unify structure, scalability, and uncertainty. Ultimately, this workshop seeks to highlight that probabilistic thinking is not only foundational to the past and present of machine learning but also essential to its future trajectory.