The “International Society for Bayesian Analysis World Meeting 2024” will take place at Ca’ Foscari University of Venice, San Giobbe Economics Campus, on 1-7 July 2024. It is organized by the Venice Centre for Economic and Risk Analytics (VERA) and promoted by the International Society for Bayesian Analysis (ISBA), a scientific society founded in 1992 by economist Arnold Zellner and statisticians Gordon Kaufman and Thomas Leonard.
The “27th International Conference on Information Fusion 2024”, dedicated to machine learning methods and their applications, will take place from 7 to 11 July 2024 on the San Giobbe Economics Campus. The conference is promoted by the International Society of Information Fusion (ISIF), a scientific society founded in 1998, and is organized by the Department of Environmental Sciences, Informatics and Statistics.
Given the interdisciplinary nature of the two initiatives, and to strengthen the scientific collaboration between the two communities, an ISBA-Fusion 2024 Joint Workshop will be held between the two events, providing a meeting point for two scientific communities of great international standing. Topics of this half-day workshop include stochastic filtering, anomaly/novelty discovery, variational Bayes, and networks for data fusion.
University of California, Berkeley
Abstract
Prediction-powered inference is a framework for performing valid statistical inference when an experimental dataset is supplemented with predictions from a machine-learning system. The framework yields simple algorithms for computing provably valid confidence intervals for quantities such as means, quantiles, and linear and logistic regression coefficients without making any assumptions about the machine-learning algorithm that supplies the predictions. Furthermore, more accurate predictions translate to smaller confidence intervals. Prediction-powered inference could enable researchers to draw valid and more data-efficient conclusions using machine learning. The benefits of prediction-powered inference were demonstrated with datasets from proteomics, astronomy, genomics, remote sensing, census analysis, and ecology.
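The mean-estimation case of the framework can be sketched in a few lines: combine the average prediction on a large unlabelled set with a "rectifier" term, estimated on the small labelled set, that corrects for the model's bias. The function name and simulated data below are illustrative assumptions, not the authors' reference implementation.

```python
import numpy as np
from statistics import NormalDist

def ppi_mean_ci(y_lab, yhat_lab, yhat_unlab, alpha=0.05):
    """Prediction-powered confidence interval for a population mean (sketch).

    y_lab:      gold-standard labels on the small labelled set
    yhat_lab:   model predictions on that same labelled set
    yhat_unlab: model predictions on the large unlabelled set
    """
    n, N = len(y_lab), len(yhat_unlab)
    rectifier = y_lab - yhat_lab              # estimates the model's bias
    theta = yhat_unlab.mean() + rectifier.mean()
    # Standard error combines uncertainty from both data sources.
    se = np.sqrt(yhat_unlab.var(ddof=1) / N + rectifier.var(ddof=1) / n)
    z = NormalDist().inv_cdf(1 - alpha / 2)
    return theta, (theta - z * se, theta + z * se)
```

Note how no assumption is placed on the predictor itself: a systematically biased model only widens the interval through the rectifier's variance, while a more accurate model shrinks it.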
ENSAE, Institut Polytechnique de Paris
Abstract
Given a smooth function f, we develop a general approach to turn Monte Carlo samples with expectation m into an unbiased estimate of f(m). Specifically, we develop estimators that are based on randomly truncating the Taylor series expansion of f and estimating the coefficients of the truncated series. We derive their properties and propose a strategy to set their tuning parameters -- which depend on m -- automatically, with a view to making the whole approach simple to use. We develop our methods for the specific functions f(x) = log x and f(x) = 1/x, as they arise in several statistical applications such as maximum likelihood estimation of latent variable models and Bayesian inference for un-normalised models. Detailed numerical studies are performed for a range of applications to determine how competitive and reliable the proposed approach is.
(Joint work with Francesca Crucinio and Sumeetpal S. Singh)
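A minimal sketch of the idea for f(x) = 1/x: expand 1/m as the geometric series (1/c) Σ_k (1 - m/c)^k (valid for 0 < m < 2c), estimate each power with fresh independent samples, and truncate the sum at a random index with Russian-roulette reweighting so that unbiasedness is preserved. The function name and the fixed truncation probability q and scale c below are illustrative choices, not the automatic tuning strategy proposed in the talk.

```python
import numpy as np

def unbiased_reciprocal(sample, c, q=0.5, rng=None):
    """One unbiased estimate of 1/m from iid draws with mean m (sketch).

    Term k of the series survives with probability (1-q)^k; dividing each
    retained term by its survival probability keeps the estimator unbiased.
    `sample` must supply one fresh draw per series term.
    """
    rng = rng or np.random.default_rng()
    total, prod = 0.0, 1.0
    while True:
        total += prod                  # term k, already roulette-reweighted
        if rng.random() < q:           # stop after this term
            break
        prod *= (1.0 - sample() / c) / (1.0 - q)
    return total / c
```

Averaging many independent replications recovers 1/m, not 1/(sample mean): the random truncation removes the plug-in bias at the price of extra variance.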
University of British Columbia
Abstract
Over the past decade, black-box variational methods have revolutionized inference in probabilistic machine learning research. But variational methods haven't seen as widespread success in statistics, at least partially because they usually don't provide theoretically sound or reliable quantification of posterior uncertainty. So, is there a role for variational methods in statistics, and if so, what is it? In this talk, I will discuss the strengths and weaknesses of variational methods relevant specifically to statistical modelling and inference, a few common misconceptions, and some recent developments in the field that show how statisticians can leverage the strengths of variational methods in computation while retaining accurate, reliable approximations of posterior uncertainty.
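One weakness alluded to above, unreliable quantification of posterior uncertainty, can be seen in closed form for a correlated Gaussian target: the KL(q||p)-optimal fully factorized Gaussian has marginal variances 1/Λ_ii, where Λ is the target's precision matrix (a standard mean-field result), and these shrink below the true marginal variances as correlation grows. The numbers below are chosen purely for illustration.

```python
import numpy as np

rho = 0.9
Sigma = np.array([[1.0, rho], [rho, 1.0]])   # true posterior covariance
Lambda = np.linalg.inv(Sigma)                # precision matrix
vi_var = 1.0 / np.diag(Lambda)               # optimal mean-field variances
print(vi_var)  # each is 1 - rho**2 = 0.19, far below the true value 1.0
```

With rho = 0.9 the mean-field approximation reports less than a fifth of the true marginal variance, which is exactly the kind of overconfidence the talk addresses.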
University of California San Diego
Abstract
Estimation methods that make use of probabilistic graphical models are a promising approach to devising innovative algorithmic solutions. Over the last ten years, belief propagation performed on factor graphs combined with particle-based processing has become an established holistic framework and methodology for high-dimensional and nonlinear estimation. This approach has proven to be highly successful for source localization, multiobject tracking, and simultaneous localization and mapping. The resulting graph-based localization and tracking methods exhibit favorable properties in terms of detection and tracking performance, computational complexity, scalability, and versatility. This talk will introduce recent extensions of graph-based localization and tracking that embed neural networks or rely on particle flow. We will also discuss new multiobject tracking and SLAM methods that directly process raw sensor data and thus do not rely on any detection stage or data association scheme.
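The particle-based processing underlying these methods can be illustrated with a minimal bootstrap particle filter for a one-dimensional random-walk model; the model, parameter values, and function name below are illustrative assumptions, not any of the tracking systems discussed in the talk.

```python
import numpy as np

def bootstrap_pf(ys, n_part=2000, q=1.0, r=0.5, rng=None):
    """Minimal bootstrap particle filter (sketch) for the model
        x_t = x_{t-1} + N(0, q),   y_t = x_t + N(0, r),   x_0 = 0.
    Returns the filtering means E[x_t | y_1:t]."""
    rng = rng or np.random.default_rng(0)
    parts = np.zeros(n_part)
    means = []
    for y in ys:
        parts = parts + rng.normal(0.0, np.sqrt(q), n_part)  # propagate
        logw = -0.5 * (y - parts) ** 2 / r                   # weight by likelihood
        w = np.exp(logw - logw.max())
        w /= w.sum()
        means.append(np.sum(w * parts))
        parts = parts[rng.choice(n_part, n_part, p=w)]       # resample
    return np.array(means)
```

In the graph-based methods discussed in the talk, the same propagate-weight-resample pattern runs on the messages of a factor graph, which is what makes high-dimensional nonlinear estimation tractable.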
Syracuse University
Abstract
The field of data fusion came into being over forty years ago. It involves the fusion of information acquired from networked sensors. This talk will provide an overview of Bayesian methods applied to networked sensing and data fusion, with emphasis on detection problems. Most past work has focused on fusion under the assumption of statistical independence between sensor observations. We will discuss the fusion of dependent information based on copula theory, along with a few examples. Some avenues for future research will also be identified.
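The copula-based approach to dependent information can be sketched as follows: model each sensor's marginal likelihood separately and couple them through a Gaussian copula density, so the fused likelihood ratio accounts for inter-sensor dependence instead of assuming independence. The binary-hypothesis model, parameter names, and functions below are illustrative assumptions, not the talk's specific formulation.

```python
import math
from statistics import NormalDist

STD = NormalDist()

def gaussian_copula_logpdf(u, v, rho):
    """Log-density of the bivariate Gaussian copula at (u, v); rho is the
    copula correlation parameter. At rho = 0 this is identically zero,
    recovering the independence assumption."""
    a, b = STD.inv_cdf(u), STD.inv_cdf(v)
    return (-0.5 * math.log(1 - rho**2)
            - (rho**2 * (a**2 + b**2) - 2 * rho * a * b) / (2 * (1 - rho**2)))

def fused_llr(y1, y2, mu, rho):
    """Fused log-likelihood ratio for H1 (mean mu) vs H0 (mean 0) at two
    sensors with unit-variance Gaussian marginals and copula dependence."""
    def loglik(m):
        u, v = STD.cdf(y1 - m), STD.cdf(y2 - m)
        return (math.log(STD.pdf(y1 - m)) + math.log(STD.pdf(y2 - m))
                + gaussian_copula_logpdf(u, v, rho))
    return loglik(mu) - loglik(0.0)
```

Because the copula term depends on the hypothesis through the marginal CDFs, the fused statistic differs from the naive sum of per-sensor log-likelihood ratios whenever rho is nonzero.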