
Statistica Sinica 33 (2023), 2405-2429

BAYESIAN PREDICTIVE INFERENCE
WITHOUT A PRIOR

Patrizia Berti, Emanuela Dreassi, Fabrizio Leisen, Luca Pratelli and Pietro Rigo

Università di Modena e Reggio-Emilia, Università di Firenze,
University of Nottingham, Accademia Navale di Livorno
and Università di Bologna

Abstract: Let (Xn : n ≥ 1) be a sequence of random observations. Let σn(·) = P(Xn+1 ∈ · | X1, …, Xn) be the nth predictive distribution and σ0(·) = P(X1 ∈ ·) be the marginal distribution of X1. To make predictions on (Xn), a Bayesian forecaster needs only the collection σ = (σn : n ≥ 0). By the Ionescu-Tulcea theorem, σ can be assigned directly, without passing through the usual prior/posterior scheme. One main advantage is that no prior probability has to be selected. This point of view is adopted in this paper. The choice of σ is subject to only two requirements: (i) the resulting sequence (Xn) must be conditionally identically distributed and (ii) each σn+1 must be a simple recursive update of σn. Various new σ satisfying (i) and (ii) are introduced and investigated. For such σ, we determine the asymptotics of σn as n → ∞. In some cases, we also evaluate the probability distribution of (Xn).

Key words and phrases: Asymptotics, Bayesian nonparametrics, conditional identity in distribution, exchangeability, predictive distribution, sequential prediction, total variation distance.
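
As a loose illustration (not taken from the paper) of what a recursive update of type (ii) can look like, the sketch below simulates a sequence directly from its predictives, in the spirit of the Ionescu-Tulcea construction described in the abstract. The Pólya-urn-type recursion σn+1 = (1 − qn+1) σn + qn+1 δ_{Xn+1} with weights qn+1 = 1/(c + n + 1), the constant c, and the standard normal choice of σ0 are illustrative assumptions, not the new σ introduced by the authors.

import numpy as np

rng = np.random.default_rng(0)

def simulate_from_predictives(n_steps, sample_sigma0, c=1.0):
    """Draw X_1, ..., X_n directly from the predictive rule
    sigma_{n+1} = (1 - q_{n+1}) sigma_n + q_{n+1} delta_{X_{n+1}},
    with q_{n+1} = 1/(c + n + 1), so that after n observations
    sigma_n = (c * sigma_0 + sum of point masses at X_1..X_n) / (c + n).
    `sample_sigma0` draws from the marginal sigma_0; the recursion and
    weights are illustrative, not the specific sigma of the paper."""
    xs = []
    for n in range(n_steps):
        # sigma_n: with probability c/(c+n) sample a fresh value from
        # sigma_0, otherwise copy a past observation chosen uniformly.
        if rng.uniform() < c / (c + n):
            x = sample_sigma0(rng)
        else:
            x = xs[rng.integers(len(xs))]
        xs.append(x)
    return np.array(xs)

# Example: sigma_0 taken to be a standard normal marginal for X_1.
sample = simulate_from_predictives(1000, lambda r: r.normal())

This particular recursion recovers the classical Blackwell-MacQueen (Dirichlet process) predictives, so the simulated sequence is exchangeable and hence conditionally identically distributed; the paper's point is that many other recursions of this form can be assigned directly, without selecting any prior.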
