Bayesian Methods

Improving the Accuracy of Marginal Approximations in Likelihood-Free Inference via Localization

Pages 101-111 | Received 21 Jul 2022, Accepted 28 May 2023, Published online: 20 Jul 2023
Abstract

Likelihood-free methods are an essential tool for performing inference with implicit models that can be simulated from but whose likelihood is intractable. However, common likelihood-free methods do not scale well to a large number of model parameters. A promising approach to high-dimensional likelihood-free inference involves estimating low-dimensional marginal posteriors by conditioning only on summary statistics believed to be informative for the low-dimensional component, and then combining the low-dimensional approximations in some way. In this article, we demonstrate that such low-dimensional approximations can be surprisingly poor in practice for seemingly intuitive summary statistic choices. We describe an idealized low-dimensional summary statistic that is, in principle, suitable for marginal estimation. However, a direct approximation of the idealized choice is difficult in practice. We thus suggest an alternative approach to marginal estimation that is easier to implement and automate. Given an initial choice of low-dimensional summary statistic that might be informative only about the location of a marginal posterior, the new method improves performance in two steps: it first crudely localizes the posterior approximation using all the summary statistics to ensure global identifiability, and then homes in on an accurate low-dimensional approximation using only the low-dimensional summary statistic. We show that the posterior targeted by this approach can be represented as a logarithmic pool of posterior distributions based on the low-dimensional and full summary statistics, respectively. The good performance of our method is illustrated in low- to moderate-dimensional examples. Computer code implementing the methods for the examples in this article is available at https://github.com/cdrovandi/ABC-marginal-approximations.
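The two-step strategy in the abstract can be illustrated with a minimal rejection-ABC sketch. Everything below — the toy normal model, the summary choices, the prior ranges, and the acceptance fractions — is hypothetical and not taken from the article's examples; it shows only the mechanics of localizing with the full summary vector before refining with a low-dimensional summary.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy implicit model (illustrative only): y ~ N(theta1, theta2^2);
# the full summary vector is (sample mean, sample standard deviation).
def simulate_summaries(theta, n=100):
    y = rng.normal(theta[0], theta[1], size=n)
    return np.array([y.mean(), y.std()])

obs = np.array([0.0, 1.0])  # "observed" summaries

# Stage 1: crude localization using ALL summaries, keeping the draws
# whose full summary vector is closest to the observed one, so the
# retained region is globally identified.
prior_draws = rng.uniform([-5.0, 0.1], [5.0, 5.0], size=(20000, 2))
sims = np.array([simulate_summaries(t) for t in prior_draws])
keep = np.argsort(np.linalg.norm(sims - obs, axis=1))[:2000]
localized, loc_sims = prior_draws[keep], sims[keep]

# Stage 2: refine theta1's marginal using only the low-dimensional
# summary (the sample mean), assumed informative about that marginal.
final = localized[np.argsort(np.abs(loc_sims[:, 0] - obs[0]))[:200]]
```

Without stage 1, accepting on the low-dimensional summary alone could retain draws far from the observed data whenever that summary does not identify the parameter globally; the crude localization step rules those draws out before the refinement.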

Acknowledgments

The authors are grateful to three anonymous referees whose comments led to improvements in this article. CD is affiliated with the QUT Centre for Data Science. DJN is affiliated with the NUS Institute of Operations Research and Analytics, National University of Singapore.

Disclosure Statement

The authors report there are no competing interests to declare.

Funding

CD gratefully acknowledges support from the Australian Research Council Future Fellowship Award (FT210100260). DTF gratefully acknowledges support by the Australian Research Council through grant DE200101070.

Notes

1 Since most of the marginal posteriors do not have closed-form distributions, we normalize the densities using trapezoidal numerical integration for convenience, except for π(ϕ|s2), which is an inverse-gamma distribution.
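Trapezoidal normalization of a gridded density amounts to dividing by the trapezoidal-rule estimate of its integral. A minimal sketch, with a hypothetical Gaussian-shaped unnormalized density standing in for an ABC marginal:

```python
import numpy as np

def trapezoid(y, x):
    # Trapezoidal rule: sum the areas of the trapezoids between grid points.
    return np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0

# Hypothetical unnormalized marginal density on a grid (a Gaussian
# kernel here, purely for illustration).
grid = np.linspace(-5.0, 5.0, 1001)
unnorm = np.exp(-0.5 * grid**2)

# Divide by the trapezoidal estimate of the integral so the
# density integrates to 1 on the grid.
density = unnorm / trapezoid(unnorm, grid)
```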

2 In general, the weights for the logarithmic pool must be specified. This can be done in several ways; most commonly, point estimates of the weights are obtained from data (Poole and Raftery 2000). Alternatively, in certain settings, a prior distribution over the weights can be specified and a posterior distribution for the weights obtained via Bayes’ theorem (see, e.g., Carvalho et al. 2022).
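To make the pooling concrete: the logarithmic pool of densities p_i with weights w_i is proportional to ∏_i p_i^{w_i}, renormalized. A minimal numerical sketch on a grid, using two hypothetical Gaussian-shaped densities and equal weights (all choices here are illustrative, not the article's components):

```python
import numpy as np

def trapezoid(y, x):
    # Trapezoidal-rule estimate of the integral of y over the grid x.
    return np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0

def log_pool(densities, weights, grid):
    # Logarithmic pool: proportional to prod_i p_i(theta)^{w_i}.
    log_p = sum(w * np.log(p + 1e-300) for w, p in zip(weights, densities))
    pooled = np.exp(log_p - log_p.max())  # stabilize before exponentiating
    return pooled / trapezoid(pooled, grid)

# Two hypothetical marginal posteriors on a shared grid.
grid = np.linspace(-6.0, 6.0, 2001)
p1 = np.exp(-0.5 * ((grid - 1.0) / 0.8) ** 2)
p2 = np.exp(-0.5 * ((grid + 0.5) / 1.5) ** 2)

pooled = log_pool([p1, p2], [0.5, 0.5], grid)
```

With equal weights, the pool of two Gaussian densities is again Gaussian, with a mode between the two component modes that sits closer to the more concentrated component.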

3 From this result we also see that, for n large, √n{S3,x − θ2} | θ → N(0, θ2^2) in distribution, and the asymptotic distribution of the scaled and centered statistic depends only on θ2.