Bayesian and Monte Carlo Methods

Distilling Importance Sampling for Likelihood Free Inference

Pages 1461-1471 | Received 12 Apr 2022, Accepted 27 Jan 2023, Published online: 15 Mar 2023
