References
- Bradlow, E. T., and Zaslavsky, A. M. (1997), “Case Influence Analysis in Bayesian Inference,” Journal of Computational and Graphical Statistics, 6, 314–331. DOI: 10.2307/1390736.
- Breiman, L. (2001), “Random Forests,” Machine Learning, 45, 5–32. DOI: 10.1023/A:1010933404324.
- Chaloner, K., and Brant, R. (1988), “A Bayesian Approach to Outlier Detection and Residual Analysis,” Biometrika, 75, 651–659. DOI: 10.1093/biomet/75.4.651.
- Chipman, H., George, E., and McCulloch, R. (1998), “Bayesian CART Model Search,” Journal of the American Statistical Association, 93, 935–960. DOI: 10.1080/01621459.1998.10473750.
- Chipman, H., George, E., and McCulloch, R. (2010), “BART: Bayesian Additive Regression Trees,” The Annals of Applied Statistics, 4, 266–298. DOI: 10.1214/09-AOAS285.
- Cook, R. D., and Weisberg, S. (1982), Residuals and Influence in Regression, New York: Chapman and Hall.
- Denison, D., Mallick, B., and Smith, A. (1998), “A Bayesian CART Algorithm,” Biometrika, 85, 363–377.
- Gelfand, A. E., Dey, D. K., and Chang, H. (1992), “Model Determination Using Predictive Distributions with Implementation via Sampling-based Methods,” Technical Report, Department of Statistics, Stanford University, Stanford, CA.
- Ghugare, S. B., Tiwary, S., Elangovan, V., and Tambe, S. S. (2014), “Prediction of Higher Heating Value of Solid Biomass Fuels Using Artificial Intelligence Formalisms,” BioEnergy Research, 7, 681–692. DOI: 10.1007/s12155-013-9393-5.
- Geisser, S. (2017), Predictive Inference: An Introduction, Boca Raton, FL: Chapman and Hall/CRC.
- Gramacy, R. B., and Apley, D. W. (2015), “Local Gaussian Process Approximation for Large Computer Experiments,” Journal of Computational and Graphical Statistics, 24, 561–578. DOI: 10.1080/10618600.2014.914442.
- Hahn, P. R., Murray, J. S., and Carvalho, C. M. (2020), “Bayesian Regression Tree Models for Causal Inference: Regularization, Confounding, and Heterogeneous Effects” (with discussion), Bayesian Analysis, 15, 965–1056. DOI: 10.1214/19-BA1195.
- Hill, J. L. (2011), “Bayesian Nonparametric Modeling for Causal Inference,” Journal of Computational and Graphical Statistics, 20, 217–240. DOI: 10.1198/jcgs.2010.08162.
- Horiguchi, A., Pratola, M. T., and Santner, T. J. (2021), “Assessing Variable Activity for Bayesian Regression Trees,” Reliability Engineering & System Safety, 207, 107391. DOI: 10.1016/j.ress.2020.107391.
- Horiguchi, A., Santner, T. J., Sun, Y., and Pratola, M. T. (2022), “Using BART for Quantifying Uncertainties in Multiobjective Optimization of Noisy Objectives.” arXiv:2101.02558.
- Johnson, W., and Geisser, S. (1983), “A Predictive View of the Detection and Characterization of Influential Observations in Regression Analysis,” Journal of the American Statistical Association, 78, 137–144. DOI: 10.1080/01621459.1983.10477942.
- Linero, A. R. (2018), “Bayesian Regression Trees for High-Dimensional Prediction and Variable Selection,” Journal of the American Statistical Association, 113, 626–636. DOI: 10.1080/01621459.2016.1264957.
- Liu, H., Nattino, G., and Pratola, M. T. (2020), “Sparse Additive Gaussian Process Regression.” arXiv:1908.08864.
- MacKay, D. J. C. (1995), “Probable Networks and Plausible Predictions – A Review of Practical Bayesian Methods for Supervised Neural Networks,” Network: Computation in Neural Systems, 6, 469–505. DOI: 10.1088/0954-898X_6_3_011.
- Owen, A. B. (2013), Monte Carlo Theory, Methods and Examples. https://artowen.su.domains/mc/
- Pettit, L. (1990), “The Conditional Predictive Ordinate for the Normal Distribution,” Journal of the Royal Statistical Society, Series B, 52, 175–184. DOI: 10.1111/j.2517-6161.1990.tb01780.x.
- Picheny, V., Wagner, T., and Ginsbourger, D. (2013), “A Benchmark of Kriging-based Infill Criteria for Noisy Optimization,” Structural and Multidisciplinary Optimization, 48, 607–626. DOI: 10.1007/s00158-013-0919-4.
- Pratola, M., and Higdon, D. (2014), “Bayesian Regression Tree Calibration of Complex High-Dimensional Computer Models,” Technometrics, 58, 166–179. DOI: 10.1080/00401706.2015.1049749.
- Pratola, M. T. (2016), “Efficient Metropolis-Hastings Proposal Mechanisms for Bayesian Regression Tree Models,” Bayesian Analysis, 11, 885–911. DOI: 10.1214/16-BA999.
- Starling, J. E., Murray, J. S., Carvalho, C. M., Bukowski, R. K., and Scott, J. G. (2020), “BART with Targeted Smoothing: An Analysis of Patient-Specific Stillbirth Risk,” The Annals of Applied Statistics, 14, 28–50. DOI: 10.1214/19-AOAS1268.
- Tan, Y. V., and Roy, J. (2019), “Bayesian Additive Regression Trees and the General BART Model,” Statistics in Medicine, 38, 5048–5069. DOI: 10.1002/sim.8347.
- Weisberg, S. (2013), Applied Linear Regression, New York: Wiley.
- Zellner, A. (1975), “Bayesian Analysis of Regression Error Terms,” Journal of the American Statistical Association, 70, 138–144. DOI: 10.1080/01621459.1975.10480274.
- Zellner, A., and Moulton, B. R. (1985), “Bayesian Regression Diagnostics with Applications to International Consumption and Income Data,” Journal of Econometrics, 29, 187–211. DOI: 10.1016/0304-4076(85)90039-9.