Uncertainty in Artificial Neural Network Models: Monte-Carlo Simulations Beyond the GUM Boundaries

References

  • Adamu, A., Maul, T., & Bargiela, A. (2013). On training neural networks with transfer function diversity. In International Conference on Computational Intelligence and Information Technology, CIIT 13, Mumbai, India. Elsevier.
  • Aksu, G., Guzeller, C., & Eser, M. T. (2019). The effect of the normalization method used in different sample sizes on the success of artificial neural network model. International Journal of Assessment Tools in Education, 6(2), 170–192. https://doi.org/10.21449/ijate.479404
  • Ashfahani, A., Pratama, M., Lughofer, E., & Ong, Y.-S. (2020). DEVDAN: Deep evolving denoising autoencoder. Neurocomputing, 390, 297–314. https://doi.org/10.1016/j.neucom.2019.07.106
  • Behler, J. (2021). Four generations of high-dimensional neural network potentials. Chemical Reviews, 121(6), 10037–10072. https://doi.org/10.1021/acs.chemrev.0c00868
  • Bishop, C. M. (1995). Neural networks for pattern recognition. Oxford University Press. https://doi.org/10.1201/9781420050646.ptb6
  • Chattopadhyay, S., & Chattopadhyay, G. (2018). Conjugate gradient descent learned ANN for Indian summer monsoon rainfall and efficiency assessment through Shannon-Fano coding. Journal of Atmospheric and Solar-Terrestrial Physics, 179, 202–205. https://doi.org/10.1016/j.jastp.2018.07.015
  • Chen, G., Wang, H., Bezold, A., Broeckmann, C., Weichert, D., & Zhang, L. (2019). Strengths prediction of particulate reinforced metal matrix composites (PRMMCs) using direct method and artificial neural network. Composite Structures, 223, 110951. https://doi.org/10.1016/j.compstruct.2019.110951
  • Choong, C. C., Ibrahim, S., & El-Shafie, A. (2020). Artificial neural network (ANN) model development for predicting just suspension speed in solid-liquid mixing system. Flow Measurement and Instrumentation, 71, 101689. https://doi.org/10.1016/j.flowmeasinst.2019.101689
  • Chu, J., Liu, X., Zhang, Z., Zhang, Y., & He, M. (2021). A novel method overcoming overfitting of artificial neural network for accurate prediction: Application on thermophysical property of natural gas. Case Studies in Thermal Engineering, 28, 101406. https://doi.org/10.1016/j.csite.2021.101406
  • De Weerdt, J., & Weytjens, H. (2022). Learning uncertainty with artificial neural networks for predictive process monitoring. Applied Soft Computing, 125, 109134. https://doi.org/10.1016/j.asoc.2022.109134
  • Ebadi, M., Zabihifar, S. H., Bezyan, Y., & Koroteev, D. (2021). A nonlinear solver based on an adaptive neural network, introduction and application to porous media flow. Journal of Natural Gas Science & Engineering, 87, 103749. https://doi.org/10.1016/j.jngse.2020.103749
  • Foresee, F. D., & Hagan, M. T. (1997). Gauss-Newton approximation to Bayesian learning. In Proceedings of the 1997 International Joint Conference on Neural Networks, Houston, TX.
  • Guo, L., Liu, D., Wu, Y., & Xu, G. (2023). Comparison of spiking neural networks with different topologies based on anti-disturbance ability under external noise. Neurocomputing, 529, 113–117. https://doi.org/10.1016/j.neucom.2023.01.085
  • Hajeb, M., Hamzeh, S., Alavipanah, S. K., Neissi, L., & Verrelst, J. (2023). Simultaneous retrieval of sugarcane variables from Sentinel-2 data using Bayesian regularized neural network. International Journal of Applied Earth Observation and Geoinformation, 116, 103168. https://doi.org/10.1016/j.jag.2022.103168
  • Jiang, H., Kim, B., Guan, M., & Gupta, M. (2018). To trust or not to trust a classifier. In Advances in Neural Information Processing Systems, Montréal, Canada.
  • Honarnezhad, R., Fathinia, M., & Khataee, A. (2019). Mechanical production and sonocatalytic application of Cu2S nanoparticles for degradation of isopropylxanthic acid: Kinetic modeling via white and black box methods. Journal of Molecular Liquids, 287, 110899. https://doi.org/10.1016/j.molliq.2019.110899
  • Hüllermeier, E., & Waegeman, W. (2021). Aleatoric and epistemic uncertainty in machine learning: An introduction to concepts and methods. Machine Learning, 110(3), 457–506.
  • JCGM-100. (2008). Evaluation of measurement data – Guide to the expression of uncertainty in measurement. JCGM.
  • JCGM-101. (2008). Evaluation of measurement data – Supplement 1 to the “Guide to the expression of uncertainty in measurement” – Propagation of distributions using a Monte Carlo method. JCGM.
  • Kros, J. F., Lin, M., & Brown, M. L. (2006). Effects of the neural network s-sigmoid function on KDD in the presence of imprecise data. Computers & Operations Research, 33(11), 3136–3149. https://doi.org/10.1016/j.cor.2005.01.024
  • Lee, S.-H., Olevano, V., & Sklenard, B. (2023). A generalizable, uncertainty-aware neural network potential for GeSbTe with Monte Carlo dropout. Solid-State Electronics, 199, 108508. https://doi.org/10.1016/j.sse.2022.108508
  • Lin, H., & Wang, J. C. (2019). Percolation of a random network by statistical physics method. International Journal of Modern Physics C, 30(2n03), 1950009. https://doi.org/10.1142/S0129183119500098
  • Li, Z., Ren, T., Xu, Y., & Jin, J. Y. (2018). The relationship between synchronization and percolation for regular networks. Physica A, 492, 375–381. https://doi.org/10.1016/j.physa.2017.10.003
  • MacKay, D. (1992). Bayesian interpolation. Neural Computation, 4(3), 415–447. https://doi.org/10.1162/neco.1992.4.3.415
  • Madhavan, S., & Kumar, N. (2021). Incremental methods in face recognition: A survey. Artificial Intelligence Review, 54(1), 253–303. https://doi.org/10.1007/s10462-019-09734-3
  • Miikkulainen, R. (2011). Topology of a neural network. In C. Sammut & G. I. Webb (Eds.), Encyclopedia of machine learning (pp. 988–989). Springer.
  • Nguyen, T., Ly, K.-D., Nguyen-Thoi, T., Nguyen, B.-P., & Doan, N.-P. (2022). Prediction of axial load bearing capacity of PHC nodular pile using Bayesian regularization artificial neural network. Soils and Foundations, 62(5), 101203. https://doi.org/10.1016/j.sandf.2022.101203
  • Panigrahy, S. K., Tseng, Y.-C., Lai, B.-R., & Chiang, K.-N. (2021). Overview of AI-assisted design-on-simulation technology for reliability life prediction of advanced packaging. Materials, 14(18), 5342. https://doi.org/10.3390/ma14185342
  • Paudel, A., Gupta, S., Thapa, M., Mulani, S. B., & Walters, R. W. (2022). Higher-order Taylor series expansion for uncertainty quantification with efficient local sensitivity. Aerospace Science and Technology, 126, 107574. https://doi.org/10.1016/j.ast.2022.107574
  • Rubio, J. D.-J., Islas, M. A., Ochoa, G., Cruz, D. R., Garcia, E., & Pacheco, J. (2022). Convergent Newton method and neural network for the electric energy usage prediction. Information Sciences, 585, 89–112. https://doi.org/10.1016/j.ins.2021.11.038
  • Sadek, A. M. (2020). Simulating the response of ionization chamber system to 137Cs irradiator using the artificial neural network modeling algorithm. SN Applied Sciences, 2(8), 1325. https://doi.org/10.1007/s42452-020-3111-7
  • Samanta, S., Pratama, M., Sundaram, S., & Srikanth, N. (2020). Learning elastic memory online for fast time series forecasting. Neurocomputing, 390, 315–326. https://doi.org/10.1016/j.neucom.2019.07.105
  • Sega, M., Pennecchi, F., Rinaldi, S., & Rolle, F. (2016). Uncertainty evaluation for the quantification of low masses of benzo[a]pyrene: Comparison between the law of propagation of uncertainty and the Monte Carlo method. Analytica Chimica Acta, 920, 10–17. https://doi.org/10.1016/j.aca.2016.03.032
  • Sellat, Q., Bisoy, S. K., & Priyadarshini, R. (2022). Chapter 10 - Semantic segmentation for self-driving cars using deep learning: A survey. In Cognitive big data intelligence with metaheuristic approach (pp. 211–238). Academic Press. https://doi.org/10.1016/B978-0-323-85117-6.00002-9
  • Shi, S., Kuschmierz, R., Zhang, G., Lin, J., Czarske, J., & Qu, J. (2020). Modeling, quantification, and mitigation of uncertainty propagation in two-step roundness measurements. Measurement, 155, 107530. https://doi.org/10.1016/j.measurement.2020.107530
  • Sola, J., & Sevilla, J. (1997). Importance of input data normalization for the application of neural networks to complex industrial problems. IEEE Transactions on Nuclear Science, 44(3), 1464–1468. https://doi.org/10.1109/23.589532
  • Stoffel, M., Gulakala, R., Bamer, F., & Markert, B. (2020). Artificial neural networks in structural dynamics: A new modular radial basis function approach vs. convolutional and feedforward topologies. Computer Methods in Applied Mechanics and Engineering, 364, 112989. https://doi.org/10.1016/j.cma.2020.112989
  • Wang, X., Che, M., & Wei, Y. (2020). Neural network approach for solving nonsingular multi-linear tensor systems. Journal of Computational and Applied Mathematics, 368, 112569. https://doi.org/10.1016/j.cam.2019.112569
  • Nguyen, D., & Widrow, B. (1990). Improving the learning speed of 2-layer neural networks by choosing initial values of the adaptive weights. In IJCNN International Joint Conference on Neural Networks, San Diego, CA. IEEE.
  • Wong, K., Dornberger, R., & Hanne, T. (2022). An analysis of weight initialization methods in connection with different activation functions for feedforward neural networks. Evolutionary Intelligence. https://doi.org/10.1007/s12065-022-00795-y
  • Xu, Y., & Goodacre, R. (2018). On splitting training and validation set: A comparative study of cross-validation, Bootstrap and systematic sampling for estimating the generalization performance of supervised learning. Journal of Analysis and Testing, 2(3), 249–262. https://doi.org/10.1007/s41664-018-0068-2
  • Yam, J., & Chow, T. (2000). A weight initialization method for improving training speed in feedforward neural network. Neurocomputing, 30(1–4), 219–232. https://doi.org/10.1016/S0925-2312(99)00127-7
  • Yang, Z., Zhang, Q., & Chen, Z. (2019). Adaptive distribution convex optimization for multi-agent and its application in flocking behavior. Journal of the Franklin Institute, 356(2), 1038–1050. https://doi.org/10.1016/j.jfranklin.2018.05.004
  • Yilmaz, M., & Arslan, E. (2011). Effect of increasing number of neurons using artificial neural network to estimate geoid heights. International Journal of Physical Sciences, 6(3), 529–533.
  • Zhu, C. (2021). Chapter 2 - The basics of natural language processing. In Machine reading comprehension (pp. 27–46). Elsevier. https://doi.org/10.1016/B978-0-323-90118-5.00002-3
