Research Article

Strength assessment of structural masonry walls: analysis based on machine learning approaches

Pages 505-524 | Received 03 Jan 2024, Accepted 19 Mar 2024, Published online: 09 Apr 2024

ABSTRACT

In conventional masonry buildings, masonry walls are key structural load-bearing elements. Likewise, masonry infill walls strengthen framed constructions against lateral stress. The material characteristics of brick units and mortar determine the compressive strength of structural masonry walls. In this study, advanced machine learning (ML) techniques were utilized to estimate the compressive strength of structural masonry walls based on the material properties of brick units and mortar. The Young’s modulus of brick units (Eu), compressive strength of brick units (Fcu), Young’s modulus of mortar (Em), and compressive strength of mortar (Fcm) were used as input parameters to model the compressive strength (Fc) of the structural masonry wall. Gradient Tree Boosting (GTB), Elman Neural Network (ENN), and Multivariate Adaptive Regression Splines (MARS) models were developed using four diverse input and output (I/O) combinations to explore the effect of each input parameter on output estimation. The data used for modeling were obtained from prior studies published in the literature. The models’ performance was evaluated using different statistical (error and efficiency) indices. For the third and fourth I/O combinations, the MARS model significantly outperformed the other models; for the first I/O combination, the GTB model performed best. The study also revealed that the compressive strength of a structural masonry wall is more likely to depend on the strength and quality of the brick units than on the strength of the mortar.

Introduction

Masonry walls, made up of individual block units that are laid out and bonded with mortar, are among the key structural elements of any building. The block units commonly used in masonry walls are burnt clay bricks, stones, and solid or hollow concrete blocks. Compared to framed constructions, load-bearing masonry structures are economical and perform well for low-rise buildings [Citation1]. A masonry wall consists of two different materials, and the bond between them is usually too weak to sustain lateral loads. As a result, masonry walls are appropriate where primarily compressive loads act. Masonry infill walls provide significant lateral stiffness, ductility, and energy dissipation capacity in framed structures, allowing them to perform better under earthquake loading [Citation2]. Masonry structures are used in virtually all types of buildings around the world due to their good thermal and acoustic insulation properties, low construction costs, and use of locally available materials. The strength of a masonry structure depends on the intrinsic strength and properties of the brick units and the mortar. The compressive strength and Young’s modulus of brick units vary with the type of block, while those of mortar depend on the cement-to-fine-aggregate ratio. Together, the compressive strength and Young’s modulus of the brick units and mortar determine the overall compressive strength of any masonry construction [Citation3].

Previous research on masonry construction can broadly be classified into two categories: first, investigations of the mechanical properties of brick masonry walls or prism units (with different forms of assemblage); and second, studies of the in-plane shear behavior of masonry wall elements. Several researchers have carried out experimental and numerical investigations of the behavior of masonry walls under compressive load [Citation4–7]. Here, the behavior of masonry refers to its performance in terms of compressive strength, Young’s modulus, stress-strain response, and failure patterns. The brick-mortar bond strength is an important performance characteristic of masonry structures. Per code provisions, the performance of masonry construction can be analyzed in two ways: through studies on full-scale walls or prism specimens, or through tests on masonry components such as brick units and mortar [Citation8]. The performance of masonry construction also depends on the height-to-thickness (slenderness) ratio and the application of eccentric loads; the strength capacity of masonry decreases with increasing slenderness ratio and eccentricity, particularly for brick masonry walls [Citation9,Citation10]. A masonry wall can achieve sufficient compressive strength even when strong, stiff bricks are combined with a weaker, less stiff mortar [Citation11]. Analyzing masonry walls under diagonal and shear compression loads is important because infill walls are susceptible to failure under diagonal compression [Citation12]. Autoclaved aerated concrete blocks have emerged as lightweight alternatives to traditional concrete blocks and meet the requirements for masonry units [Citation13]. The performance of masonry walls can be evaluated based on their load-carrying capacity, ductility, energy dissipation capacity, and stiffness degradation [Citation14].
Studies have shown that under dynamic loading, single-story masonry structures fail out-of-plane, while two-story structures fail in shear [Citation15]. Uniform and toothed confinement are two types of confinement used in masonry structures; both exhibit similar failure mechanisms but differ in their ability to withstand lateral loads [Citation15]. A comprehensive literature review, analyzing and synthesizing these sources, is presented in Table 1.

Table 1. Previous research on masonry wall testing and modeling.

Experimental investigations of masonry walls require well-equipped laboratories and skilled labor. Additionally, studying full-scale walls under various factors can be time-consuming. To address these challenges, researchers have turned to finite element (FE) analysis using tools such as ANSYS and ABAQUS [Citation19–21]. Modeling masonry structures involves different approaches, such as block-based, continuum, macro-element, and geometry-based models. Choosing the right modeling technique is crucial, as each requires complete material properties of both units and mortar. FE modeling relies on data from experimental investigations, supplemented by assumed properties often sourced from building codes. Due to these complexities, obtaining results for compressive strength, failure patterns, and stress-strain behavior that match experimental studies can be challenging [Citation22].

In addition to conventional analytical and numerical methods, soft computing or ML techniques have emerged as efficient prediction tools utilizing experimental data. These techniques are particularly useful for complex systems where analytical and numerical methods may not be feasible or accurate. This study applies three ML techniques – Gradient Tree Boosting (GTB), Elman Neural Networks (ENN), and Multivariate Adaptive Regression Splines (MARS) – to predict the compressive strength of masonry walls based on the compressive strength and Young’s modulus of masonry units and mortar. Distinct input-output combinations were adopted for model calibration, providing a unique perspective on the predictive significance of each considered variable.

Methodology

Data analysis

The data for modeling the compressive strength of masonry walls were obtained from previous research work [Citation19]. The experiments, conducted by several researchers [Citation16,Citation17,Citation23,Citation24], studied the compressive strength of full-scale masonry walls using different block and mortar types. The analysis considered the following parameters: Young’s modulus of brick units (Eu), compressive strength of brick units (Fcu), Young’s modulus of mortar (Em), and compressive strength of mortar (Fcm). The entire dataset was divided into a 60% training set and a 40% testing set. The statistical parameters (maximum, minimum, coefficient of variation (CV), skewness, mean, and standard deviation) were calculated and tabulated in Table 2. Figure 1 shows the dependency of the compressive strength of masonry walls (Fc) on Eu, Fcu, Em, and Fcm in terms of mutual information and F-test statistics. To investigate the effect of each input parameter on compressive strength, different input-output (I/O) combinations were considered. As shown in Table 3, four different I/O combinations were used to develop models and compare their effectiveness in predicting Fc.
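As a minimal illustration of the data preparation described above, the following NumPy sketch computes the tabulated descriptive statistics (using a simple, uncorrected skewness estimate) and performs the 60/40 split. Variable names are illustrative, not taken from the study's code.

```python
import numpy as np

def describe(col):
    """Descriptive statistics as tabulated for each parameter:
    max, min, mean, standard deviation, CV, and (uncorrected) skewness."""
    m = col.mean()
    s = col.std(ddof=1)                    # sample standard deviation
    skew = np.mean(((col - m) / s) ** 3) if s > 0 else 0.0
    return {"max": col.max(), "min": col.min(), "mean": m,
            "std": s, "cv": s / m, "skew": skew}

def split_60_40(n, seed=0):
    """Shuffle sample indices and cut at 60% for the train/test split."""
    idx = np.random.default_rng(seed).permutation(n)
    cut = int(0.6 * n)
    return idx[:cut], idx[cut:]
```

A `describe` call per column (Eu, Fcu, Em, Fcm, Fc) would reproduce the layout of Table 2.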

Figure 1. Feature selection using mutual information and F-test statistics.


Table 2. Descriptive statistics of masonry wall dataset.

Table 3. Input-output (I/O) combinations derived based on mutual information criteria.

Theoretical overview

Gradient Tree Boosting (GTB)

Gradient Tree Boosting is an ensemble machine learning algorithm that combines multiple ‘weak’ learners into a single ‘strong’ learner. These weak learners are typically shallow decision trees. Each tree is trained on the pseudo-residuals (rmi) of the current model, which are calculated for each data point as the negative gradient of the loss function with respect to the predicted value.

(1) $r_{mi} = -\left[\dfrac{\partial L\big(y_i, f_{m-1}(x_i)\big)}{\partial f_{m-1}(x_i)}\right]$

Here, $\{(x_i, y_i)\}_{i=1}^{n}$ denotes the dataset with n data points, $x_i$ being the feature vector and $y_i$ the target value; $f_m(x)$ is the model prediction at iteration m; and $L(y, f(x))$ is the loss function measuring prediction error. The prediction of the new weak learner $h_m(x)$ is added to the existing model, weighted by the learning rate ($\alpha_m$) to prevent overfitting. A small learning rate leads to a more conservative update, while a large one can lead to faster convergence with the risk of overfitting. Regularization techniques, such as limiting tree depth or the number of leaves, are often used to prevent overfitting. Gradient descent minimizes the loss function by iteratively updating the model parameters. The boosting process iterates until a stopping criterion is met, and the ensemble approach makes the model less susceptible to outliers and noise in the data. Gradient tree boosting offers a compelling combination of accuracy, flexibility, and robustness, making it a valuable tool for various machine learning tasks. For further details on GTB, refer to the following literature [Citation25,Citation26].
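The procedure above can be sketched for squared loss, where the negative gradient reduces to the ordinary residual. The following is a self-contained NumPy toy illustration using depth-1 trees (stumps) as the weak learners; it is a sketch of the technique, not the implementation used in the study.

```python
import numpy as np

def fit_stump(X, r):
    """Depth-1 regression tree fit to the pseudo-residuals r by least squares."""
    best = None
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j])[:-1]:          # candidate split thresholds
            left = X[:, j] <= t
            lm, rm = r[left].mean(), r[~left].mean()
            sse = ((r[left] - lm) ** 2).sum() + ((r[~left] - rm) ** 2).sum()
            if best is None or sse < best[0]:
                best = (sse, j, t, lm, rm)
    return best[1:]

def predict_stump(stump, X):
    j, t, lm, rm = stump
    return np.where(X[:, j] <= t, lm, rm)

def gtb_fit(X, y, n_trees=200, lr=0.1):
    """Squared-loss boosting: each stump fits the negative gradient (here the
    plain residual y - f) and is added scaled by the learning rate."""
    f0 = y.mean()
    f = np.full(len(y), f0)
    stumps = []
    for _ in range(n_trees):
        s = fit_stump(X, y - f)                    # pseudo-residuals r_mi
        f += lr * predict_stump(s, X)
        stumps.append(s)
    return f0, lr, stumps

def gtb_predict(model, X):
    f0, lr, stumps = model
    f = np.full(len(X), f0)
    for s in stumps:
        f += lr * predict_stump(s, X)
    return f
```

Library implementations additionally regularize via tree depth, subsampling, and early stopping, as noted above.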

Elman Neural Network (ENN)

The ENN, introduced by Jeffrey Elman, was the first successful recurrent network trained with backpropagation, hence the name Elman Neural Network [Citation27]. The benefits of the ENN are a reduced learning time for data patterns and increased accuracy of the predicted results. Elman networks are more powerful than basic feedforward networks in modeling because the ENN introduces a crucial recurrent connection within the hidden layer. This loop creates a context layer that remembers past activations, allowing the network to learn and utilize temporal dependencies. The recurrent connection is the defining feature of Elman networks: the context layer stores the activations from the previous hidden layer, serving as a form of internal memory. The learning rate controls the step size used to update the weights during backpropagation, impacting the speed and convergence of training. The training algorithm allows the network to learn from sequential data by propagating errors backward through time and adjusting the weights accordingly. Optimizing parameters such as the learning rate, number of neurons, and activation functions is crucial for achieving optimal performance. For further details on ENN, refer to the following literature [Citation27,Citation28].
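The context-layer mechanism can be illustrated with a minimal forward pass. In this NumPy sketch the weights are assumed given (training by backpropagation through time is omitted); it shows how the previous hidden activation is fed back at each step.

```python
import numpy as np

def elman_forward(x_seq, W_in, W_rec, W_out, b_h, b_o):
    """Forward pass of an Elman network: at each step the context layer
    (the previous hidden activation) is fed back into the hidden layer."""
    h = np.zeros(W_rec.shape[0])                   # context layer, initially zero
    outputs = []
    for x in x_seq:
        h = np.tanh(W_in @ x + W_rec @ h + b_h)    # hidden uses the context
        outputs.append(W_out @ h + b_o)            # linear output layer
    return np.array(outputs), h
```

Because the context layer carries state, feeding the same input twice generally produces two different outputs, which is exactly the internal memory described above.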

Multivariate Adaptive Regression Splines (MARS)

Multivariate Adaptive Regression Splines (MARS) is a non-linear regression algorithm that approximates complex, multivariate non-linear relationships with a set of simple piecewise linear functions. MARS builds its model from piecewise linear basis functions created by splitting a feature’s range at knots; these knots are selected automatically, making the model adaptive and flexible. The algorithm employs a generalized cross-validation (GCV) criterion to select the best model complexity and avoid overfitting. Building the final MARS model involves two crucial steps: a forward stage and a pruning stage. During the forward stage, MARS evaluates all possible knots for each feature, choosing knots and basis functions based on their impact on the model’s error reduction; once forward selection ends, the model contains the maximum number of basis functions. In the pruning stage, each basis function’s contribution to the model’s fit is assessed using a shrinkage penalty, and the function with the smallest contribution is removed. The target prediction is the weighted sum of the outputs of all basis functions in the model. For further details on MARS, refer to the following literature [Citation29,Citation30].
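The hinge basis functions at the heart of MARS can be illustrated as follows. In this hedged NumPy sketch the knots are supplied by hand and the coefficients are fit by least squares; the forward/pruning knot search and the GCV criterion described above are omitted.

```python
import numpy as np

def hinge(x, knot, sign):
    """MARS basis function: a hinge max(0, sign*(x - knot))."""
    return np.maximum(0.0, sign * (x - knot))

def basis_matrix(x, knots):
    """Intercept plus a mirrored hinge pair at each knot."""
    return np.column_stack(
        [np.ones_like(x)] + [hinge(x, k, s) for k in knots for s in (+1, -1)])

def fit_mars_like(x, y, knots):
    """Least-squares coefficients for the fixed-knot basis."""
    coef, *_ = np.linalg.lstsq(basis_matrix(x, knots), y, rcond=None)
    return coef

def predict_mars_like(coef, x, knots):
    return basis_matrix(x, knots) @ coef
```

A piecewise linear target such as |x - 2| is represented exactly by the mirrored hinge pair at knot 2, which is why MARS handles kinked responses well.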

Model development

The compressive strength of masonry walls was predicted using four input parameters: Young’s modulus of brick units (Eu), compressive strength of brick units (Fcu), Young’s modulus of mortar (Em), and compressive strength of mortar (Fcm). The whole dataset was divided into approximately 60% training and 40% testing subsets. The GTB, ENN, and MARS models were developed by optimizing their parameters. The model architecture and performance depend on the input-output (I/O) combinations listed in Table 3. The details of model development and the optimal parameters selected for predicting the compressive strength of masonry walls are tabulated in Table 4. The models were developed using four different I/O combinations, and their performance was analyzed using statistical metrics. The methodology flowchart is presented in Figure 2.

Figure 2. Methodology flowchart.


Table 4. Model parameters optimized by trial-and-error approach.

Performance evaluation

The predicted compressive strength from the models is compared with the actual experimental values. The models’ prediction efficiency and error rates are evaluated using the following statistical indices: relative root mean square error (RRMSE), normalized Nash-Sutcliffe efficiency (NNSE), Willmott index (WI), mean absolute error (MAE), and Kling-Gupta efficiency (KGE).

Relative Root Mean Square Error,

(2) $\mathrm{RRMSE} = \dfrac{\mathrm{RMSE}}{\sigma_{obs}}$, $\quad 0 \le \mathrm{RRMSE} \le 1$

where, Root Mean Square Error,

(3) $\mathrm{RMSE} = \sqrt{\dfrac{\sum_{i=1}^{N}(X_i - Y_i)^2}{N}}$

Normalized Nash-Sutcliffe efficiency,

(4) $\mathrm{NNSE} = \dfrac{1}{2 - \mathrm{NSE}}$, $\quad 0 \le \mathrm{NNSE} \le 1$

where the Nash-Sutcliffe efficiency,

(5) $\mathrm{NSE} = 1 - \dfrac{\sum_{i=1}^{N}(X_i - Y_i)^2}{\sum_{i=1}^{N}(X_i - \bar{X})^2}$

Willmott index (WI),

(6) $\mathrm{WI} = 1 - \dfrac{\sum_{i=1}^{N}|X_i - Y_i|^{j}}{\sum_{i=1}^{N}\left(|Y_i - \bar{X}| + |X_i - \bar{X}|\right)^{j}}$, $\quad 0 \le \mathrm{WI} \le 1$

Mean Absolute Error (MAE),

(7) $\mathrm{MAE} = \dfrac{1}{N}\sum_{i=1}^{N}|X_i - Y_i|$

Kling-Gupta efficiency,

(8) $\mathrm{KGE} = 1 - \sqrt{(r-1)^2 + (\beta-1)^2 + (\gamma-1)^2}$, $\quad \mathrm{KGE} \le 1$

where $\sigma_{obs}$ = standard deviation of observed data; X = observed/measured values; Y = predicted values; N = number of data values; $\bar{X}$ = mean of observed data; $\bar{Y}$ = mean of predicted data; j = exponent term.

Bias ratio: $\beta = \bar{Y}/\bar{X}$

Variability ratio: $\gamma = \dfrac{CV_Y}{CV_X} = \dfrac{\sigma_Y/\bar{Y}}{\sigma_X/\bar{X}}$, where CV is the coefficient of variation.

r = Pearson’s linear correlation coefficient.
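The indices above can be computed directly. The NumPy sketch below assumes the Willmott exponent j = 1 and population standard deviations, choices the text does not specify.

```python
import numpy as np

def eval_metrics(X, Y, j=1):
    """Indices from Equations (2)-(8); X = observed, Y = predicted."""
    rmse = np.sqrt(np.mean((X - Y) ** 2))
    rrmse = rmse / X.std()                                    # Eq. (2)
    nse = 1 - np.sum((X - Y) ** 2) / np.sum((X - X.mean()) ** 2)
    nnse = 1 / (2 - nse)                                      # Eq. (4)
    wi = 1 - (np.sum(np.abs(X - Y) ** j) /
              np.sum((np.abs(Y - X.mean()) + np.abs(X - X.mean())) ** j))
    mae = np.mean(np.abs(X - Y))                              # Eq. (7)
    r = np.corrcoef(X, Y)[0, 1]
    beta = Y.mean() / X.mean()                                # bias ratio
    gamma = (Y.std() / Y.mean()) / (X.std() / X.mean())       # variability ratio
    kge = 1 - np.sqrt((r - 1) ** 2 + (beta - 1) ** 2 + (gamma - 1) ** 2)
    return {"RRMSE": rrmse, "NNSE": nnse, "WI": wi, "MAE": mae, "KGE": kge}
```

A perfect prediction (Y = X) gives RRMSE = MAE = 0 and NNSE = WI = KGE = 1, which is a convenient sanity check on any implementation of these indices.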

Results and discussion

This study employed machine learning techniques, namely Gradient Tree Boosting (GTB), Elman Neural Networks (ENN), and Multivariate Adaptive Regression Splines (MARS), to predict the compressive strength (Fc) of masonry walls. The input parameters Eu, Fcu, Em, and Fcm were used to predict Fc as the output parameter. The results obtained from the models for the different input and output combinations are presented and discussed in detail below. Violin and swarm plots, presented in Figure 3, offer a greater ability to visualize data density and distribution than box plots, especially when the data have one or more peaks; a detailed comparison of the actual Fc with the GTB, ENN, and MARS model predictions is presented using these plots. Figure 4 presents scatter plots showing the coefficient of determination (R2) between the actual (experimental) Fc and the Fc predicted by the GTB, ENN, and MARS techniques for the different input and output parameter combinations. Finally, the Taylor diagrams in Figure 5 provide a graphical comparison of the GTB, ENN, and MARS predictions with the actual Fc values, with the aid of statistical indices (standard deviation, root mean square deviation, and correlation coefficient).

Figure 3. Violin and swarm plots for comparative performance evaluation of models.


Figure 4. Scatter plots of actual v/s predicted compressive strength (test phase data).


Figure 5. Taylor diagrams to graphically indicate the performance of each model in terms of RMSD, correlation coefficient and Standard deviation statistics (test phase data).


Performance of models: first I/O combination

In the first I/O combination, where the input parameters Eu, Fcu, and Fcm were used for training and testing, the GTB model outperformed the ENN and MARS models in terms of efficiency and error reduction, achieving a WI value of 0.9877 (test phase), as shown in Table 5. The GTB technique achieved significantly lower MAE and RMSE values during both training and testing, with a larger margin over the other models in the training phase. In the testing phase the margin was smaller, with an MAE of 2.4823 MPa and an RMSE of 2.9524 MPa, indicating fewer and smaller errors in the GTB predictions compared to the other models. Moreover, the GTB technique demonstrated superior accuracy across the RRMSE, NSE, and KGE metrics.

Table 5. Performance evaluation metrics of the models developed.

Further analysis through the swarm and violin plots (Figure 3) revealed that while the median Fc values for all models hovered around 10 MPa, the GTB model exhibited a closer affinity to the actual Fc data. Additionally, the scatter plots (Figure 4) confirmed the strong performance of the GTB and MARS models, with coefficients of determination (R2) of 0.9555 and 0.9045, respectively. Notably, in the Taylor diagram (Figure 5), the MARS model precisely overlapped with the red line (actual Fc data) in the first I/O combination, highlighting its predictive accuracy. However, the GTB model demonstrated a superior correlation with the observed values, while the ENN model showed a significant divergence from the actual data.

Performance of models: second I/O combination

For the second I/O combination, the input parameters Eu and Fcu were used to train the models to predict Fc. During training, the ENN technique underperformed compared to the GTB and MARS models. However, the ENN model performed very well in testing, with very good accuracy (NNSE = 0.9096 and WI = 0.9681), whereas GTB gave lower accuracy in the testing phase with WI = 0.896, as shown in Table 5.

The violin and swarm plots (Figure 3) for the second I/O combination show that both GTB and MARS produced underestimated predictions, while ENN showed dual peaks almost matching the actual Fc plot, but with limits within 10–30 MPa. The scatter plots (Figure 4) likewise showed very good performance for the ENN model, with a coefficient of determination (R2) of 0.9006, whereas GTB and MARS did not fit well, with many widely scattered values and R2 below 0.66. The Taylor diagram (Figure 5) showed that the GTB and MARS models were closer to the red line, indicating their accuracy in prediction, while ENN gave a higher correlation with the observed values.

Performance of models: third I/O combination

In the third I/O combination, the input parameters Eu and Fcm were used to train the models and predict Fc. GTB performed well during training. However, in the testing phase, the MARS model, with a WI of 0.9725, outperformed both GTB and ENN, and most evaluation metrics confirmed its superior performance. ENN underperformed significantly for this combination in both training and testing. The violin and swarm plots (Figure 3) of GTB and MARS revealed similar data distributions, with slight variations for Fc values exceeding 35 MPa; in contrast, the ENN model exhibited a wider concentration of data with a leaner distribution plot. The scatter plots in Figure 4 corroborated these findings, with the R2 value for the ENN model below 0.7. The Taylor diagram (Figure 5) further illustrated that the GTB and MARS models possessed low standard deviations, with data clusters concentrated around the mean value, while the predicted data from the ENN model exhibited a high standard deviation.

Performance of models: fourth I/O combination

In the fourth I/O combination, where the input parameters Fcu and Fcm were used to predict Fc, the ENN underperformed compared to the GTB and MARS models during training. While the analysis indicated strong training performance for GTB, MARS outperformed all models in testing with a WI of 0.9911, suggesting lower errors and robust accuracy across most evaluation metrics. The swarm and violin plots (Figure 3) revealed a comparable distribution of predicted data across all models, with slightly tighter clustering of the data predicted by the ENN model; the MARS model generated Fc predictions closely resembling the actual values. The scatter plots (Figure 4) confirmed excellent performance for both GTB and MARS, with R2 values of 0.9572 and 0.9662, respectively. The fourth I/O combination not only achieved a high degree of correlation with the observed data but also exhibited the lowest deviation across all models, and all ML models produced the lowest error indices in this combination. The Taylor diagram (Figure 5) further confirms the models’ affinity toward the red line, indicating superior prediction of Fc around the observed values and a correlation coefficient of about 0.97.

Conclusions

This study investigated the use of advanced machine learning techniques to estimate the compressive strength of structural masonry walls. Three algorithms were employed: Gradient Tree Boosting (GTB), Elman Neural Networks (ENN), and Multivariate Adaptive Regression Splines (MARS). Data sourced from the literature included Young’s modulus of brick units (Eu), compressive strength of brick units (Fcu), Young’s modulus of mortar (Em), and compressive strength of mortar (Fcm). Different statistical indices, encompassing both error and efficiency measures, were computed to assess the effectiveness of each machine learning technique. By analyzing these indices, we aimed to identify the most suitable method for this specific application.

  • Overall, Multivariate Adaptive Regression Splines (MARS) emerged as the most robust and accurate model across various input/output (I/O) combinations, consistently demonstrating superior performance in the testing phase. MARS consistently achieved higher WI values, indicating a better fit to unseen data compared to Gradient Tree Boosting (GTB) and Elman Neural Networks (ENN).

  • Swarm and violin plots revealed MARS predictions closely resembling the actual Fc data, further confirmed by higher coefficient of determination (R2) in scatter plots.

  • Particularly in the fourth I/O combination, MARS achieved a remarkable WI of 0.9911, showcasing its ability to generalize well to unseen data. While GTB sometimes displayed competitive performance during training, its testing-phase results were often less consistent compared to MARS. ENN generally underperformed across all I/O combinations, indicating its limitations in this specific application.

Therefore, this study highlights MARS as the most effective machine learning technique for predicting Fc in masonry walls, demonstrating its capability to provide accurate and reliable predictions across various input scenarios.

Declaration of generative AI and AI-Assisted technologies in the writing process

‘During the preparation of this work, the authors used Google Gemini in order to draft some sentences in the introduction and method sections. After using this tool/service, the authors reviewed and edited the content as needed and take full responsibility for the content of the publication’.

List of abbreviations

Eu = Young’s modulus of brick units
Fcu = Compressive strength of brick units
Em = Young’s modulus of mortar
Fcm = Compressive strength of mortar
Fc = Compressive strength of the structural masonry wall
GTB = Gradient Tree Boosting
ENN = Elman Neural Network
MARS = Multivariate Adaptive Regression Splines
I/O = Input and output
RRMSE = Relative root mean square error
RMSE = Root mean square error
NNSE = Normalized Nash-Sutcliffe efficiency
NSE = Nash-Sutcliffe efficiency
MAE = Mean absolute error
KGE = Kling-Gupta efficiency
WI = Willmott index
R2 = Coefficient of determination
ML = Machine learning

Disclosure statement

No potential conflict of interest was reported by the author(s).

Data availability statement

This manuscript has no associated data. Data used for model development can be shared upon reasonable request.

References

  • Hendry EAW. Masonry walls: materials and construction. Constr Build Mater. 2001;15(8):323–330. doi: 10.1016/S0950-0618(01)00019-8
  • Murty CVR, Jain SK. Beneficial influence of masonry infill walls on seismic performance of RC frame buildings. In: 12th World Conference on Earthquake Engineering; 2000; Auckland, New Zealand. p. 1–6. [cited 2022 Apr 30]. Available from: https://www.iitk.ac.in/nicee/wcee/article/1790.pdf
  • Sarangapani G, Venkatarama Reddy BV, Jagadish KS. Brick-mortar bond and masonry compressive strength. J Mater Civ Eng. 2005;17(2):229–237. doi: 10.1061/(ASCE)0899-1561(2005)17:2(229)
  • Franzoni E, Gentilini C, Graziani G, et al. Compressive behaviour of brick masonry triplets in wet and dry conditions. Constr Build Mater. 2015;82:45–52. doi: 10.1016/j.conbuildmat.2015.02.052
  • Celano T, Argiento LU, Ceroni F, et al. In-plane behaviour of masonry walls: numerical analysis and design formulations. Materials. 2021;14(19):5780. doi: 10.3390/ma14195780
  • Abdulla KF, Cunningham LS, Gillie M. Simulating masonry wall behaviour using a simplified micro-model approach. Eng Struct. 2017;151:349–365. doi: 10.1016/j.engstruct.2017.08.021
  • Mohammed A, Hughes TG, Mustapha A. The effect of scale on the structural behaviour of masonry under compression. Constr Build Mater. 2011;25(1):303–307. doi: 10.1016/j.conbuildmat.2010.06.025
  • ACI 530/530.1-13. Building code requirements and specification for masonry structures. American Concrete Institute; 2013. p. 319.
  • Keshava M, Raghunath SR. Experimental investigations on axially and eccentrically loaded masonry walls. J Inst Eng India Ser A. 2017;98(4):449–459. doi: 10.1007/s40030-017-0222-2
  • Perez Gavilan JJ, Flores LE, Alcocer SM. An experimental study of confined masonry walls with varying aspect ratios. Earthq Spectra. 2015;31(2):945–968. doi: 10.1193/090712EQS284M
  • Kaushik HB, Rai DC, Jain SK. Stress-strain characteristics of clay brick masonry under uniaxial compression. J Mater Civ Eng. 2007;19(9):728–739. doi: 10.1061/(ASCE)0899-1561(2007)19:9(728)
  • Corradi M, Borri A, Vignoli A. Experimental study on the determination of strength of masonry walls. Constr Build Mater. 2003;17(5):325–337. doi: 10.1016/S0950-0618(03)00007-2
  • Doddamani D, Keshava M. AAC block masonry with ready mix mortar—an experimental and numerical analysis. In: Rao A, Ramanjaneyulu K, editors. Recent advances in structural engineering, Volume 1. Lecture Notes in Civil Engineering, vol. 11. Singapore: Springer; 2019. p. 681–692. doi: 10.1007/978-981-13-0362-3_55
  • Reboul N, Mesticou Z, Si Larbi A, et al. Experimental study of the in-plane cyclic behaviour of masonry walls strengthened by composite materials. Constr Build Mater. 2018;164:70–83. doi: 10.1016/j.conbuildmat.2017.12.215
  • Belghiat C, Messabhia A, Plassiard J-P, et al. Experimental study of double-panel confined masonry walls under lateral loading. J Buil Eng. 2018;20:531–543. doi: 10.1016/j.jobe.2018.09.001
  • James J, Pandian PK, Deepika K, et al. Cement stabilized soil blocks admixed with Sugarcane Bagasse Ash. JF Eng. 2016;2016:1–9. doi: 10.1155/2016/7940239
  • Sajanthan K, Balagasan B, Sathiparan N. Prediction of compressive strength of stabilized earth block masonry. Adv Civil Eng. 2019;2019:1–13. doi: 10.1155/2019/2072430
  • Lulić L, Lukačević I, Skejić D, et al. Assessment of existing masonry resistance using partial factors approaches and field measurements. Buildings. 2023;13(11):2790. doi: 10.3390/buildings13112790
  • Drougkas A, Roca P, Molins C. Numerical prediction of the behavior, strength and elasticity of masonry in compression. Eng Struct. 2015;90:15–28. doi: 10.1016/j.engstruct.2015.02.011
  • Prakash PR, Azenha M, Pereira JM, et al. Finite element based micro modelling of masonry walls subjected to fire exposure: framework validation and structural implications. Eng Struct. 2020;213:110545. doi: 10.1016/j.engstruct.2020.110545
  • Srinivas V, Sasmal S. Experimental and numerical studies on ultimate load behaviour of brick masonry. J Inst Eng India Ser A. 2016;97(2):93–104. doi: 10.1007/s40030-016-0152-4
  • D’Altri AM, Sarhosis V, Milani G, et al. Modeling strategies for the computational analysis of unreinforced masonry structures: review and classification. Arch Computat Methods Eng. 2020;27(4):1153–1185. doi: 10.1007/s11831-019-09351-x
  • Keshava M. Behaviour of masonry under axial, eccentric and lateral loading [PhD thesis]. Belagavi: VTU; 2012.
  • Pekmezci BY, Polat Pekmezci I. Development of shear strength index test probe: its application on historic structures. Int J Build Pathol Adapt. 2022;40(5):693–711. doi: 10.1108/IJBPA-10-2020-0089
  • Friedman JH. Stochastic gradient boosting. Comput Stat Data Anal. 2002;38(4):367–378. doi: 10.1016/S0167-9473(01)00065-2
  • Natekin A, Knoll A. Gradient boosting machines, a tutorial. Front Neurorobot. 2013;7:21. doi: 10.3389/fnbot.2013.00021
  • Elman JL. Finding structure in time. Cognit Sci. 1990;14(2):179–211. doi: 10.1207/s15516709cog1402_1
  • Guanghua R, Cao Y, Wen S, et al. A modified Elman neural network with a new learning rate scheme. Neurocomputing. 2018;286:11–18. doi: 10.1016/j.neucom.2018.01.046
  • Friedman JH. Multivariate adaptive regression splines. Ann Stat. 1991;19(1):1–67. doi: 10.1214/aos/1176347963
  • Hastie T, Tibshirani R, Friedman J. The elements of statistical learning. Springer series in statistics. 2nd ed. New York (NY): Springer; 2009. doi: 10.1007/978-0-387-84858-7