
New intelligent particle swarm optimization algorithm with extreme learning machine for forecasting Pattavia pineapple productivity: case study of Loei and Nong Khai provinces in Thailand

Article: 2316458 | Received 05 Oct 2023, Accepted 05 Feb 2024, Published online: 18 Feb 2024

Abstract

This research designs and develops a software innovation for Pattavia pineapple cultivation and productivity distribution planning to increase income for farmers. This research formulates and introduces an innovative machine learning (ML) model, called a new intelligent particle swarm optimization algorithm with an extreme learning machine (NIPSO-ELM), to forecast Pattavia pineapple productivity with a notable degree of precision and dependability. In this work, an artificial neural network (ANN) and the standard ELM were built and assessed for their ability to forecast the productivity of Pattavia pineapples. The findings indicate that the ELM neural network is an innovative model characterized by its straightforward architecture and exceptional performance. Moreover, the utilization of particle swarm optimization (PSO), ant colony optimization (ACO) and the NIPSO algorithms significantly enhanced ELM performance when forecasting the productivity of Pattavia pineapples. The NIPSO-ELM model emerged as the most optimal ML model for accurately, reliably and stably forecasting the productivity of Pattavia pineapples in practical scenarios. The most optimal NIPSO-ELM model for Loei province, Thailand exhibits the following performance metrics: RMSE = 304.36389, MAE = 243.29531, MAPE = 0.03753 and MASE = 0.93157. The most optimal NIPSO-ELM model for Nong Khai province, Thailand exhibits: RMSE = 304.57352, MAE = 244.67834, MAPE = 0.03756 and MASE = 0.93296, respectively.

1. Introduction

Pineapple in Thailand is considered an important economic crop. Pineapple generates an income for the country of approximately 23,000–25,000 million baht per year. The essential export products are canned pineapple and pineapple juice, accounting for 45% of the value of processed fruit product exports. Thailand is the world’s number one exporter of canned pineapples, with a market share of approximately 50%. The key export markets include the European Union, the United States, Japan and the Middle East. In addition, the pineapple industry is significant for the economy at both the regional and farm levels. As stated by the Department of Agriculture, the national pineapple policy and development aims to promote and develop the pineapple industry. Among the pineapple varieties farmers grew, the most popular variety for industrial processing is the Pattavia pineapple. It has a dense texture and a moderately sweet or delightful taste, making it suitable for general cultivation. This variety belongs to the Smooth Cayenne group, which tastes mildly sweet and sour. It is known by various names such as Sriracha pineapple, Tadam pineapple, Red Eye pineapple, Galagattha pineapple or Pran Buri pineapple. Pineapple cultivation occurs in all regions of Thailand, with a high concentration in the central region, followed by other regions. The northeastern region also cultivates pineapples, although it has yet to receive official promotion to a significant extent. The four provinces in the northeastern region where pineapple cultivation is prevalent are Nong Khai, Chaiyaphum, Nakhon Phanom and Loei. The preferred variety in this region is the Pattavia pineapple. Based on available harvest data, it is found that Loei Province has the largest cultivation area, followed by Nong Khai, Nakhon Phanom and Chaiyaphum provinces, respectively. This variation in pineapple production is due to the different topography of the regions and variations in the production process and production costs among farmers in each province, resulting in different quantities of harvested produce.

According to the Pineapple National Strategy for 2017–2026, set by the National Pineapple Policy and Development Committee, Thailand’s pineapple industry can be a significant producer and exporter of pineapple products globally. In order to maintain leadership in production, processing, and marketing, it is necessary to address weaknesses and limitations in various aspects to compete with other rivals. This includes accelerating the development of strengths and creating opportunities to enhance competitiveness. The objectives are as follows: (1) Increase the efficiency of high-quality pineapple production and reduce production costs, (2) Enhance competitiveness in pineapple exports and maintain Thailand’s position as the world’s top exporter, (3) Maintain price stability and ensure the quality standard of pineapple products and (4) Promote sustainability in the livelihoods of pineapple farmers and processing factories. Next, the strategy consists of four main components: production strategy, processing strategy, marketing strategy and management strategy. The production strategy is the first and urgent strategy to be implemented. For example, the following measures are included: (1) Promote pineapple production in suitable areas, including both factories and fresh consumption, based on the Agricultural Map for Adaptive Management (Agri-Map), (2) Encourage pineapple production based on good agricultural practices (GAP) among farmers, adopting large-scale farming systems. Additionally, land consolidation is encouraged to reduce production costs and improve the quality of products in line with market demand, and (3) Facilitate knowledge transfer on pineapple cultivation to farmers, enabling them to become professional producers.

Therefore, if pineapple cultivation is managed to ensure an appropriate quantity of pineapples in each period, in line with the needs of industrial factories, then assistance in planning pineapple cultivation and productivity distribution to increase income for farmers would be beneficial economically, socially and industrially. It is another avenue to achieve the objectives of the pineapple strategic agenda set by the National Pineapple Policy and Development Committee. Suppose farmers have reasonable planning assistance from the beginning of production, aligned with the productivity strategies and the urgent roadmap of the pineapple strategy for 2017–2026. In that case, it will promote pineapple productivity according to GAP among farmers in the form of large-scale agricultural promotion systems, including agro-industrial zones, helping farmers reduce production costs and increase the quality of produce in line with market demands. This will have positive effects on farmers themselves and on the sustainable future of agriculture in the country. Currently, the National Pineapple Policy and Development Committee has adjusted the pineapple strategy for 2017–2026 in some aspects to align with the national strategy, using the principles of pineapple productivity management consistent with the policy. This involves collaboration between the agencies under the National Pineapple Policy and Development Committee and the committee responsible for resolving agricultural issues arising from provincial-level agricultural productivity. They have compiled data on productivity, marketing and pineapple productivity management plans during regular periods and periods when there is a large volume of pineapple productivity for the market.

The overall issue mentioned above is the basis for this research: the design and development of software innovation for Pattavia pineapple cultivation and productivity distribution planning to increase income for farmers. Secondary data surveys revealed that farmers need better planning and management in Pattavia pineapple cultivation and productivity distribution. As a result, there are periods of Pattavia pineapple surplus due to simultaneous harvests, resulting in low prices because of excessive supply in the market. Therefore, to efficiently plan productivity, it is necessary to apply technology and innovation to manage Pattavia pineapple cultivation and productivity distribution. At the same time, Pattavia pineapples must maintain quality and standards that align with market and factory requirements. This research begins with studying and collecting data on Pattavia pineapple cultivation and productivity distribution channels for the real-world problem. Then, computer programs are developed to calculate the most suitable solutions for Pattavia pineapple productivity distribution planning using an extreme learning machine (ELM) combined with metaheuristic methods.

In this study, the dataset involves ten years of Pattavia pineapple productivity for the Loei and Nong Khai provinces in Thailand. The time-series data of the relevant parameters were also collected for forecasting Pattavia pineapple productivity in Loei province and Nong Khai province in Thailand. The dataset is available from the Office of Agricultural Economics in Thailand. This research formulated and introduced an innovative machine learning (ML) model, called a new intelligent particle swarm optimization (PSO) algorithm with an extreme learning machine (NIPSO-ELM), to forecast Pattavia pineapple productivity with a notable degree of precision and dependability. In this work, the standard ELM was built and assessed for its ability to forecast the productivity of Pattavia pineapples. The findings indicate that the ELM neural network is an innovative model characterized by its straightforward architecture and exceptional performance. Moreover, the utilization of PSO, ant colony optimization (ACO) and NIPSO algorithms significantly enhanced ELM performance when forecasting the productivity of Pattavia pineapples. The NIPSO-ELM model emerged as the most optimal ML model for forecasting the productivity of Pattavia pineapples in practical scenarios. This research uses MATLAB R2022a to test the forecasting of Pattavia pineapple productivity. Government agencies can use the above innovation to guide the management of Pattavia pineapple cultivation among farmers and as data for agricultural planning. Loei and Nong Khai provinces in Thailand, which have the highest Pattavia pineapple productivity in the northeastern region, are suitable areas in which to build a model for this research. Upon completion of the research project, it will provide new tools and innovations for efficient Pattavia pineapple cultivation planning and enable their application to other pineapple types and different areas, as well as other agricultural products in the future.

The present article has been structured in the following manner: Section 2 provides a comprehensive analysis of the existing literature. Section 3 provides a comprehensive overview of the methods employed to elucidate the suggested ELM, metaheuristic approach, and a new method in this study. Section 4 of the manuscript presents the computational experiment and its corresponding results, while Section 5 provides a comprehensive conclusion to the research.

2. Literature review

The utilization of artificial intelligence (AI) in prediction has shown significant growth in recent years. Various domains of application and research have witnessed notable progress in the field of AI, as documented by several studies (Zhang et al., Citation2021; Yu et al., Citation2022; Yan et al., Citation2020; Wang, Luo, et al., Citation2017; Wullapa & Suntaree, Citation2020). The expansion of AI applications can be classified into three primary categories. (1) The ability to accurately identify inherent data patterns with limited or no reliance on domain-specific knowledge input. (2) The availability of data and hardware in a mutually iterative process with AI; the practicality and value of AI, particularly deep learning models, are enhanced by utilizing high-performance hardware and accumulating massive data collections. As a result, a more significant amount of data is accumulated, and more advanced technology is created in order to facilitate the investigation of the untapped potential of AI. (3) The democratization of AI technology, which involves multiple facets, such as the accessibility of educational resources, open-source programming packages and deployment platforms. The integration of these three elements has played a substantial role in the present expansion of AI applications, as supported by several studies (Mayuree & Wullapa, Citation2014; Ke et al., Citation2021; Yu et al., Citation2021).

ELMs represent a distinct category of artificial neural networks (ANNs). Huang et al. (Citation2006) designed a fast ML model consisting of a single-layer feedforward neural network, called the ELM, that is computationally far more efficient (Yaseen et al., Citation2019). It consists of a single hidden layer configuration and employs a random initialization of the parameters in the input layer, including weights and biases. This approach was first described by Huang et al. in their research (Huang et al., Citation2004). In this manner, the parameters of the output layer can be readily computed by minimizing the output error of the ANN. ELMs have been employed for forecasting, as documented in the literature (Van Heeswijk et al., Citation2009). Furthermore, the interconnection of these layers is intricately associated with constructing a feedforward neural network with a solitary hidden layer. Random weights are assigned to the input layer of the ELM model, whereas the weights of the output layer are computed with linear algebra as a predetermined training approach (Ertugrul, Citation2016; Ye & Qin, Citation2015). Consequently, the training phase exhibits a remarkably rapid learning speed and a substantial generalization capacity (Huang, Wang, et al., Citation2011; Huang, Zhou, et al., Citation2012).

Due to its non-iterative tuning approach, the ELM model demonstrates a shorter training time in the hidden layer than other ML models. To attain desirable performance and expedite convergence, it is imperative for ELMs to have a substantial number of neurons in the hidden layer. A significant differentiation between ELMs and traditional ML models is the utilization of randomly assigned input weight and bias values in ELMs, as opposed to the gradient-descent-based methods typically employed in conventional ML models. These values remain unaltered during training. This methodology effectively addresses various common challenges typically encountered in gradient descent methods, including the requirement for iterative modification of weight and bias values, vulnerability to becoming trapped in local minima, and suboptimal convergence speed. Nevertheless, the determination of the ideal number of neurons for the hidden layer in ELMs remains a topic of ongoing investigation. Various optimization strategies can be employed to optimize the ELM model in data mining (Kochenderfer & Wheeler, Citation2019). Nonetheless, their respective functions exhibit similarities, namely in optimizing the ELM parameters, although their performance may exhibit subtle variations.

The elementary ELM exhibits variations in its generalization capacity due to randomly generated input weights and hidden layer thresholds. Nevertheless, the predictive capability of the ELM can be significantly improved by incorporating forecasting with a regularization parameter. Swarm intelligence (SI) is a widely recognized metaheuristic that uses AI techniques to address optimization problems. It draws inspiration from the natural behavior of swarms (Raslan et al., Citation2020). Several SI algorithms have been created over time, such as ACO (Dorigo et al., Citation2010), PSO (Kennedy & Eberhart, Citation1995), artificial bee colony (ABC) (Karaboga & Basturk, Citation2008), the firefly algorithm (FA) (Yang, Citation2009) and cuckoo search (CS) (Gandomi et al., Citation2013). The ACO feature selection process, introduced by Dorigo and Di Caro (Citation1999), has been extensively employed in many applications (Mullen et al., Citation2009; Sweetlin et al., Citation2017; Cordon et al., Citation2002; Singh et al., Citation2012). In the ACO process, a parameter called pheromone is assigned to predictor stations, which initially classifies these predictors with the target/test stations. The trail pheromone value determines the likelihood of selecting the training station over the test station. The pheromone magnitude is modified as the training stations are navigated, enhancing the probability for future ants to choose the optimal station. For more comprehensive information on the ACO technique, readers can refer to the literature of Kumar et al. (Citation2020) and Paniri et al. (Citation2020). The PSO algorithm, introduced by Kennedy and Eberhart (Citation1995), has recently attracted considerable academic attention and practical application. The primary reason is the algorithm's clear and concise formulation, easy-to-understand nature and minimal need for parameter adjustment (Han & Liu, Citation2014; Pare et al., Citation2019; Tran et al., Citation2016). The PSO method has robust adaptability in tackling various optimization problems, encompassing single-objective and multi-objective scenarios (Li et al., Citation2017; Lv et al., Citation2019). The broad application of this versatile approach in several real-world optimization problems, such as the optimization of neural networks, has been documented by Wang et al. (Citation2018) and Han and Zhu (Citation2011). However, PSO has disadvantages, such as premature convergence and vulnerability to getting trapped in local minima (Li et al., Citation2015). This is especially apparent when dealing with problems with many dimensions (Han & Liu, Citation2015). Considerable academic investigation has been carried out on this issue, suggesting many variants to improve the effectiveness of the PSO technique.

Several researchers have combined metaheuristic algorithms with AI models for forecasting and optimization (Alameer et al., Citation2019; Lalwani et al., Citation2019; Luan et al., Citation2019). Consequently, two innovative hybrid AI models, GA-ELM and PSO-ELM, have been created and suggested for forecasting. An analysis of the published articles reveals that the GA-ELM and PSO-ELM models were examined and suggested for regression problems by Chu et al. (Citation2018), Figueiredo and Ludermir (Citation2014), Pahuja and Nagabhushan (Citation2016) and Wang, Wang, et al. (Citation2017). A literature review revealed that Pattavia pineapple productivity has not been forecast using the ELM. Moreover, hybrid models for Pattavia pineapple productivity forecasting that combine optimization methods and the ELM model have yet to be studied. Various optimization techniques have been developed in data mining and can be utilized to optimize the ELM model (Kochenderfer & Wheeler, Citation2019). Nonetheless, their respective functions exhibit similarity in optimizing the parameters of the ELM model, but with potential variations in their overall performance. Thus, following the suggestions of earlier researchers (Lalwani et al., Citation2019; Eabsirimatee et al., Citation2016), PSO was chosen as the standard optimization algorithm for the ELM model in this study. Consequently, this study introduces an innovative hybrid AI model, namely PSO-ELM, which has been devised and suggested to predict Pattavia pineapple productivity. The available literature reveals that PSO-ELM models have been explored and suggested for forecasting problems (Chu et al., Citation2018; Figueiredo & Ludermir, Citation2014). However, the Pattavia pineapple productivity forecasts from the standard ELM were unstable, and its accuracy can be increased by reducing overfitting of the ELM model. In this research, metaheuristic optimization combined with the ELM is proposed to increase accuracy and reduce the overfitting of the forecasting models. However, more development and proposals for forecasting pineapple productivity are still needed, and the utilization of the ELM model for this purpose has yet to be accomplished thus far. Moreover, it is noteworthy to emphasize that the existing models proposed for other situations are not applicable for forecasting Pattavia pineapple productivity due to the distinct variables and dataset characteristics involved. Hence, the PSO-ELM model employed in this study for forecasting Pattavia pineapple productivity is considered innovative. Furthermore, the conventional ELM model was also considered and refined to forecast Pattavia pineapple productivity. This model was compared with the ELM, PSO-ELM, ACO-ELM and newly proposed models.

3. Methodology

This study used AI models, namely ELM, PSO-ELM, ACO-ELM and the new model, to forecast Pattavia pineapple productivity. The ANN has been extensively discussed in various scholarly publications and literary sources (Bui et al., Citation2021; Livingstone, Citation2008; Zhang, Nguyen, Bui, Nguyen-Thoi, et al., Citation2020; Zhang, Nguyen, Bui, Le, et al., Citation2020). Consequently, these elements are omitted from the present section. This study aims to introduce innovative hybrid models, specifically PSO-ELM, for the prediction of Pattavia pineapple productivity. Consequently, this section will provide an overview of the principles of ELM and PSO, as well as the framework of the newly proposed models. The newly proposed model is called the NIPSO-ELM.

3.1. Extreme learning machine

The ELM is an ML algorithm that is classified within the category of feedforward neural networks. The purpose of this system is to facilitate quick and expedient training of neural networks, with a specific focus on supervised learning tasks such as classification and regression. The introduction of the ELM was initially documented by Huang et al. (Citation2004), Huang, Wang, et al. (Citation2011), Huang, Zhou, et al. (Citation2012) and Babri et al. (Citation2000). The ELM (Sattar et al., Citation2019) has been proposed as a solution to address the limitations of the backpropagation (BP) algorithm in training ANN models. While the BP algorithm is widely recognized for its effectiveness in determining the weights and biases of ANN models through tuning, it suffers from drawbacks such as poor learning speed and a tendency to converge to local minima. The ELM algorithm offers a potential remedy to these issues. The task of designing the architecture of an ANN is a significant challenge, particularly in relation to the arrangement of hidden layers and neurons inside the network (Nguyen et al., Citation2018, Citation2019). In contrast, the ELM model is characterized by a single hidden layer within its architectural framework (Figure 1). The interconnections between these layers are intricately associated with the construction of a feedforward neural network with a solitary hidden layer. Random weights are assigned to the input layer of the ELM model, whereas the weights of the output layer are computed using a predetermined training technique from linear algebra (Ertugrul, Citation2016; Ye & Qin, Citation2015). Consequently, the training phase exhibits a rapid learning rate and possesses a substantial capability for generalization (Huang et al., Citation2006; Huang, Wang, et al., Citation2011; Huang, Zhou, et al., Citation2012). The output of the ELM network is ultimately computed as follows:

$$y_{\mathrm{pred}}=\sum_{k=1}^{m}\delta_{k}\,f_{a}\!\left(\sum_{i=1}^{n}w_{i,k}\,x_{i}+b_{k}\right) \quad (1)$$

Figure 1. An ELM model for forecasting Pattavia pineapple productivity.


The variable $y_{\mathrm{pred}}$ represents the output of the network, specifically the Pattavia pineapple productivity. The variable $x$ represents the inputs of the model. The variables $n$ and $m$ represent the number of input variables and the number of neurons in the hidden layer, respectively. The weight of the kth neuron in the output layer is denoted as $\delta_k$. The weight between the ith input and the kth neuron in the hidden layer is denoted as $w_{i,k}$, whereas $b_k$ represents the biases of the hidden neurons. The activation function utilized in the ELM is denoted by the symbol $f_a$.
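To make the ELM training step concrete, the following is a minimal NumPy sketch of a single-hidden-layer ELM of the kind described above. The paper's experiments were carried out in MATLAB R2022a; this fragment is purely illustrative, and the tanh activation, the [-1, 1] weight range and the function names are assumptions rather than the authors' exact settings.

```python
import numpy as np

def train_elm(X, y, m, rng=None):
    """Train a single-hidden-layer ELM: random input weights/biases, output
    weights delta solved by least squares (Moore-Penrose pseudo-inverse)."""
    rng = np.random.default_rng(0) if rng is None else rng
    n = X.shape[1]                           # number of input variables
    W = rng.uniform(-1.0, 1.0, size=(n, m))  # random input-to-hidden weights w_{i,k}
    b = rng.uniform(-1.0, 1.0, size=m)       # random hidden biases b_k
    H = np.tanh(X @ W + b)                   # hidden-layer outputs, activation f_a
    delta = np.linalg.pinv(H) @ y            # output weights delta_k
    return W, b, delta

def predict_elm(X, W, b, delta):
    """Equation (1): y_pred = sum_k delta_k * f_a(sum_i w_{i,k} x_i + b_k)."""
    return np.tanh(X @ W + b) @ delta
```

For the 9-27-1 structure adopted later in the paper, m would be 27 and X would hold the nine normalized inputs.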

3.2. Particle swarm optimization

PSO is an algorithm based on swarm theory that was introduced by Kennedy and Eberhart (Citation1995). The development of this concept draws inspiration from the social communication and interaction observed in swarms, such as flocks of birds and schools of fish. Consequently, the collective behavior of sharing information within a group is employed to solve practical optimization problems. Every swarm member functions as a particle, moving inside the search space to locate food sources. This movement is executed with a predetermined velocity range. The velocity of individuals can be consistently adjusted to facilitate the sharing of their experiences, with the ultimate goal of attaining more favorable positions. Multiple iterations can be employed to enhance the precision of the search process. During each iteration, the local best position, corresponding to the best fitness value, is recorded in the PSO algorithm. Furthermore, the optimal positions obtained at each iteration are documented as the global best of the PSO algorithm (Zhang et al., Citation2019; Zhang, Nguyen, Bui, Nguyen-Thoi, et al., Citation2020; Zhang, Nguyen, Bui, Le, et al., Citation2020). Ultimately, the optimal position is the most favorable choice for the collective group. The pseudocode and the specifics of the computation for the PSO algorithm are illustrated in Algorithm 1.

Algorithm 1

PSO

  • 1: for each particle i do

  • 2:  for each dimension d do

  • 3:   Initialize position xid randomly within permissible range

  • 4:   Initialize velocity vid randomly within permissible range

  • 5:  end for

  • 6: end for

  • 7: Iteration k = 1

  • 8: do

  • 9: for each particle i do

  • 10:  Calculate fitness value

  • 11:  if the fitness value is better than pbestid in history then

  • 12:   Set current fitness value as the pbestid

  • 13:  end if

  • 14: end for

  • 15: Choose the particle having the best fitness value as the gbestid

  • 16: for each particle i do

  • 17:  for each dimension d do

  • 18:   Calculate velocity according to the equation

  • 19:   $v_j(i+1)=w\,v_j(i)+c_1 r_1\left(\mathrm{localbest}_j-x_j(i)\right)+c_2 r_2\left(\mathrm{globalbest}_j-x_j(i)\right),\quad v_{\min}\le v_j(i)\le v_{\max}$

  • 20:   Update particle position according to the equation

  • 21:   $x_j(i+1)=x_j(i)+v_j(i+1);\quad j=1,2,3,\ldots,n$

  • 22:  end for

  • 23: end for

  • 24: k=k+1

  • 25: while maximum iterations or minimum error criteria are not attained
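For readers who prefer code to pseudocode, below is a minimal NumPy sketch of Algorithm 1. The bounds, swarm size and objective function are placeholders; only the velocity and position update rules follow the equations in lines 19 and 21, and this is not the authors' implementation.

```python
import numpy as np

def pso(objective, dim, n_particles=30, iters=100, w=0.85, c1=1.2, c2=1.2,
        xmin=-1.0, xmax=1.0, vmax=2.0, rng=None):
    """Minimise `objective` using the update rules of Algorithm 1."""
    rng = np.random.default_rng(0) if rng is None else rng
    x = rng.uniform(xmin, xmax, size=(n_particles, dim))    # positions
    v = rng.uniform(-vmax, vmax, size=(n_particles, dim))   # velocities
    pbest = x.copy()                                        # local best positions
    pbest_val = np.array([objective(p) for p in x])
    gbest = pbest[np.argmin(pbest_val)].copy()              # global best position

    for _ in range(iters):
        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        v = np.clip(v, -vmax, vmax)          # enforce the velocity limits
        x = np.clip(x + v, xmin, xmax)       # position update
        vals = np.array([objective(p) for p in x])
        improved = vals < pbest_val          # update local bests
        pbest[improved] = x[improved]
        pbest_val[improved] = vals[improved]
        gbest = pbest[np.argmin(pbest_val)].copy()
    return gbest, pbest_val.min()
```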

3.3. Ant colony optimization (ACO)

The ACO feature selection process, introduced by Dorigo and Di Caro (Citation1999), has been extensively employed in many applications (Mullen et al., Citation2009; Sweetlin et al., Citation2017; Cordon et al., Citation2002; Singh et al., Citation2012). In the ACO process, a parameter called pheromone is assigned to predictor stations, which initially classifies these predictors with the target/test stations. The trail pheromone value determines the likelihood of selecting the training station over the test station. The pheromone magnitude is modified as the training stations are navigated, enhancing the probability for future ants to choose the optimal station. For more comprehensive information on the ACO technique, readers can refer to the literature of Kumar et al. (Citation2020) and Paniri et al. (Citation2020).

3.4. PSO-ELM and the new model

The primary objective of this study is to present innovative intelligence models that utilize ML and optimization techniques to forecast Pattavia pineapple productivity accurately. The fundamental objective of this study is to combine the ELM model and the PSO algorithm, specifically referred to as the PSO-ELM model, with the aim of generating unique models. In order to achieve this objective, a dataset including 70% of the Pattavia pineapple productivity database was utilized to construct an initial ELM model. Following this, the enhanced PSO methods were employed to determine and compute the weights of the ELM model, as opposed to the random weights typically utilized in the conventional ELM model. The researchers employed the mean square error (MSE) as the chosen objective function to assess the effectiveness of the PSO-ELM and the NIPSO-ELM. The NIPSO algorithm is a modified version of the original PSO algorithm, as shown in Algorithm 2. The key distinction of NIPSO lies in its approach, which differs from the conventional PSO algorithm. The NIPSO employs mutation breeding, an agricultural technology extensively utilized to develop novel plant varieties; by the end of 2009, 3088 mutant plant varieties had been introduced (Bradshaw, Citation2016). A mutation breeding procedure is periodically executed to update the parameters, where the number of particles is denoted by n, the number of dimensions by d, the mutation probability by pm and the cycle of the mutation breeding operation by cm. The ELM, PSO-ELM and NIPSO-ELM models that exhibit the lowest MSE values are considered the most optimal. The proposed framework of this technique is illustrated in Figure 2.
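As a rough illustration of how the PSO (or NIPSO) search can drive the ELM weights, the sketch below encodes the hidden-layer weights and biases in a single particle vector and scores each particle by the in-sample MSE, the objective function named above. The function name, the tanh activation and the flat encoding are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def make_mse_objective(X_train, y_train, m):
    """Fitness used to drive PSO/NIPSO: in-sample MSE of an ELM whose hidden
    weights and biases are taken from the particle instead of being random."""
    n = X_train.shape[1]

    def objective(particle):
        W = particle[: n * m].reshape(n, m)        # hidden weights w_{i,k}
        b = particle[n * m : n * m + m]            # hidden biases b_k
        H = np.tanh(X_train @ W + b)               # hidden-layer outputs
        delta = np.linalg.pinv(H) @ y_train        # output weights (least squares)
        y_hat = H @ delta
        return np.mean((y_train - y_hat) ** 2)     # MSE objective

    return objective

# Hypothetical usage with the pso() sketch given after Algorithm 1:
# obj = make_mse_objective(X_train, y_train, m=27)
# best_particle, best_mse = pso(obj, dim=X_train.shape[1] * 27 + 27)
```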

Figure 2. The framework of the optimization models for forecasting Pattavia pineapple productivity.


Algorithm 2

NIPSO

  • 1: for each particle i do

  • 2:  for each dimension d do

  • 3:   Initialize position xid randomly within permissible range

  • 4:   Initialize velocity vid randomly within permissible range

  • 5:  end for

  • 6: end for

  • 7: Iteration k = 1

  • 8: do

  • 9: for each particle i do

  • 10:  Calculate fitness value

  • 11:  if the fitness value is better than pbestid in history then

  • 12:   Set current fitness value as the pbestid

  • 13:  end if

  • 14: end for

  • 15: Choose the particle having the best fitness value as the gbestid

  • 16: for each particle i do

  • 17:  for each dimension d do

  • 18:   Calculate velocity according to the equation

  • 19:   $v_j(i+1)=w\,v_j(i)+c_1 r_1\left(\mathrm{localbest}_j-x_j(i)\right)+c_2 r_2\left(\mathrm{globalbest}_j-x_j(i)\right),\quad v_{\min}\le v_j(i)\le v_{\max}$

  • 20:   Update particle position according to the equation

  • 21:   $x_j(i+1)=x_j(i)+v_j(i+1);\quad j=1,2,3,\ldots,n$

  • 22:   if mutation breeding operation was performed at last iteration then

  • 23:    localbestj=xj(i)

  • 24:   else

  • 25:    localbestj= the best position between xj(i) and localbestj

  • 26:  end if

  • 27: end for

  • 28: end for

  • 29: if mutation breeding operation was performed at last iteration then

  • 30:  globalbestj= the best position in x

  • 31: else

  • 32:  globalbestj= the best position between globalbestj and the best position in x

  • 33: end if

  • 34: if mod(k, cm) = 0 then

  • 35:  for each particle, i = 1 to n do

  • 36:   localbestj = globalbestj

  • 37:  end for

  • 38:  for each particle, i = 1 to n do

  • 39:   for each dimension, j = 1 to n do

  • 40:    r = a random number in [0, 1]

  • 41:    if r ≤ pm then

  • 42:     localbestj(i)=a random value in the jth dimension range

  • 43:    end if

  • 44:   end for

  • 45:  end for

  • 46: end if

  • 47: k=k+1

  • 48: while maximum iterations or minimum error criteria are not attained
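The distinguishing step of NIPSO is the periodic mutation-breeding reset in lines 34–46 of Algorithm 2. The fragment below is an illustrative NumPy sketch of that single step, under the assumption that it is triggered every cm iterations; the function and variable names are not from the paper.

```python
import numpy as np

def mutation_breeding_reset(gbest, n_particles, lower, upper, pm=0.01, rng=None):
    """NIPSO mutation-breeding step (Algorithm 2, lines 34-46): all local bests are
    set to the global best, then each dimension is replaced, with probability pm,
    by a random value drawn from its permissible range."""
    rng = np.random.default_rng(0) if rng is None else rng
    dim = np.size(gbest)
    pbest = np.tile(gbest, (n_particles, 1))            # localbest_j = globalbest_j
    mutate = rng.random((n_particles, dim)) <= pm       # r <= pm per particle/dimension
    random_vals = rng.uniform(lower, upper, size=(n_particles, dim))
    return np.where(mutate, random_vals, pbest)
```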

3.5. Evaluating forecasting models

The fundamental approach to forecasting can be outlined in five steps (Cordon et al., Citation2002). First, the forecasting problem is delineated; the objective of this stage is to analyze the problem and the variables that can impact the forecasting result. Second, the pertinent data required for forecasting are acquired and analyzed, and the anticipated outcome is determined (Singh et al., Citation2012). Third, an exploratory analysis of the overall forecasting task is performed to examine the coherence of the data, ensuring that they can be used without interference or missing elements, such as data imbalance or inconsistency (Kumar et al., Citation2020) and missing values (Paniri et al., Citation2020). Fourth, appropriate models are selected; this step is crucial in this research, which aims to present a novel model that enhances forecasting accuracy. Finally, the forecasting models are assessed to examine and appraise the empirical outcomes derived from all models and to choose the most suitable model for forecasting. The performance of all models (Kennedy & Eberhart, Citation1995) can be assessed by evaluating error metrics such as the mean absolute error (MAE), mean absolute percentage error (MAPE), root MSE (RMSE) and other similar measures.

In the present research, the ELM, PSO-ELM and NIPSO-ELM models are evaluated through a variety of statistical criteria, as shown in Eqs. (2)–(5), including MAE, mean absolute scaled error (MASE), MAPE and RMSE.

$$\mathrm{MAE}=\frac{1}{n}\sum_{i=1}^{n}\left|y_p-\hat{y}_p\right| \quad (2)$$

$$\mathrm{MASE}=\frac{\frac{1}{n}\sum_{i=1}^{n}\left|y_p-\hat{y}_p\right|}{\frac{1}{n-1}\sum_{i=2}^{n}\left|y_p-y_{p-1}\right|} \quad (3)$$

$$\mathrm{MAPE}=\frac{100\%}{n}\sum_{i=1}^{n}\left|\frac{y_p-\hat{y}_p}{y_p}\right| \quad (4)$$

$$\mathrm{RMSE}=\sqrt{\frac{1}{n}\sum_{i=1}^{n}\left(y_p-\hat{y}_p\right)^{2}} \quad (5)$$

Among these statistical criteria, MAE is a standard metric used to evaluate the accuracy of a forecasting model; MASE measures the effectiveness of forecasts generated by an algorithm by comparing the predictions with the output of a naive forecasting approach; MAPE is a statistical measure of the accuracy of an ML algorithm on a particular dataset; and RMSE is the square root of the mean of the squared errors. Here, n is the number of samples of Pattavia pineapple productivity (tons), and $y_p$ and $\hat{y}_p$ are the actual and forecast values of Pattavia pineapple productivity (tons), respectively.
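The sketch below computes the four error metrics of Eqs. (2)–(5) in NumPy. It is illustrative only: MASE is scaled by the one-step naive forecast error, which is an assumption about the exact variant used in the paper, and MAPE is returned as a fraction, matching the magnitudes reported in the abstract.

```python
import numpy as np

def forecast_errors(y_true, y_pred):
    """Error metrics of Eqs. (2)-(5) for a single forecast series."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    err = y_true - y_pred
    mae = np.mean(np.abs(err))                       # Eq. (2)
    naive_mae = np.mean(np.abs(np.diff(y_true)))     # (1/(n-1)) * sum |y_t - y_{t-1}|
    mase = mae / naive_mae                           # Eq. (3)
    mape = np.mean(np.abs(err / y_true))             # Eq. (4), multiply by 100 for percent
    rmse = np.sqrt(np.mean(err ** 2))                # Eq. (5)
    return {"MAE": mae, "MASE": mase, "MAPE": mape, "RMSE": rmse}
```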

4. Computational experiment and results

4.1. Dataset

In this study, the dataset involves ten years of Pattavia pineapple productivity (tons) for the Loei and Nong Khai provinces in Thailand, i.e. from 2013 to 2022. Box plot visualizations of the data are shown in the figures below. First, the time-series data of the relevant parameters were collected for forecasting Pattavia pineapple productivity in Loei province, Thailand:

  • n1: Pattavia pineapple cultivated area in Loei province (Rai)
  • n2: Pattavia pineapple harvested area in Loei province (Rai)
  • n3: Pattavia pineapple productivity in Loei province per Rai of cultivated area (kg)
  • n4: Pattavia pineapple productivity in Loei province per Rai of harvested area (kg)
  • n5: Pattavia pineapple cultivated area in the northeastern region (Rai)
  • n6: Pattavia pineapple harvested area in the northeastern region (Rai)
  • n7: Pattavia pineapple productivity in the northeastern region per Rai of cultivated area (kg)
  • n8: Pattavia pineapple productivity in the northeastern region per Rai of harvested area (kg)
  • n9: Pattavia pineapple productivity in Thailand per Rai of harvested area (kg)

Second, the corresponding time-series parameters were collected for forecasting Pattavia pineapple productivity in Nong Khai province, Thailand:

  • n1: Pattavia pineapple cultivated area in Nong Khai province (Rai)
  • n2: Pattavia pineapple harvested area in Nong Khai province (Rai)
  • n3: Pattavia pineapple productivity in Nong Khai province per Rai of cultivated area (kg)
  • n4: Pattavia pineapple productivity in Nong Khai province per Rai of harvested area (kg)
  • n5–n9: the same regional and national variables as for Loei province (northeastern cultivated and harvested areas, northeastern productivity per Rai of cultivated and harvested area, and Thailand productivity per Rai of harvested area)

The details of the dataset used are illustrated and summarized in Table 1. The dataset is available from the Office of Agricultural Economics in Thailand at https://www.oae.go.th. This research uses MATLAB R2022a on macOS Ventura (version 13.5) to test the forecasting of Pattavia pineapple productivity.

Figure 3. The nine inputs and one output for the forecast models, (a1) n1, (b1) n2, (c1) n3, (d1) n4 of Loei and (a2) n1, (b2) n2, (c2) n3, (d2) n4 of Nong Khai.


Figure 4. The nine inputs and one output for the forecast models, (e1) n5, (f1) n6, (g1) n7, (h1) n8 of Loei and (e2) n5, (f2) n6, (g2) n7, (h2) n8 of Nong Khai.


Figure 5. The nine inputs and one output for the forecast models, (i1) n9, (j1) Pattavia pineapple productivity of Loei and (i2) n9, (j2) Pattavia pineapple productivity of Nong Khai.


Table 1. Descriptive summary of Pattavia pineapple productivity of Loei and Nong Khai provinces in Thailand.

In order to construct the forecast models for this research, the time-series dataset, consisting of nine inputs and one output, was partitioned into two segments: 70% for in-sample (training) and 30% for out-of-sample (testing). The MinMax scaling technique was utilized to mitigate overfitting by transforming the scale intervals to the range [0,1], as shown in Figure 6. As previously stated, the architecture of the ELM model is characterized by a single hidden layer, so selecting the number of neurons inside this layer is the primary consideration in designing the ELM structure. In summary, the ELM model with 27 hidden neurons had the lowest MSE. Hence, the ELM 9-27-1 model is recommended as the most suitable framework for predicting the productivity of Pattavia pineapples. For comparison, an ANN model with one hidden layer was also developed to perform the same forecast. The primary motivation for choosing this ANN model is to conduct a comparative analysis with the ELM model in the context of prediction. Consequently, an ANN model with an architecture identical to the ELM model, specifically a 9-27-1 structure (9 input nodes, 27 hidden nodes and 1 output node), was created. The BP algorithm was employed to train this ANN model. In order to prevent overfitting of the ANN model, the MinMax scaling approach was also utilized with a range of 0–1, as suggested by Zhang et al. (Citation2021).
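The split and scaling steps just described can be sketched as follows. This is an illustrative fragment: the scaling here is fit on the training split only, one common convention, and the paper does not state which convention was used.

```python
import numpy as np

def split_series(X, y, train_frac=0.70):
    """Chronological split into 70% in-sample (training) and 30% out-of-sample (testing)."""
    cut = int(len(X) * train_frac)
    return X[:cut], X[cut:], y[:cut], y[cut:]

def minmax_scale(train, test):
    """Rescale every column to [0, 1] using the training-set minima and maxima."""
    lo, hi = train.min(axis=0), train.max(axis=0)
    span = np.where(hi > lo, hi - lo, 1.0)   # guard against constant columns
    return (train - lo) / span, (test - lo) / span
```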

Figure 6. The normalization in [0,1] of nine inputs for the forecast models, (a) Loei, and (b) Nong Khai.


After establishing the ideal configuration of the ELM, the PSO and NIPSO algorithms were incorporated to optimize the weights of the ELM neural network. The specification of the parameters for the PSO and NIPSO algorithms is a prerequisite for the optimization of the ELM network. The parameters of the PSO algorithm were: Vmax (maximum particle velocity) = 2; ci (individual cognitive coefficient) = 1.2; cg (group cognitive coefficient) = 1.2; w (inertia weight) = 0.85. Next, the parameters of the NIPSO algorithm were: Vmax (maximum particle velocity) = 2; ci (individual cognitive coefficient) = 1.2; cg (group cognitive coefficient) = 1.2; w (inertia weight) = 0.50; cm (cycle of the mutation breeding operation) = 10; pm (mutation probability) = 0.01. In order to examine the efficacy of the PSO-ELM and NIPSO-ELM models across varying population sizes (psize), the psize values were designated as 100, 150, 200, 250, 300, 350, 400, 450 and 500. In order to satisfy the termination criterion, precisely the lowest MSE, a total of 1000 iterations were configured for both the PSO-ELM and NIPSO-ELM models. This study examines the training performance of the PSO-ELM and NIPSO-ELM models. Ultimately, the optimal parameters for the PSO-ELM algorithm were: Vmax = 2; ci = 1.2; cg = 1.2; w = 0.85; psize = 350; iteration = 917. The optimal parameters for the NIPSO-ELM algorithm were: Vmax = 2; ci = 1.2; cg = 1.2; w = 0.50; cm = 10; pm = 0.01; psize = 450; iteration = 427.
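Collected for convenience, the tuned hyperparameters reported above can be expressed as configuration objects; the field names below are illustrative and not part of the paper.

```python
# Illustrative configuration objects summarizing the tuned hyperparameters reported above.
PSO_ELM_PARAMS = {
    "v_max": 2, "c_individual": 1.2, "c_group": 1.2, "inertia_w": 0.85,
    "population_size": 350, "max_iterations": 1000, "best_iteration": 917,
}
NIPSO_ELM_PARAMS = {
    "v_max": 2, "c_individual": 1.2, "c_group": 1.2, "inertia_w": 0.50,
    "mutation_cycle_cm": 10, "mutation_prob_pm": 0.01,
    "population_size": 450, "max_iterations": 1000, "best_iteration": 427,
}
```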

After the ELM neural network was calibrated using the NIPSO, PSO and ACO algorithms, the NIPSO-ELM, PSO-ELM and ACO-ELM models were assessed for their performance using both in-sample and out-of-sample data. In order to elucidate the enhancements offered by the suggested ELM-based models, a comparison was conducted between the traditional ELM model (without optimization) and the NIPSO-ELM, PSO-ELM and ACO-ELM models. Table 2 shows the models' performance and results on the in-sample (training) and out-of-sample (testing) datasets. It is evident that all forecast models exhibited exceptional performance, as evidenced by the low errors, specifically the MAPE, MASE, MAE and RMSE. Integrating the PSO, ACO and NIPSO algorithms into the ELM model resulted in a noteworthy enhancement in accuracy over the standalone ELM and ANN models. Specifically, the NIPSO-ELM, PSO-ELM and ACO-ELM models exhibited superior accuracy compared to the conventional ELM and ANN models. The correlation of actual Pattavia pineapple productivity with the forecast models is shown in Figure 7; the NIPSO-ELM model has the best correlation with actual Pattavia pineapple productivity. The NIPSO-ELM model demonstrated superior accuracy in forecasting Pattavia pineapple productivity compared to the PSO-ELM model, as evidenced by statistical criteria, in both the in-sample and out-of-sample scenarios. The present work highlights the significant impact of optimization techniques, particularly when integrated with the ELM model. To clarify, the optimized ELM models, specifically the NIPSO-ELM, PSO-ELM and ACO-ELM, exhibit notably enhanced accuracy compared to the ELM and ANN models that lack optimization. Therefore, the NIPSO-ELM model has the highest accuracy and is the best model for forecasting Pattavia pineapple productivity of Loei and Nong Khai provinces in Thailand, as shown in Figure 8 and Table 2.

Figure 7. The correlation of actual Pattavia pineapple productivity with the forecast models, (a) Loei, and (b) Nong Khai.


Figure 8. Actual Pattavia pineapple productivity with the forecast models (a) Loei, and (b) Nong Khai.


Table 2. The accuracy of the ELM, PSO-ELM and NIPSO-ELM models for Pattavia pineapple productivity of Loei and Nong Khai provinces in Thailand.

5. Conclusion

The forecast of Pattavia pineapple productivity exhibited a good level of accuracy. The investigation findings indicate that the productivity of Pattavia pineapple has exhibited intricate oscillations between the years 2013 and 2022. It is noteworthy that accurately predicting these fluctuations in Pattavia pineapple productivity is a significant challenge. This research endeavor formulated and introduced an innovative ML model, namely NIPSO-ELM, to forecast Pattavia pineapple productivity with a notable degree of precision and dependability. In this work, the standard ANN and ELM models were built and assessed for their ability to forecast the productivity of Pattavia pineapples. The findings indicate that the ELM neural network is an innovative model characterized by its straightforward architecture and exceptional performance.

Moreover, the utilization of the PSO, ACO and NIPSO algorithms significantly enhanced ELM performance when forecasting the productivity of Pattavia pineapples. Among the several models considered, the NIPSO-ELM model emerged as the most optimal ML model for accurately, reliably and stably forecasting the productivity of Pattavia pineapples in practical scenarios. The most optimal NIPSO-ELM model for Loei province in Thailand exhibits the following performance metrics: RMSE = 304.36389, MAE = 243.29531, MAPE = 0.03753 and MASE = 0.93157. The most optimal NIPSO-ELM model for Nong Khai province in Thailand exhibits: RMSE = 304.57352, MAE = 244.67834, MAPE = 0.03756 and MASE = 0.93296, respectively. The utilization of this tool facilitates the assessment and contemplation of future Pattavia pineapple productivity, hence informing investment decisions in relevant fields.

As for future work, most previous studies, as well as this study, forecast Pattavia pineapple productivity based on multivariate models, and this study examined the ELM model with several metaheuristic algorithms. The main limitation of any metaheuristic algorithm is that there is always the possibility that new optimization approaches developed in the future may perform better in handling such optimization applications.


Acknowledgments

This work comes under the Fundamental Fund 2022, and the research project is the design and development of agricultural innovative application software for pineapple plantation management granted by the National Science, Research and Innovation Fund. The author thanks the Faculty of Interdisciplinary Studies, Khon Kaen University, Nong Khai Campus and the Khon Kaen University, Khon Kaen Campus in Thailand, for supporting this work.

Disclosure statement

No potential conflict of interest was reported by the author.

Additional information

Funding

The Research by Khon Kaen University, Faculty of Interdisciplinary Studies has received funding support from the National Science, Research and Innovation Fund.

References

  • Alameer, Z., Elaziz, M. A., Ewees, A. A., Ye, H., & Jianhua, Z. (2019). Forecasting copper prices using hybrid adaptive neuro-fuzzy inference system and genetic algorithms. Natural Resources Research, 28(4), 1385–1401. https://doi.org/10.1007/s11053-019-09473-w
  • Babri, H. A., Huang, G. B., & Chen, Y. Q. (2000). Classification ability of single hidden layer feedforward neural networks. IEEE Transactions on Neural Networks, 11(3), 799–801. https://doi.org/10.1109/72.846750
  • Bradshaw, J. E. (2016). Mutation breeding. Plant breeding: Past, present and future. Springer.
  • Bui, X. N., Nguyen, H., Tran, Q. H., Nguyen, D. A., & Bui, H. B. (2021). Predicting ground vibrations due to mine blasting using a novel artificial neural network-based cuckoo search optimization. Natural Resources Research, 30(3), 2663–2685. https://doi.org/10.1007/s11053-021-09823-7
  • Chu, Z., Ma, Y., & Cui, J. (2018). Adaptive reactionless control strategy via the PSO-ELM algorithm for free-floating space robots during manipulation of unknown objects. Nonlinear Dynamics, 91(2), 1321–1335. https://doi.org/10.1007/s11071-017-3947-6
  • Cordon, O., Herrera, F., & Stützle, T. (2002). A review on the ant colony optimization metaheuristic: Basis, models and new trends. Mathware and Soft Computing, 9(2-3), 141–175. https://upcommons.upc.edu/bitstream/handle/2099/3624/1-cordon-herrera-stuetzle.pdf
  • Dorigo, M., & Di Caro, G. (1999). Ant colony optimization: A new meta-heuristic [Paper presentation]. Proceedings of the Congress on Evolutionary Computation CEC 1999 (Cat. No. 99TH8406), Washington, DC, USA, pp. 1470–1477. Vol. 2. https://doi.org/10.1109/CEC.1999.782657
  • Dorigo, M., & Birattari, M. (2010). Ant colony optimization. In Encyclopedia of machine learning (pp. 36–39). Springer.
  • Eabsirimatee, P., Suthikarnnarunai, N., & Janpong, S. (2016). Thailand’s canned pineapple forecasting using multiple regression model and artificial neural network. Journal of Nakhonratchasima College, 10(2), 9–21. http://journal.nmc.ac.th/th/admin/Journal/2559Vol10No2_2.pdf
  • Ertugrul, O. F. (2016). Forecasting electricity load by a novel recurrent extreme learning machines approach. International Journal of Electrical Power & Energy Systems. 78, 429–435. https://doi.org/10.1016/j.ijepes.2015.12.006
  • Figueiredo, E. M., & Ludermir, T. B. (2014). Investigating the use of alternative topologies on performance of the PSO-ELM. Neurocomputing, 127, 4–12. https://doi.org/10.1016/j.neucom.2013.05.047
  • Gandomi, A. H., Yang, X. S., & Alavi, A. H. (2013). Cuckoo search algorithm: A metaheuristic approach to solve structural optimization problems. Engineering with Computers, 29(1), 17–35. https://doi.org/10.1007/s00366-011-0241-y
  • Han, F., & Liu, Q. (2014). A diversity-guided hybrid particle swarm optimization based on gradient search. Neurocomputing, 137, 234–240. https://doi.org/10.1016/j.neucom.2013.03.074
  • Han, F., & Liu, Q. (2015). An improved hybrid PSO based on ARPSO and the quasi-newton method. Advances in swarm and computational intelligence (pp. 460–467). Springer International Publishing.
  • Han, F., & Zhu, J. (2011). An improved ARPSO for feedforward neural networks. Proceedings of the Seventh International Conference Natural Computation (Vol. 2, pp. 1146–1150). Shanghai, China. https://doi.org/10.1109/ICNC.2011.6022153
  • Huang, G. B., Wang, D. H., & Lan, Y. (2011). Extreme learning machines: A survey. International Journal of Machine Learning and Cybernetics, 2(2), 107–122. https://doi.org/10.1007/s13042-011-0019-y
  • Huang, G. B., Zhou, H., Ding, X., & Zhang, R. (2012). Extreme learning machine for regression and multiclass classification. In IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics) (Vol. 42, pp. 513–529). https://doi.org/10.1109/TSMCB.2011.2168604
  • Huang, G. B., Zhu, Q. Y., & Siew, C. K. (2004). Extreme learning machine: A new learning scheme of feedforward neural networks. 2004 IEEE International Joint Conference on Neural Networks (IEEE Cat. No.04CH37541) (Vol. 2, pp. 985–990). Budapest, Hungary, 2004, https://doi.org/10.1109/IJCNN.2004.1380068
  • Huang, G. B., Zhu, Q. Y., & Siew, C. K. (2006). Extreme learning machine: Theory and applications. Neurocomputing, 70(1–3), 489–501. https://doi.org/10.1016/j.neucom.2005.12.126
  • Karaboga, D., & Basturk, B. (2008). On the performance of artificial bee colony (ABC) algorithm. Applied Soft Computing. 8(1), 687–697. https://doi.org/10.1016/j.asoc.2007.05.007
  • Ke, B., Nguyen, H., Bui, X. N., Bui, H. B., Choi, Y., Zhou, J., Moayedi, H., Costache, R., & Nguyen-Trang, T. (2021). Predicting the sorption efficiency of heavy metal based on the biochar characteristics, metal sources, and environmental conditions using various novel hybrid machine learning models. Chemosphere, 276, 130204. https://doi.org/10.1016/j.chemosphere.2021.130204
  • Kennedy, J., & Eberhart, R. (1995). Particle swarm optimization. Proceedings of the ICNN’95–International Conference on Neural Networks (Vol. 4, pp. 1942–1948). 27 November–1 December. Perth, WA, Australia. https://doi.org/10.1109/ICNN.1995.488968
  • Kochenderfer, M. J., & Wheeler, T. A. (2019). Algorithms for optimization. Mit Press.
  • Kumar, S., Solanki, V. K., Choudhary, S. K., Selamat, A., & Crespo, R. G. (2020). Comparative study on ant colony optimization (ACO) and K-means clustering approaches for jobs scheduling and energy optimization model in internet of things (IoT). International Journal of Interactive Multimedia and Artificial Intelligence, 6(1), 107. https://doi.org/10.9781/ijimai.2020.01.003
  • Lalwani, S., Sharma, H., Satapathy, S. C., Deep, K., & Bansal, J. C. (2019). A survey on parallel particle swarm optimization algorithms. Arabian Journal for Science and Engineering, 44(4), 2899–2923. https://doi.org/10.1007/s13369-018-03713-6
  • Li, L., Wang, W., & Xu, X. (2017). Multi-objective particle swarm optimization based on global margin ranking. Information Sciences, 375, 30–47. https://doi.org/10.1016/j.ins.2016.08.043
  • Li, Y., Zhan, Z.-H., Lin, S., Zhang, J., & Luo, X. (2015). Competitive and cooperative particle swarm optimization with information sharing mechanism for global optimization problems. Information Sciences, 293, 370–382. https://doi.org/10.1016/j.ins.2014.09.030
  • Livingstone, D. J. (2008). Artificial neural networks: Methods and applications. Springer.
  • Luan, J., Yao, Z., Zhao, F., & Song, X. (2019). A novel method to solve supplier selection problem: Hybrid algorithm of genetic algorithm and ant colony optimization. Mathematics and Computers in Simulation, 156, 294–309. https://doi.org/10.1016/j.matcom.2018.08.011
  • Lv, Z., Wang, L., Han, Z., Zhao, J., & Wang, W. (2019). Surrogate-assisted particle swarm optimization algorithm with pareto active learning for expensive multi-objective optimization. IEEE/CAA Journal of Automatica Sinica, 6(3), 838–849. https://doi.org/10.1109/JAS.2019.1911450
  • Mayuree, S., & Wullapa, W. (2014). Prediction model for crude oil price using artificial neural networks. Applied Mathematical Sciences, 8(80), 3953–3965. http://dx.doi.org/10.12988/ams.2014.43193
  • Mullen, R. J., Monekosso, D., Barman, S., & Remagnino, P. (2009). A review of ant algorithms. Expert Systems with Applications. 36(6), 9608–9617. https://doi.org/10.1016/j.eswa.2009.01.020
  • Nguyen, H., Bui, X. N., Bui, H. B., & Mai, N. L. (2018). A comparative study of artificial neural networks in predicting blast-induced air-blast overpressure at Deo Nai open-pit coal mine Vietnam. Neural Computing and Applications, 32(8), 3939–3955. https://doi.org/10.1007/s00521-018-3717-5
  • Nguyen, H., Drebenstedt, C., Bui, X.-N., & Bui, D. T. (2019). Prediction of blast-induced ground vibration in an open-pit mine by a novel hybrid model based on clustering and artificial neural network. Natural Resources Research, 29(2), 691–709. https://doi.org/10.1007/s11053-019-09470-z
  • Pahuja, G., & Nagabhushan, T. (2016). A novel GA-ELM approach for parkinson’s disease detection using brain structural T1-weighted MRI data [Paper presentation]. 2016 Second International Conference on Cognitive Computing and Information Processing (CCIP) (pp. 1–6). Mysuru, India. https://doi.org/10.1109/CCIP.2016.7802848
  • Paniri, M., Dowlatshahi, M. B., & Nezamabadi-Pour, H. (2020). MLACO: A multi-label feature selection algorithm based on ant colony optimization. Knowledge-Based Systems, 192, 105285. https://doi.org/10.1016/j.knosys.2019.105285
  • Pare, S., Kumar, A., Bajaj, V., & Singh, G. K. (2019). A context sensitive multilevel thresholding using swarm based algorithms. IEEE/CAA Journal of Automatica Sinica, 6, 1471–1486. https://doi.org/10.1109/JAS.2017.7510697
  • Raslan, A. F., Ali, A. F., & Darwish, A. (2020). Swarm intelligence algorithms and their applications in internet of things. Swarm intelligence for resource management in internet of things; intelligent data-centric systems (pp. 1–19). Academic Press
  • Sattar, A. M., Ertuğrul, Ö. F., Gharabaghi, B., McBean, E. A., & Cao, J. (2019). Extreme learning machine model for water network management. Neural Computing and Applications, 31(1), 157–169. https://doi.org/10.1007/s00521-017-2987-7
  • Singh, G., Kumar, N., & Kumar Verma, A. (2012). Ant colony algorithms in MANETs: A review. Journal of Network and Computer Applications, 35(6), 1964–1972. https://doi.org/10.1016/j.jnca.2012.07.018
  • Sweetlin, J. D., Nehemiah, H. K., & Kannan, A. (2017). Feature selection using ant colony optimization with tandem-run recruitment to diagnose bronchitis from CT scan images. Computer Methods and Programs in Biomedicine, 145, 115–125. https://doi.org/10.1016/j.cmpb.2017.04.009
  • Tran, B., Xue, B., Zhang, M., & Nguyen, S. (2016). Investigation on particle swarm optimisation for feature selection on high-dimensional data: Local search and selection bias. Connection Science, 28(3), 270–294. https://doi.org/10.1080/09540091.2016.1185392
  • Van Heeswijk, M., Miche, Y., Lindh-Knuutila, T., Hilbers, P. A., Honkela, T., Oja, E., & Lendasse, A. (2009). Adaptive ensemble models of extreme learning machines for time series prediction. International Conference on Artificial Neural Networks (pp. 305–314). Springer.
  • Wang, D., Luo, H., Grunder, O., Lin, Y., & Guo, H. (2017). Multi-step ahead electricity price forecasting using a hybrid model based on two-layer decomposition technique and BP neural network optimized by firefly algorithm. Applied Energy, 190, 390–407. https://doi.org/10.1016/j.apenergy.2016.12.134
  • Wang, F., Zhang, H., Li, K., Lin, Z., Yang, J., & Shen, X. L. (2018). A hybrid particle swarm optimization algorithm using adaptive learning strategy. Information Sciences, 436–437, 162–177. https://doi.org/10.1016/j.ins.2018.01.027
  • Wang, X., Wang, C., & Li, Q. (2017). Short-term wind power prediction using GA-ELM. The Open Electrical & Electronic Engineering Journal, 11, 48–56.
  • Wullapa, W., & Suntaree, B. (2020). Criminal cases forecasting model using a new intelligent hybrid artificial neural network with cuckoo search algorithm. IAENG International Journal of Computer Science, 47(3).
  • Yan, X., Liu, Y., Xu, Y., & Jia, M. (2020). Multistep forecasting for diurnal wind speed based on hybrid deep learning model with improved singular spectrum decomposition. Energy Conversion and Management, 225, 113456. https://doi.org/10.1016/j.enconman.2020.113456
  • Yang, X. S. (2009). Firefly algorithms for multimodal optimization. In O. Watanabe, & T. Zeugmann (Eds.), Stochastic algorithms: Foundations and applications (pp. 169–178). Springer.
  • Yaseen, Z. M., Sulaiman, S. O., Deo, R. C., & Chau, K. W. (2019). An enhanced extreme learning machine model for river flow forecasting: State-of-the-art, practical applications in water resource engineering area and future research direction. Journal of Hydrology, 569, 387–408. https://doi.org/10.1016/j.jhydrol.2018.11.069
  • Ye, Y., & Qin, Y. (2015). QR factorization based incremental extreme learning machine with growth of hidden nodes. Pattern Recognition Letters, 65, 177–183. https://doi.org/10.1016/j.patrec.2015.07.031
  • Yu, L., Liang, S., Chen, R., & Lai, K. K. (2022). Predicting monthly biofuel production using a hybrid ensemble forecasting methodology. International Journal of Forecasting, 38(1), 3–20. https://doi.org/10.1016/j.ijforecast.2019.08.014
  • Yu, Z., Shi, X., Miao, X., Zhou, J., Khandelwal, M., Chen, X., & Qiu, Y. (2021). Intelligent modeling of blast-induced rock movement prediction using dimensional analysis and optimized artificial neural network technique. International Journal of Rock Mechanics and Mining Sciences. 143, 104794. https://doi.org/10.1016/j.ijrmms.2021.104794
  • Zhang, H., Nguyen, H., Bui, X. N., Nguyen-Thoi, T., Bui, T. T., Nguyen, N., Vu, D. A., Mahesh, V., & Moayedi, H. (2020). Developing a novel artificial intelligence model to estimate the capital cost of mining projects using deep neural network-based ant colony optimization algorithm. Resources Policy, 66, 101604. https://doi.org/10.1016/j.resourpol.2020.101604
  • Zhang, J., Meng, Y., Wei, J., Chen, J., & Qin, J. (2021). A novel hybrid deep learning model for sugar price forecasting based on time series decomposition. Mathematical Problems in Engineering, 2021, 1–9. https://doi.org/10.1155/2021/6507688
  • Zhang, X., Nguyen, H., Bui, X. N., Le, H. A., Nguyen-Thoi, T., Moayedi, H., & Mahesh, V. (2020). Evaluating and predicting the stability of roadways in tunnelling and underground space using artificial neural network-based particle swarm optimization. Tunnelling and Underground Space Technology, 103, 103517. https://doi.org/10.1016/j.tust.2020.103517
  • Zhang, X., Nguyen, H., Bui, X. N., Tran, Q. H., Nguyen, D. A., Bui, D. T., & Moayedi, H. (2019). Novel soft computing model for predicting blast-induced ground vibration in open-pit mines based on particle swarm optimization and XGBoost. Natural Resources Research, 29(2), 711–721. https://doi.org/10.1007/s11053-019-09492-7