Research Article

Control Charts for Exponentially Distributed Characteristics: SD, ED, ESD with Taguchi’s Loss Function

Article: 2322362 | Received 18 Sep 2023, Accepted 18 Feb 2024, Published online: 06 Mar 2024

ABSTRACT

This paper addresses the challenge of quality characteristics that follow an exponential distribution, which can significantly impact decision-making in various fields. Existing approaches rely on approximations to convert exponential distributions to normal distributions, upon which control charts are constructed. However, such conversions introduce errors that can lead to incorrect outcomes, particularly for highly sensitive characteristics. To address this limitation, we propose the development of control charts specifically designed for exponential characteristics, without relying on approximations. Our objective is to introduce four different schemes for constructing these control charts: a statistical scheme, an economic scheme, an economic-statistical scheme combined with Taguchi’s loss function, and an economic-statistical scheme without the application of a loss function. To determine optimal design parameter values for each scheme, we employ the artificial bee colony algorithm. Additionally, we conduct a sensitivity analysis to investigate the impact of design parameters on each proposed design. To illustrate the practical implementation of these control charts, we provide a numerical example that demonstrates their effectiveness. By addressing the limitations of existing approaches and offering novel control chart designs, this paper contributes to enhancing decision-making accuracy and reliability in scenarios involving exponentially distributed quality characteristics.

Introduction

As markets compete for customer satisfaction, higher sales, and greater profits, quality has become one of the most important concerns in industry. Among the various control tools, statistical process control (SPC) is of great importance to the improvement of quality, and among the diverse statistical methods, control charts are important tools that monitor and control processes. Today, these tools are widely used in industry. For this reason, researchers are constantly looking for optimal (accurate, low-cost, and fast) control charts.

In the construction of control charts, the distribution of the quality characteristic is usually assumed to be normal. However, in many cases this assumption does not hold; it fails for quality characteristics whose distributions are highly skewed, such as characteristics of a lifetime nature.

Among the well-known statistical distributions, the exponential distribution fits many kinds of skewed data well (Santiago and Smith Citation2013), such as:

  1. Waiting times or inter-arrival times: When studying the time between consecutive events (e.g., customer arrivals, phone calls, or requests), the exponential distribution can be appropriate if the underlying process exhibits randomness and no memory, commonly known as the memoryless property.

  2. Survival analysis: In certain cases, the exponential distribution can be employed to model the survival times of a population when the hazard rate (probability of an event occurring at a given time) remains constant over time. This assumption is often referred to as an exponential survival function.

  3. Reliability analysis: When analyzing the lifetime or failure rate of systems or components, the exponential distribution can be utilized if the failure rate remains constant or exponentially decreasing over time. It assumes that the failure events are independent and do not depend on prior events.

  4. Queueing systems: In some queueing configurations, where the service times follow an exponential distribution and the arrival process satisfies certain assumptions (e.g., Poisson arrivals), the exponential distribution can be used to model the waiting times in the system accurately.

One reason to model a time characteristic with the exponential distribution arises when the defect rate in a process is very low. In that case, instead of using C-charts and U-charts for the number of defects, one monitors the times between occurrences of two consecutive defects and plots a control chart for these times (Kumar Citation2022). If the defect rate is low, for example less than 1000 defects per million, the time between defective products will be long. Under such conditions, most samples will be defect-free, and a control chart that constantly plots a statistic at zero provides little useful information. Therefore, when defect rates are expressed in parts per million (PPM), traditional C- and U-charts are not effective. One way to address this issue is to monitor a new variable such as the time between consecutive defect observations (Kumar, Chakraborti, and Castagliola Citation2022). The control chart for the time between observations is one of the most successful methods for controlling processes with very low defect rates.

Suppose defects or observations of interest follow a Poisson distribution. The Poisson distribution is commonly chosen when rare events are expected to occur independently over a specific time or space interval. In our study, we assume a Poisson distribution for defects or observations of interest based on the nature of the phenomena under investigation and previous research in the field; this assumption simplifies the analysis while remaining consistent with existing knowledge. Under this assumption, the time between observations follows an exponential distribution, so designing a control chart for the time between observations involves an exponentially distributed variable. However, the exponential distribution is highly skewed, resulting in a highly asymmetric control chart (Montgomery Citation2020).

The exponential distribution is used in many fields, including reliability engineering (time-to-failure of components), queueing theory (inter-arrival times of customers in a queue), telecommunications (time between successive arrivals of messages or packets), physics (decay of radioactive materials), finance (time to default of bonds and financial instruments), biology (waiting time between cell divisions, mutations, and evolutionary events), and epidemiology (time between infections during an epidemic).

Many researchers have focused on control charts for skewed quality characteristics. For example, Bai and Choi (Citation1995) studied Xˉ and R control charts for skewed qualities, while Choobineh and Ballard (Citation1987) developed a control chart for skewed data using weighted variance. Morales and Arturo Panza (Citation2022) and Figueiredo and Ivette Gomes (Citation2013) presented control charts for skewed data following a skew-normal distribution, while Al-Nuaami, Akbar Heydari, and Jabbari Khamnei (Citation2023) proposed control charts specifically for count data with overdispersion.

Santiago and Smith (Citation2013) investigated the t-chart assuming exponentially distributed quality characteristics and the time until the first deviation. They approximated the distribution of the quality characteristic by a normal distribution using Nelson’s approximation (Nelson Citation1994) and constructed the corresponding control chart based on conventional 6σ (Shewhart) methods. Aslam et al. (Citation2015) used a variable sampling model and the same approximation to normalize the distribution of the characteristic. Tavakoli and Akbar Heydari (Citation2021) built an exponential control chart using Nelson’s approximation and integrated an economic-statistical design with a loss function.

However, such approximations can introduce errors in the obtained results. Our research addresses this issue by directly constructing a control chart for exponentially distributed quality characteristics, thereby circumventing the need for such approximations. By eliminating the intermediate step of approximating to a normal distribution, we expect the proposed method to provide more accurate results.

Control charts may serve different purposes depending on the designer’s strategy and the type of design, focusing on statistical aspects, economic aspects, or both. Accordingly, control charts can be designed in three ways: statistical design (SD), economic design (ED), and economic-statistical design (ESD). Statistical designs yield charts with desirable statistical properties, such as high power and a low probability of type I error, but ignore the cost of production, which can be high in some situations. Economic designs, on the other hand, focus only on cost and disregard statistical features: the costs of all factors of production and control are incorporated into a cost function, and the design parameters are then obtained so that this function is minimized.

Attending to either of these two schemes without considering the other keeps the controller from an optimal model and a correct decision. To solve this problem, economic-statistical design (ESD) was proposed. In ESD, statistical restrictions are imposed on the economic model, and the average cost per unit of time is minimized under these conditions. The restrictions depend on the judgment and needs of the process designer. By combining the economic optimality of control charts with their statistical advantages, the economic-statistical design saves costs while meeting the statistical requirements needed to maintain product quality.

Duncan (Citation1956) conducted the first study on the economic design of Xˉ control charts. Many researchers, such as Duncan (Citation1971), Lorenzen and Vance (Citation1986), and Banerjee and Rahim (Citation1988), followed this approach. However, neither the purely statistical nor the purely economic designs were optimal: the former had a high cost, and the latter lacked desirable statistical features. To solve this problem, Saniga (Citation1989) presented the economic-statistical design (ESD) for Xˉ and R charts. Later, many researchers, such as Faraz, Kazemzadeh, and Saniga (Citation2010), Prabhu, Montgomery, and Runger (Citation1997), Yang and Rahim (Citation2005), Zhang and Berardi (Citation1997), and Tavakoli, Pourtaheri, and Moghadam (Citation2017), applied this design to control charts. Recently, Ghanaatiyan, Amiri, and Sogandi (Citation2017) proposed a modified multivariate weighted moving average control chart based on ESD. Khadem and Bameni Moghadam (Citation2019) studied the economic-statistical design of the Xˉ control chart based on a modification of Banerjee and Rahim’s cost model. For the latest articles on the economic-statistical design of control charts, see Taji, Farughi, and Rasay (Citation2022), Wan (Citation2020), and Heydari, Tavakoli, and Rahim (Citation2023).

In conventional economic models, only the fixed values of costs of the process are usually considered and monitored. However, the higher the deviation of the quality characteristic from the target value, the higher the cost of rework or defective product. Combining a loss function with the economic model leads to the conclusion that the greater the deviation from the ideal value, the higher the cost (loss), and the better the decision to continue or stop the process.

Safaei, Baradaran Kazemzadeh, and Niaki (Citation2012) used Taguchi’s loss function in their economic models and showed that no-loss models yield unrealistic results for the control chart under consideration. In addition, Pasha et al. (Citation2017) examined the effect of the distribution of quality characteristics on the economic model of Banerjee and Rahim combined with Taguchi’s loss function, and Pasha et al. (Citation2018) examined the same question under the economic model of Lorenzen and Vance. For the Xˉ control chart, Celano, Faraz, and Saniga (Citation2014) proposed an on-line scheme to monitor process losses under Taguchi’s approach (Taguchi Citation1979, Citation1986; Taguchi, Elsayed, and Hsiang Citation1988).

Although control charts for exponential characteristics that do not rely on approximations have been studied by Chakraborti et al. (Citation2014), Xie, Ngee Goh, and Ranjan (Citation2002), Zhang, Megahed, and Woodall (Citation2014), and others, the economic and economic-statistical designs of an exponential-based control chart have not yet been introduced. Therefore, in this paper, we aim to construct these types of control charts. This construction is carried out using four different schemes: statistical, economic, economic-statistical combined with Taguchi’s loss function, and economic-statistical without the application of a loss function. We anticipate that our new methodology, which constructs control charts directly from the exponential distribution, will outperform traditional approximation-based methodologies for quality characteristics that follow the exponential distribution.

This paper introduces a novel approach to address the challenge of quality characteristics that follow an exponential distribution, which has significant implications for decision-making in various fields. Unlike existing approaches that rely on approximations, this manuscript proposes the development of control charts specifically designed for exponential characteristics without the need for conversions. This departure from traditional methods is a key innovative insight of this research.

The manuscript presents four different schemes for constructing these control charts: a statistical scheme, an economic scheme, an economic-statistical scheme combined with Taguchi’s loss function, and an economic-statistical scheme without the application of a loss function. The utilization of the artificial bee colony algorithm to determine optimal design parameter values for each scheme further contributes to the novelty of this work.

Additionally, a sensitivity analysis is conducted to investigate the impact of design parameters on each proposed control chart design. The practical implementation of the control charts is demonstrated through a numerical example, showcasing their effectiveness in enhancing decision-making accuracy and reliability in scenarios involving exponentially distributed quality characteristics.

By addressing the limitations of existing approaches and offering new control chart designs tailored to exponential characteristics, this manuscript provides valuable insights and contributes to advancing the field of decision-making in quality control.

Since in each approach we aimed to find the optimal values of the design parameters to meet the desired conditions, the optimization problems were coded in MATLAB by using the artificial bee colony (ABC) algorithm.

In the next section, we obtain control limits for exponentially distributed quality characteristics. In Section 3, we introduce and present the economic design as well as Taguchi’s loss function integrated with the economic model. In the fourth section, we determine the optimal design parameters based on the four considered schemes.

Statistical, Economic, and Economic-Statistical Designs of Control Charts for Exponentially Distributed, Individual Quality Characteristics

Most studies on the development of economic models for control charts have assumed normally distributed quality characteristics. However, many quality characteristics, such as lifetimes and certain chemical properties, are exponentially distributed, and characteristics of this kind are often important in practice.

The symbols and abbreviations used in this paper are summarized in Table 1.

Table 1. Symbols and abbreviations used in the content.

The region between the upper control limit (UCL), denoted k2, and the lower control limit (LCL), denoted k1, is known as the control region; the area outside it is referred to as the out-of-control (action) region. A sample of size 1 is taken from the production process every h units of time and compared with the control limits. If the value of the quality characteristic X (the sampled observation) lies within the control region, the process is considered in control. Otherwise, the process is considered out of control, and the search for the cause of the deviation begins. These control limits are obtained by considering the exponential distribution for the quality characteristic X.

Thus, it is assumed that X has an exponential distribution with the probability density function (pdf) given by

f(x) = θe^(−θx); θ > 0, x > 0,
which can be represented as X ~ E(θ). In the in-control state, X ~ E(θ0), and in the out-of-control state, X ~ E(θ1), where θ1 = δθ0. The parameters θ0 and δ > 0 are assumed to be known, while the control limits and sampling intervals are obtained from the design under consideration. In addition, it is assumed that the assignable cause occurs according to a Poisson process with an average of λ occurrences per hour. In other words, assuming the process starts in the in-control state, the length of time it remains in this state is an exponential random variable with mean 1/λ hours.

Statistical Design Based on the Average Run Length (ARL) Approach

In order to statistically compare different control charts, it is common to use the average run length (ARL) criterion. The ARL is the average number of samples required before an alarm signals that the process is out of control. Clearly, when the process is in control we want a large ARL, while when the process is out of control a smaller ARL indicates a more powerful chart. Accordingly, one statistical approach to process control is to control the ARL. In this method, the in-control ARL (ARL0) is usually fixed (for example, at 370), and the controller seeks design parameters that minimize the out-of-control ARL (ARL1). By definition, when the process is in control, the ARL can be obtained from the following relation:

ARL0 = 1/α,

where α is the type I error. For the exponential control chart with an individual sample, this equals

α = P([X < LCL] ∪ [X > UCL] | θ = θ0) = 1 − e^(−θ0·LCL) + e^(−θ0·UCL).

When the process is in the out-of-control state, ARL is equal to

ARL1 = 1/(1 − β). (1)

Here, β is the type II error, which can be obtained as follows:

β = P(LCL < X < UCL | θ = θ1) = e^(−θ1·LCL) − e^(−θ1·UCL).

Based on this, the level of the type I error is usually set to 0.0027, which yields ARL0 = 370. Also, in similar problems, we seek to keep the type II error at or below 0.75, and as a result, ARL1 ≤ 4. Therefore, the statistical design of the control chart fixes ARL0 = 370 and finds the design parameters (n, h, k1, and k2) so that ARL1 is minimized. The optimization problem can be written as follows.

Min ARL1
s.t.
ARL0 = 370
β ≤ 0.75. (2)
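For illustration, the quantities above can be computed directly. The following Python sketch (the paper’s own computations were done in MATLAB) uses assumed values of θ0, δ, and the control limits, not values from the paper:

```python
import math

def type1_error(lcl, ucl, theta0):
    # alpha = P(X < LCL or X > UCL | theta = theta0)
    return 1 - math.exp(-theta0 * lcl) + math.exp(-theta0 * ucl)

def type2_error(lcl, ucl, theta1):
    # beta = P(LCL < X < UCL | theta = theta1)
    return math.exp(-theta1 * lcl) - math.exp(-theta1 * ucl)

# Assumed illustrative values (not from the paper)
theta0, delta = 1.0, 0.25      # shift to theta1 = 0.25 increases the mean time
theta1 = delta * theta0
lcl, ucl = 0.00135, 6.0        # assumed control limits

alpha = type1_error(lcl, ucl, theta0)
beta = type2_error(lcl, ucl, theta1)
ARL0 = 1 / alpha               # in-control average run length
ARL1 = 1 / (1 - beta)          # out-of-control average run length
```

A design scheme would search over the control limits (and h) until ARL0 and ARL1 meet the constraints in (2).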

This optimization problem was coded in MATLAB and the results were obtained for different values of the input parameters and process shift using the ABC algorithm. The results are presented in .

To allow comparison of the statistical design with the other design methods (economic and economic-statistical, both with and without the Taguchi loss function), all numerical results are provided in Section 4.

A Modified Version of Duncan’s Economic Model

In this section, we discuss one of the most widely studied economic models. In 1956, Duncan published a paper titled “The Economic Design of Xˉ Charts used to Maintain Current Control of a Process.” This article was the main motivator for most of the subsequent research conducted in this field. Duncan found that the optimal choice of parameters such as sample size, control limits, and sampling interval varies with the costs involved.

If we consider the control chart based on Duncan’s model, then we need to assume that the process starts in the in-control state with parameter θ0, and that an assignable cause, occurring randomly, shifts the parameter of the quality characteristic X from θ0 to θ1 = δθ0.

In this model, the cost of eliminating an assignable cause and repairing the process is not deducted from the net income, and the process is continuous; that is, production continues while the search for the assignable cause is in progress. Figure 1 shows the quality cycle in Duncan’s economic model.

Figure 1. Quality cycle in Duncan’s economic model.


This quality cycle consists of two periods: the period when the process is in control and the period when it is out of control. The process is assumed to start in the in-control state and remains there until an assignable cause occurs, after which it enters the out-of-control state. The out-of-control state in turn consists of three periods: the time from the occurrence of the assignable cause until the alarm, the time needed to sample and check the chart, and the time needed to identify the assignable cause and correct it. The last two times are shown together in .

The Expected Time of the Quality Cycle in Duncan’s Economic Model

The expected time of the quality cycle for this process is equal to the mathematical expectation of the sum of the four cycles expressed in the quality cycle. In this regard, the expected time for each period can be described as follows.

a- The expected time of the in-control period

Since the time of the in-control state follows the exponential distribution, the expected time of the in-control phase or the average duration of the process in the in-control state is equal to the average exponential distribution, which is equal to 1/λ.

b- The time expected to receive an alarm

Assume that the random variable T represents the time of occurrence of the assignable cause. If samples are taken from the process once every h hours and the assignable cause occurs between the ith and (i+1)th samples, the expected elapsed time from its occurrence to the next sample is

τ = E(T − ih | ih < T < (i+1)h)
= ∫[ih, (i+1)h] (t − ih) f(t | ih < T < (i+1)h) dt.

For more information, see . According to the definition of the conditional probability density function:

f(t | ih < T < (i+1)h) = f(t) / ∫[ih, (i+1)h] f(t) dt.

According to the hypotheses of Duncan’s model, T has an exponential distribution with mean 1/λ. Therefore, the denominator of the above fraction equals e^(−λih) − e^(−λ(i+1)h), which does not depend on t and can be factored out of the integral in the calculation of τ. Hence,

τ = (∫[ih, (i+1)h] λe^(−λt)(t − ih) dt) / (∫[ih, (i+1)h] λe^(−λt) dt) = (1 − (1 + λh)e^(−λh)) / (λ(1 − e^(−λh))) ≈ h/2 − λh²/12.
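Duncan’s closing approximation can be checked numerically. The following Python sketch compares the exact value of τ with h/2 − λh²/12 (λ and h are assumed values, not from the paper):

```python
import math

def tau_exact(lam, h):
    # tau = (1 - (1 + lam*h) * e^{-lam*h}) / (lam * (1 - e^{-lam*h}))
    return (1 - (1 + lam * h) * math.exp(-lam * h)) / (lam * (1 - math.exp(-lam * h)))

def tau_approx(lam, h):
    # Duncan's second-order approximation: h/2 - lam*h^2/12
    return h / 2 - lam * h ** 2 / 12

# Assumed: roughly one assignable cause per 20 hours, hourly sampling
lam, h = 0.05, 1.0
```

For small λh, the two expressions agree closely, which is why the approximation is standard in this literature.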

When an assignable cause occurs, the probability of finding it in the next sample is

p = P([X < LCL] ∪ [X > UCL] | θ = θ1).

If the parameter of the quality characteristic shifts from θ0 to θ1, then X ~ E(θ1). So, in this case,

p = P(X < LCL | θ1) + P(X > UCL | θ1)
= P(X < LCL | θ1) + 1 − P(X < UCL | θ1)
= (1 − e^(−θ1·LCL)) + 1 − (1 − e^(−θ1·UCL))
= 1 − e^(−θ1·LCL) + e^(−θ1·UCL).

Therefore, the number of samples expected to observe an assignable cause is a geometric random variable with an average of 1/p. So, the expected time to receive an alarm (the average time when the process is in the out-of-control state) is equal to

h/p − (h/2 − λh²/12) = h(1/p − 1/2 + λh/12).

c- The expected time for sample selection and interpretation of the results

The time required to review and interpret an individual sample (a single observation) is considered to be g. Therefore, for a process with a sample size of n, the expected time to select and interpret a sample is equal to gn.

d- The expected time for repair

The expected time to detect a deviation in the out-of-control state and correct the process is considered to be d.

Therefore, using the above discussion, the expected time of a cycle can be obtained as follows.

E(T) = 1/λ + h(1/p − 1/2 + λh/12) + gn + d. (3)
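Equation (3) can be evaluated directly. The following Python sketch assembles E(T) from the quantities above; all numeric values are assumed for illustration and are not taken from the paper:

```python
import math

def detection_prob(lcl, ucl, theta1):
    # p = 1 - e^{-theta1*LCL} + e^{-theta1*UCL}
    return 1 - math.exp(-theta1 * lcl) + math.exp(-theta1 * ucl)

def expected_cycle_time(lam, h, p, g, n, d):
    # Equation (3): E(T) = 1/lam + h*(1/p - 1/2 + lam*h/12) + g*n + d
    return 1 / lam + h * (1 / p - 0.5 + lam * h / 12) + g * n + d

# Assumed illustrative inputs
lam, h, g, n, d = 0.05, 1.0, 0.05, 1, 2.0
p = detection_prob(0.00135, 6.0, 0.25)
ET = expected_cycle_time(lam, h, p, g, n, d)
```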

The Expected Cost of the Quality Cycle in Duncan’s Economic Model

Duncan expressed a simple economic principle for his model: the average net income equals total income minus total cost. Total income is divided into two parts: income while the process is in control (V0) and income while the process is out of control (V1). Total cost is divided into three parts: the cost of sampling, the cost of false alarms, and the cost of process correction and repair. The sampling cost is determined by the type of product and the number of samples taken at each sampling interval. The cost of a false alarm is the cost of searching for a deviation when none exists. The cost of correction is the cost of finding the assignable cause when a deviation does exist, and the cost of repair is incurred in correcting the process and returning it to the in-control state after the cause has been found. Therefore, net income can be considered as the algebraic sum of income minus costs:

a- The expected income as long as the process is in the in-control state

Since the expected time that the process is under control is equal to 1/λ, the expected income as long as the process is in the in-control state is equal to V0/λ.

b- The expected income as long as the process is out of control

Since the expected time during the out-of-control state for each cycle is

h(1/p − 1/2 + λh/12) + gn + d,

the expected income in the out-of-control period is

V1[h(1/p − 1/2 + λh/12) + gn + d].

c- The expected cost of sampling

Since the expected number of samplings per cycle is E(T)/h, and the expected cost per sampling is b + cn, where b is the fixed sampling cost, c is the cost per sampled unit, and n is the number of samples taken each time, the expected cost of sampling per cycle is

(E(T)/h)(b + cn).

d- The expected cost of false alarms

This value is equal to the expected number of false alarms per period multiplied by the cost of the false alarm. The expected number of false alarms in each period is calculated by multiplying the probability of false alarms by the expected number of sampling times before the process gets out of control. The expected number of sampling times before the process gets out of control is

Σ[i=0 to ∞] i·P(ih < T < (i+1)h).

By considering the exponential distribution for the out-of-control times, this can be calculated as follows.

Σ[i=0 to ∞] i ∫[ih, (i+1)h] λe^(−λt) dt = Σ[i=0 to ∞] i(e^(−iλh) − e^(−(i+1)λh)) = (1 − e^(−λh)) Σ[i=0 to ∞] i·e^(−iλh) = e^(−λh)/(1 − e^(−λh)).

As shown by the numerical results of Duncan (Citation1956), if one ignores terms of order λ²h² or higher, the above expression is approximately equal to 1/(λh). The probability of a false alarm is the probability of the type I error (α), which can be calculated as follows:

α = P([X < LCL] ∪ [X > UCL] | θ = θ0) = 1 − e^(−θ0·LCL) + e^(−θ0·UCL).

Therefore, the expected number of false alarms in each period is equal to α/(λh). Since the cost of each false alarm is y, the expected cost of false alarms is equal to αy/(λh).
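The geometric-series result and Duncan’s 1/(λh) approximation can be checked numerically. In this illustrative Python sketch, λ and h are assumed values:

```python
import math

def expected_samples_exact(lam, h):
    # e^{-lam*h} / (1 - e^{-lam*h}): expected number of samples taken
    # while the process is still in control
    return math.exp(-lam * h) / (1 - math.exp(-lam * h))

def expected_samples_approx(lam, h):
    # Duncan's approximation 1/(lam*h), valid when lam*h is small
    return 1 / (lam * h)
```

For λh = 0.05, the exact value is about 19.5 against the approximation 20; the expected false-alarm cost then follows as α·y/(λh).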

e- The expected cost of process repair and correction

This value is considered equal to W.

As a result, the expected net income is equal to

E(NIC) = V0/λ + V1[h(1/p − 1/2 + λh/12) + gn + d] − (E(T)/h)(b + nc) − αy/(λh) − W. (4)

In addition, the net income per hour can be obtained as follows:

E(NIH) = E(NIC)/E(T) = V0 − l. (5)

Here, l is a loss-cost function that can be obtained using Equations (3)–(5) as follows:

l = (b + cn)/h + (λMB + αy/h + λW)/(1 + λB).

Herein,

B = h(1/p − 1/2 + λh/12) + gn + d

and

M = V0 − V1.

Since in the next section we seek to combine the economic model with the loss function, and for this purpose the economic model should be expressed in terms of cost rather than income, we modify Equation (4) as follows

E(C) = C0/λ + C1[h(1/p − 1/2 + λh/12) + gn + d] + (E(T)/h)(b + nc) + αy/(λh) + W. (6)

Here, C0 and C1 are the average costs of the process when the process is in the in-control and out-of-control phases, respectively. Therefore, the average cost of the process per hour is equal to

E(CH) = E(C)/E(T). (7)

Accordingly, to perform the economic design of the considered control charts based on Duncan’s economic model, we determine the design parameters so that the cost function E(CH) is minimized. According to the above discussion, the input parameters of the economic design based on Duncan’s model are λ, C0, C1, g, d, c, b, y, and W.
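Equations (6) and (7) can be collected into a single cost-per-hour evaluation. The following Python sketch mirrors the symbols above; all numeric inputs are assumed for illustration, and an optimizer such as ABC would search over the design parameters:

```python
import math

def cost_per_hour(h, lcl, ucl, lam, theta0, delta,
                  g, n, d, C0, C1, b, c, y, W):
    # Average cost per hour of the quality cycle, Equations (6)-(7)
    theta1 = delta * theta0
    alpha = 1 - math.exp(-theta0 * lcl) + math.exp(-theta0 * ucl)  # false-alarm prob.
    p = 1 - math.exp(-theta1 * lcl) + math.exp(-theta1 * ucl)      # detection prob.
    B = h * (1 / p - 0.5 + lam * h / 12) + g * n + d               # out-of-control time
    ET = 1 / lam + B                                                # Equation (3)
    EC = (C0 / lam + C1 * B + (ET / h) * (b + n * c)
          + alpha * y / (lam * h) + W)                              # Equation (6)
    return EC / ET                                                  # Equation (7)

# Assumed illustrative inputs (not from the paper)
ech = cost_per_hour(h=1.0, lcl=0.00135, ucl=6.0, lam=0.05,
                    theta0=1.0, delta=0.25, g=0.05, n=1, d=2.0,
                    C0=10.0, C1=100.0, b=0.5, c=0.1, y=50.0, W=25.0)
```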

This optimization problem was coded in MATLAB and the results for different values of the input parameters and process shift were obtained using the ABC algorithm. The results are shown in .

The Economic-Statistical Design

As stated in the introduction, an economic-statistical design (ESD) results when statistical restrictions are applied to an economic design. In this subsection, we present an economic-statistical design by combining the two previous subsections: the average run length approach (a statistical criterion) and the modified Duncan economic model. To do so, we apply the constraints of the statistical design to the economic model. The optimization problem can be written as follows:

Min E(CH)
s.t.
ARL0 = 370
β ≤ 0.75. (8)

The optimization problem was coded in MATLAB and the results were obtained for different values of the input parameters and process shift using the ABC algorithm. The results are shown in .

The Loss Function and Its Combination with the Cost Model

As seen in the previous section, the cost function was based on fixed cost values. However, many hidden factors and unforeseen costs may arise during the process. Moreover, the cost of a defective product depends on how defective it is: it may be remedied with brief rework or may become scrap, so this cost is variable. To address this weakness, a loss function is selected according to the process and combined with the economic model.

One of the most common loss functions is Taguchi’s loss function. In (Taguchi Citation1986), Taguchi introduced a cost factor (k), representing the cost of rework and scrap, into the quadratic loss function.

The loss function is combined with the cost model as follows. If L(X) denotes the chosen loss function, and L0 and L1 denote the average costs of producing each defective product in the in-control and out-of-control conditions, respectively, then C0 and C1 in relation (6) become C0 = n·L0 and C1 = n·L1, respectively. Here, n is the number of products produced per hour, L0 = E_θ0[L(X)], and L1 = E_θ1[L(X)].

According to Taguchi’s philosophy of loss function, the greater the degree of deviation of the quality characteristic from the ideal value, the higher the social cost of quality. In other words, if we consider the target value for a quality characteristic as ξ, the quality loss becomes zero when the value of the quality characteristic is equal to ξ, and it is clear that the distance from ξ increases the cost.

With this description, Taguchi’s loss function was defined as L(X) = k(x − ξ)². Herein, k is the loss coefficient, equal to A/Δ², where Δ is the tolerance limit of x and A is the cost of reworking or scrapping a product unit. Given that the quality characteristics in the in-control and out-of-control states are exponentially distributed with probability density function f(x) = θe^(−θx) and parameters θ0 and θ1 = δθ0, respectively, we obtain

L0 = ∫[0, ∞) k(x − ξ)² f(x) dx
= k ∫[0, ∞) (x − 1/θ0 + 1/θ0 − ξ)² f(x) dx
= k[Var(X | θ0) + (1/θ0 − ξ)² + 2(1/θ0 − ξ)·E(X − 1/θ0)]
= k[Var(X | θ0) + (1/θ0 − ξ)²] = k(1/θ0² + (1/θ0 − ξ)²).

Similarly, under the out-of-control condition,

\[
L_1=k\left[\operatorname{Var}(X\mid\theta_1)+\left(\frac{1}{\theta_1}-\xi\right)^2\right]=k\left[\frac{1}{\theta_1^{2}}+\left(\frac{1}{\theta_1}-\xi\right)^2\right].
\]
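As a quick numerical check, the closed-form expressions for L0 and L1 can be verified against direct integration of the loss over the exponential density. The Python sketch below is illustrative only; the values of k, ξ, θ0, and δ are arbitrary placeholders, not values from the paper.

```python
import numpy as np
from scipy.integrate import quad

def taguchi_expected_loss(theta, k, xi):
    """Closed form of E[k (X - xi)^2] for X ~ Exp(rate=theta):
    k * (1/theta^2 + (1/theta - xi)^2)."""
    return k * (1.0 / theta**2 + (1.0 / theta - xi) ** 2)

def taguchi_expected_loss_numeric(theta, k, xi):
    # Direct numerical integration of k (x - xi)^2 * theta * exp(-theta x) over [0, inf)
    integrand = lambda x: k * (x - xi) ** 2 * theta * np.exp(-theta * x)
    val, _ = quad(integrand, 0, np.inf)
    return val

# Illustrative parameter values (not from the paper)
theta0, delta, k, xi = 1.0, 0.8, 100.0, 0.5
L0 = taguchi_expected_loss(theta0, k, xi)          # in-control expected loss
L1 = taguchi_expected_loss(delta * theta0, k, xi)  # out-of-control expected loss
```

Multiplying these expected losses by the hourly production rate n then yields the cost coefficients C0 = n·L0 and C1 = n·L1 used in the economic model.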

According to the above, rewriting relation (4) in terms of L0 and L1 gives

\[
E(IC)=\frac{nL_0}{\lambda}+nL_1\left(\frac{h}{1-p}-\frac{1}{2}+\frac{\lambda h}{12}+gn+d+E(T)\right)+\frac{b+nc}{h}+\frac{\alpha y}{\lambda h}+W. \tag{9}
\]

Now, the optimization problem can be written as follows:

\[
\begin{aligned}
&\operatorname{Min}\; E(IC)\\
&\text{s.t.}\\
&ARL_0 = 370\\
&\beta \le 0.75.
\end{aligned} \tag{10}
\]
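The quantities in these constraints follow directly from the no-signal probabilities of the chart. As an illustration only (assuming, for concreteness, a chart that plots the mean of n exponential observations, whose distribution is Gamma; the paper's exact charting statistic may differ), the ARLs can be computed as:

```python
from scipy.stats import gamma

def run_lengths(n, lcl, ucl, theta0, delta):
    """ARL0, ARL1, and beta for a chart on the mean of n Exp(rate=theta)
    observations. The sum of n exponentials is Gamma(n, scale=1/theta),
    so the sample mean is Gamma(a=n, scale=1/(n*theta))."""
    def p_no_signal(theta):
        g = gamma(a=n, scale=1.0 / (n * theta))
        return g.cdf(ucl) - g.cdf(lcl)
    p0 = p_no_signal(theta0)            # P(no signal | in control)
    beta = p_no_signal(delta * theta0)  # P(no signal | shifted): type II error
    return 1.0 / (1.0 - p0), 1.0 / (1.0 - beta), beta
```

With such a helper, the constraints ARL0 = 370 and β ≤ 0.75 (i.e., ARL1 ≤ 4) can be evaluated for any candidate design (h, n, LCL, UCL) inside the optimizer.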

This optimization problem was coded in MATLAB, and the results were obtained for different values of the input parameters and the process shift using the artificial bee colony algorithm. The results are presented in Table 6.

Sensitivity Analysis and Numerical Results

In this section, we will examine the effect of each input parameter on the designs, particularly focusing on the rate of process shift and its impact on the results. Additionally, we will demonstrate how to implement the designs using a real numerical example.

Optimization Method

The optimization problems of Equations 2, 7, 8, and 10 can be formulated as nonlinear constrained optimization problems, which we solve using the Artificial Bee Colony (ABC) algorithm to obtain optimal design parameters. In metaheuristic algorithms, only a range for each input parameter is specified, and within those ranges the algorithm seeks the best solution for the objective: it randomly selects values within the specified ranges and evaluates the objective at those values. Karaboga (2005) introduced ABC, which is inspired by the intelligent foraging behavior of honeybee swarms. ABC uses three types of bees: employed bees, onlooker bees, and scout bees. Employed bees remain on a food source and store information about its neighborhood. Onlooker bees receive information from employed bees and choose a food source from which to gather nectar. Scout bees are responsible for discovering new food sources. The ABC algorithm follows these steps:

  1. Initialization by moving the scouts.

  2. Movement of the onlookers.

  3. Scouts move only when the counters of employed bees reach their limits.

  4. Updating the memory.

  5. Checking the termination condition.

For more information on this topic, refer to Karaboga (2005), Karaboga and Akay (2009), and Karaboga and Basturk (2008).

In this article, the algorithm parameters used to solve the optimization problem are as follows:

  • Colony size (employed bees + onlooker bees): 20

  • Number of food sources (half the colony size): 10

  • Number of foraging cycles (stopping criterion): 30
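The procedure above can be sketched in a few dozen lines of Python. The following minimal implementation is a sketch of the standard ABC scheme, not the authors' code; the sphere function in the test stands in for the actual cost objective, and the trial `limit` for abandoning a food source is an assumed value.

```python
import numpy as np

def abc_minimize(f, lb, ub, n_food=10, n_cycles=30, limit=20, seed=0):
    """Minimal artificial bee colony: employed, onlooker, and scout phases."""
    rng = np.random.default_rng(seed)
    lb, ub = np.asarray(lb, float), np.asarray(ub, float)
    dim = lb.size
    foods = lb + rng.random((n_food, dim)) * (ub - lb)  # initial scout placement
    fit = np.array([f(x) for x in foods])
    trials = np.zeros(n_food, dtype=int)

    def neighbor(i):
        k = rng.integers(n_food - 1)
        k = k + (k >= i)                     # random partner index != i
        j = rng.integers(dim)                # perturb one coordinate
        x = foods[i].copy()
        x[j] += rng.uniform(-1, 1) * (foods[i, j] - foods[k, j])
        return np.clip(x, lb, ub)

    def try_replace(i):
        x = neighbor(i)
        fx = f(x)
        if fx < fit[i]:                      # greedy selection, reset counter
            foods[i], fit[i], trials[i] = x, fx, 0
        else:
            trials[i] += 1

    for _ in range(n_cycles):
        for i in range(n_food):              # employed-bee phase
            try_replace(i)
        q = 1.0 / (1.0 + fit - fit.min())    # onlooker phase: quality-proportional
        for i in rng.choice(n_food, size=n_food, p=q / q.sum()):
            try_replace(i)
        for i in np.where(trials >= limit)[0]:   # scout phase: abandon exhausted sources
            foods[i] = lb + rng.random(dim) * (ub - lb)
            fit[i] = f(foods[i])
            trials[i] = 0
    best = int(fit.argmin())
    return foods[best].copy(), float(fit[best])
```

For the designs in this paper, `f` would be the (penalized) cost or ARL objective and `lb`/`ub` the ranges of the design parameters (h, n, and the control limits).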

Sensitivity Analysis

To study the effect of each design parameter (c, b, W, g, d, λ, θ0, δ, k, C0, C1, and y) on the designs introduced in the previous sections (statistical, economic, and economic-statistical with and without Taguchi’s loss function), we considered 48 different combinations of levels for the design parameters. These values are presented in Table 2.

Table 2. Different scenarios considered for the input parameters of the design.

In rows 1 to 26 of Table 2, different values of the input parameters are given. For these scenarios, statistical and economic results are determined under the statistical design (the average run length approach), the economic design (Duncan’s modified economic model), the economic-statistical design (combining the average-run-length approach with the modified Duncan model), and the economic-statistical approach integrated with Taguchi’s loss function. In rows 27 to 33, we examine the effect of additional values of θ0 on the results. In rows 34 to 42, several process shifts (different values of the shift coefficient δ) are examined to evaluate the performance of each method. In rows 43 to 48, the effect of the cost coefficient of the loss function (k) on the outputs is investigated; these rows are not evaluated for the three designs that are not integrated with the loss function.

It should be noted that in the first row, which serves as a control, the fixed cost coefficients of the economic model (C0 and C1) and the cost coefficient of the loss function (k) were chosen so that, when no change occurs in the process (δ = 1), the process cost is equal under the two approaches (the integrated and non-integrated economic models).

Using the ABC algorithm for Equations 2, 7, 8, and 10, we computed the optimal values of the statistical, economic, and economic-statistical designs, with and without Taguchi’s loss function, for each row of Table 2. The results are presented in Tables 3–6, respectively.

Table 3. The optimal values, statistical and economic parameters under the statistical design.

Table 4. The optimal values, statistical and economic parameters under the economic design.

Table 5. The optimal values, statistical and economic parameters under the economic-statistical design.

Table 6. The optimal values, statistical and economic parameters under the economic-statistical design integrated with Taguchi’s loss function.

In Table 3, Min ARL1 is calculated using Equation 2, and the parameter values of the model for which ARL1 is minimized (i.e., ArgMin_{h,k1,k2} ARL1) are also given.

Table 4 displays the values of LOSS = Min ECH for the economic design, calculated using Equation 7. The table also provides the economic design parameters for which ECH is minimized (i.e., ArgMin_{h,k1,k2} ECH).

In Table 4, the value of ARL1 is calculated using Equation 1.

Table 5 displays the values of LOSS = Min ECH for the economic-statistical design, calculated using Equation 8. The table also provides the design parameters for which ECH is minimized (i.e., ArgMin_{h,k1,k2,ARL1} ECH).

Table 6 displays the values of LOSS = Min E(IC) for the economic-statistical design integrated with Taguchi’s loss function, calculated using Equation 10. The table also provides the design parameters for which E(IC) is minimized (i.e., ArgMin_{h,k1,k2,ARL1} E(IC)).

Based on the obtained values, we plotted the results of the four approaches against one another and compared their performance.

Figure 2 shows the values obtained for the statistical parameter (ARL1) under the statistical, economic, and economic-statistical schemes. As can be seen, the value of ARL1 in the statistical design (about one, which is very desirable) was smaller than in the economic and economic-statistical designs in all 42 studied cases. This means that, on average, an alarm is received at the first sampling step after the shift to the out-of-control state. The value of ARL1 in the economic-statistical design was close to the imposed limit (equal to 4), a desirable value in control charts. In the economic design this statistic was close to 9, varying across the cases with different values of θ0: it decreases as θ0 decreases.

Figure 2. Values obtained for the statistical parameter (ARL1) in the statistical, economic and economic-statistical designs.


In Figure 3, we compare the cost under the three designs. As can be seen, unlike in Figure 2, the economic design performed better than the statistical design. The decrease and increase of costs at points 23 and 24 correspond to the decrease and increase of in-control costs in the two cases examined in Table 2. The cost under the economic-statistical design was consistent with that of the economic design in almost all cases, meaning there is no significant difference between the economic and economic-statistical designs in terms of economic performance.

Figure 3. Values obtained for the economic parameter (cost) in the statistical, economic and economic-statistical designs.


Therefore, given its acceptable statistical results and its desirable economic results, the economic-statistical design is recommended for constructing the desired control chart.

Figures 4 and 5 compare the statistical and economic performance of the economic-statistical design combined with Taguchi’s loss function with those of the non-integrated design.

Figure 4. Values obtained for the statistical parameter (ARL1) in the economic-statistical design combined with Taguchi’s loss function and the design without application of the loss function.


Figure 5. Values obtained for the economic parameter (cost) in the economic-statistical design combined with Taguchi’s loss function and the design without application of the loss function.


As can be seen in Figure 4, the difference between the statistical performance of the two cases is not considerable, and the results are almost identical. Therefore, as expected, applying the loss function does not change the statistical properties of the control chart.

Figure 5 compares the economic performance of the two approaches and reveals a significant difference. At almost all levels, applying the loss function reduced the process cost; in a small number of cases, the cost increased.

We conjecture that the cost increases are due to changes in the process shift rate (the shift coefficient δ) and the value of the distribution parameter (θ0). To investigate this and find the main causes of these changes, we plotted the economic results of the two cases (the economic-statistical design combined with Taguchi’s loss function and the design without the loss function) against different values of δ and θ0 in Figure 6.

Figure 6. The effect of changes in θ0 and δ on the values obtained for the economic parameter (cost) in the economic-statistical design combined with Taguchi’s loss function and the design without application of the loss function.


First, we observed that the economic-statistical design without the loss function was insensitive to changes in the quality characteristic distribution parameter (θ0) and the process shift (δ). In practice, however, a direct relationship between the cost and the deviation of the quality characteristic from its target is expected and desirable; this is captured by the definition of the loss function (L0 and L1) and Equation (5). In the integrated model, increasing θ0 and δ directly increased the process cost.

An Illustrative Example

Lifetime may be regarded as a quality characteristic that requires control, for instance when defects occur rarely in a process. In such scenarios, the C-chart and U-chart cannot be used for defect counts; instead, the intervals between successive defect occurrences are monitored, and a control chart is constructed for these durations (see Montgomery 2020). One of the best distributions for fitting such data is the exponential distribution.

In this section, a distinct example of lifetime as a quality characteristic is introduced. The durability of items such as car seats, office chairs, and inflatable baby seats depends on the applied pressure: as weight (and thus pressure) increases, the lifespan of a seat cushion decreases. The following data pertain to the lifetime, in hours, of children’s inflatable seats produced in a workshop under weights exceeding 80 kg (Table 7).

Table 7. Lifespan of the child inflatable seat’s resistance to the weight placed on it.

We performed the Anderson–Darling goodness-of-fit test on the lifetime data of Table 7. Based on the test statistics and p-values, at a significance level of .05 the values conform to an exponential distribution (Anderson–Darling statistic = .512, N = 30, p-value = .473 > .05), whereas the assumption of normality is rejected (Anderson–Darling statistic = 1.116, N = 30, p-value = .005 < .05). Therefore, conventional control charts based on the assumption of a normally distributed quality characteristic are not at all appropriate for these data, and it is better to use the designs proposed in this article.

Given that the mean and standard deviation of the lifetime values are approximately 1 and .87, respectively, we fit an exponential distribution to the data with rate parameter θ0 = 1 (i.e., mean 1/θ0 = 1).
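This goodness-of-fit check can be reproduced in Python with `scipy.stats.anderson`, which supports both the exponential and normal cases. Note that SciPy reports critical values rather than p-values, and the data below are a simulated stand-in for the Table 7 lifetimes, which are not reproduced here:

```python
import numpy as np
from scipy.stats import anderson

rng = np.random.default_rng(7)
lifetimes = rng.exponential(scale=1.0, size=30)  # stand-in for the Table 7 data

res_exp = anderson(lifetimes, dist='expon')
res_norm = anderson(lifetimes, dist='norm')

# Reject the hypothesized distribution at the 5% level when the statistic
# exceeds the corresponding critical value.
crit5_exp = res_exp.critical_values[np.where(res_exp.significance_level == 5.0)[0][0]]
crit5_norm = res_norm.critical_values[np.where(res_norm.significance_level == 5.0)[0][0]]
fits_exponential = res_exp.statistic < crit5_exp
fits_normal = res_norm.statistic < crit5_norm
```

Running the same comparison on the actual Table 7 values would reproduce the conclusion above: the exponential fit is retained and normality is rejected at the 5% level.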

Assume the model parameter values are c = 2.3, b = 30, W = 400, g = 20, d = 50, λ = 0.01, θ0 = 1, δ = 0.8, k = 100, C0 = 50, C1 = 1000, and y = 100. With this information, the statistical, economic, and economic-statistical (with and without Taguchi’s loss function) control charts designed on an exponential basis are as shown in Table 8.

Table 8. Results obtained for numerical example.

Based on Table 8, if one does not want to use an economic design for any reason, the alternative is the statistical design. The best option in that case is to take samples of size 27 from the process every 5.35 hours, with LCL = .11 and UCL = 1.54. In this scenario, if the process distribution parameter θ0 changes from 1 to .8, the chart will detect the change after approximately 1.03 sampling iterations on average.

If one does want to use an economic design, then, since the economic-statistical design with Taguchi’s loss function has the minimum LOSS among the designs, the best option is to take samples of size 36 every 7.92 hours, with LCL = .18 and UCL = 2.44. In this case, if θ0 changes from 1 to .8, the chart will detect the change after an average of 4 sampling rounds, and the average cost per hour of the process will be $554.29.
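In day-to-day use, applying the resulting chart amounts to comparing each sample statistic with the optimal limits. A minimal sketch follows; it assumes, purely as an illustration, that the plotted statistic is the sample mean, with the LCL and UCL taken from the loss-function design above:

```python
import numpy as np

def chart_signal(sample, lcl=0.18, ucl=2.44):
    """Return (statistic, out_of_control) for one sample of lifetimes.
    Assumes the plotted statistic is the sample mean; limits are from the
    economic-statistical design with Taguchi's loss function."""
    xbar = float(np.mean(sample))
    return xbar, not (lcl <= xbar <= ucl)
```

Every 7.92 hours a sample of 36 lifetimes would be collected and passed to `chart_signal`; an out-of-control flag triggers the search for an assignable cause.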

Conclusion

In the construction of control charts, the distribution of quality characteristics is usually considered to be normal. However, this assumption is not valid for many datasets, especially those with high skewness. In such cases, a suitable alternative is the exponential distribution. This distribution is used in many fields, including reliability engineering, queueing theory, telecommunications, physics, finance, biology, and epidemiology.

Therefore, given the importance and applications of the subject, and contrary to past work (which used approximations to normalize the characteristics, introducing errors), this paper constructs control charts based on statistical, economic, and economic-statistical designs for the exponential distribution directly, without any approximations.

Motivated by the useful results of applying loss functions to economic models, Taguchi’s loss function was combined with the economic model, and the output parameters were obtained under four approaches: statistical, economic, economic-statistical integrated with the loss function, and economic-statistical without it. The statistical design was based on controlling the average run length of the process, and Duncan’s modified economic model was used to construct the economic and economic-statistical designs. The results showed that the economic-statistical design produced quite favorable economic results together with statistically acceptable ones. Additionally, the results confirmed the initial assumption that applying the loss function makes the calculation of cost in the control chart more accurate and realistic, since the greater the deviation of the quality characteristic from the ideal value, the higher the cost of the process; this feature can be established only by applying the loss function. Building a control process on the basis of this article can be useful for industries with exponentially distributed characteristics.

Also, there are other distributions that can be considered for further improvements. Some of these distributions include:

  • Gamma distribution: This distribution is often used to model the time between events or the size of certain objects.

  • Log-normal distribution: This distribution is often used to model variables that are positive and skewed, such as income or prices.

  • Weibull distribution: This distribution is often used to model failure times in reliability analysis.

  • Beta distribution: This distribution is often used to model proportions or probabilities.

By considering these different distributions, one may be able to improve the accuracy of control charts and better detect any deviations from the expected process behavior. However, it is important to note that selecting the appropriate distribution requires careful consideration of the underlying process and the characteristics of the data being analyzed.

Author Contributions

Conceptualization, M.T. and A.A.H.; Data curation, S.H.Z.A.-T.; Formal analysis, M.T.; Investigation, A.A.H.; Methodology, M.T. and A.A.H.; Software, M.T. and A.A.H.; Visualization, S.H.Z.A.-T.; Writing – original draft, S.H.Z.A.-T.; Writing – review and editing, M.T. and A.A.H. All authors have read and agreed to the published version of the manuscript.

Disclosure Statement

No potential conflict of interest was reported by the author(s).

Data Availability Statement

The authors confirm that the data supporting the findings of this study are available within the article.

Additional information

Funding

This research received no external funding.

References

  • Al-Nuaami, W. A. H., A. Akbar Heydari, and H. Jabbari Khamnei. 2023. The Poisson–Lindley distribution: Some characteristics, with its application to SPC. Mathematics 11 (11):2428.
  • Aslam, M., M. Azam, N. Khan, and C.-H. Jun. 2015. A control chart for an exponential distribution using multiple dependent state sampling. Quality & Quantity 49:455–32.
  • Bai, D. S., and I. S. Choi. 1995. X and R control charts for skewed populations. Journal of Quality Technology 27 (2):120–31.
  • Banerjee, P. K., and M. A. Rahim. 1988. Economic design of X̄-control charts under Weibull shock models. Technometrics 30 (4):407–14.
  • Celano, G., A. Faraz, and E. Saniga. 2014. Control charts monitoring product’s loss to society. Quality and Reliability Engineering International 30 (8):1393–407.
  • Chakraborti, S., N. Kumar, A. C. Rakitzis, and R. S. Sparks. 2014. Time between events monitoring with control charts. Wiley StatsRef: Statistics Reference Online 1–13.
  • Choobineh, F., and J. L. Ballard. 1987. Control-limits of QC charts for skewed distributions using weighted-variance. IEEE Transactions on Reliability 36 (4):473–77.
  • Duncan, A. J. 1956. The economic design of X charts used to maintain current control of a process. Journal of the American Statistical Association 51 (274):228–42.
  • Duncan, A. J. 1971. The economic design of X̄-charts when there is a multiplicity of assignable causes. Journal of the American Statistical Association 66 (333):107–21.
  • Faraz, A., R. B. Kazemzadeh, and E. Saniga. 2010. Economic and economic statistical design of T 2 control chart with two adaptive sample sizes. Journal of Statistical Computation and Simulation 80 (12):1299–316.
  • Figueiredo, F., and M. Ivette Gomes. 2013. The skew-normal distribution in SPC. REVSTAT-Statistical Journal 11 (1):83–104.
  • Ghanaatiyan, R., A. Amiri, and F. Sogandi. 2017. Multi-objective economic-statistical design of VSSI-MEWMA-DWL control chart with multiple assignable causes. Journal of Industrial and Systems Engineering 10:34–58. special issue on Quality Control and Reliability).
  • Heydari, A. A., M. Tavakoli, and A. Rahim. 2023. An integrated model of maintenance policies and economic design of X-bar control chart under burr XII shock model. Iranian Journal of Science 1–9.
  • Karaboga, D. 2005. An idea based on honey bee swarm for numerical optimization. Technical Report TR06, Erciyes University, Engineering Faculty, Computer Engineering Department.
  • Karaboga, D., and B. Akay. 2009. A comparative study of artificial bee colony algorithm. Applied Mathematics and Computation 214 (1):108–32.
  • Karaboga, D., and B. Basturk. 2008. On the performance of artificial bee colony (ABC) algorithm. Applied Soft Computing 8 (1):687–97.
  • Khadem, Y., and M. Bameni Moghadam. 2019. Economic statistical design of X‾-control charts: Modified version of Rahim and Banerjee (1993) model. Communications in Statistics-Simulation and Computation 48 (3):684–703.
  • Kumar, N. 2022. Statistical design of phase II exponential chart with estimated parameters under the unconditional and conditional perspectives using exact distribution of median run length. Quality Technology & Quantitative Management 19 (1):1–18.
  • Kumar, N., S. Chakraborti, and P. Castagliola. 2022. Phase II exponential charts for monitoring time between events data: Performance analysis using exact conditional average time to signal distribution. Journal of Statistical Computation and Simulation 92 (7):1457–86.
  • Lorenzen, T. J., and L. C. Vance. 1986. The economic design of control charts: A unified approach. Technometrics 28 (1):3–10.
  • Montgomery, D. C. 2020. Introduction to statistical quality control. Hoboken, New Jersey: John Wiley & Sons.
  • Morales, V. H., and C. Arturo Panza. 2022. Control charts for monitoring the mean of skew-normal samples. Symmetry 14 (11):2302.
  • Nelson, L. S. 1994. A control chart for parts-per-million nonconforming items. Journal of Quality Technology 26 (3):239–40.
  • Pasha, M. A., M. Bameni Moghadam, S. Fani, and Y. Khadem. 2018. Effects of quality characteristic distributions on the integrated model of Taguchi’s loss function and economic statistical design of X̄-control charts by modifying the Banerjee and Rahim economic model. Communications in Statistics-Theory and Methods 47 (8):1842–55.
  • Pasha, M. A., M. Bameni Moghadam, Y. Khadem, and S. Fani. 2017. An integration of Taguchi’s loss function in Banerjee–Rahim model for the economic and economic statistical design of X̄-control charts under multiple assignable causes and Weibull shock model. Communications in Statistics-Theory and Methods 46 (24):12113–29.
  • Prabhu, S. S., D. C. Montgomery, and G. C. Runger. 1997. Economic-statistical design of an adaptive X chart. International Journal of Production Economics 49 (1):1–15.
  • Safaei, A. S., R. Baradaran Kazemzadeh, and S. T. A. Niaki. 2012. Multi-objective economic statistical design of X-bar control chart considering Taguchi loss function. International Journal of Advanced Manufacturing Technology 59:1091–101.
  • Saniga, E. M. 1989. Economic statistical control-chart designs with an application to X̄ and R charts. Technometrics 31 (3):313–20.
  • Santiago, E., and J. Smith. 2013. Control charts based on the exponential distribution: Adapting runs rules for the t chart. Quality Engineering 25 (2):85–96.
  • Taguchi, G. 1979. Introduction to off-line quality control. Central Japan Quality Control Assoc.
  • Taguchi, G. 1986. Introduction to quality engineering: Designing quality into products and processes. Tokyo, Japan.
  • Taguchi, G., E. A. Elsayed, and T. C. Hsiang. 1988. Quality engineering in production systems. New York: McGraw-Hill.
  • Taji, J., H. Farughi, and H. Rasay. 2022. Economic-statistical design of fully adaptive multivariate control charts under effects of multiple assignable causes. Computers & Industrial Engineering 173:108676.
  • Tavakoli, M., and A. A. Heydari. 2021. Control chart for exponential individual samples with adaptive sampling interval method based on economic statistical design: An extension of costa and Rahim’s model. Communications in Statistics-Theory and Methods 52 (14):4993–5009.
  • Tavakoli, M., R. Pourtaheri, and M. B. Moghadam. 2017. Economic and economic–statistical designs of VSI bayesian control chart using Monte Carlo method and ABC algorithm. Journal of Statistical Computation and Simulation 87 (4):766–76.
  • Wan, Q. 2020. Economic-statistical design of integrated model of VSI control chart and maintenance incorporating multiple dependent state sampling. Institute of Electrical and Electronics Engineers Access 8:87609–20.
  • Xie, M., T. Ngee Goh, and P. Ranjan. 2002. Some effective control chart procedures for reliability monitoring. Reliability Engineering & System Safety 77 (2):143–50.
  • Yang, S. F., and M. A. Rahim. 2005. Economic statistical process control for multivariate quality characteristics under Weibull shock model. International Journal of Production Economics 98 (2):215–26.
  • Zhang, G., and V. Berardi. 1997. Economic statistical design of X control charts for systems with Weibull in-control times. Computers & Industrial Engineering 32 (3):575–86.
  • Zhang, M., F. M. Megahed, and W. H. Woodall. 2014. Exponential CUSUM charts with estimated control limits. Quality and Reliability Engineering International 30 (2):275–86.