Related Links:
R Explained Page
CUSUM Charts Explained Page
Introduction
Example R Code
Example Explained
This page provides explanations and example R code for CUSUM quality control charts that detect changes in the variance of normally distributed measurements.
CUSUM Generally
CUSUM is a set of statistical procedures used in quality control. CUSUM stands for Cumulative Sum of Deviations.
In any ongoing process, be it the manufacture or delivery of products and services, once the process is established and running, the outcome should be stable and within defined limits near a benchmark. The situation is then said to be In Control.
When things go wrong, the outcomes depart from the defined benchmark, and the situation is said to be Out of Control.
In some cases, things go catastrophically wrong, and the outcomes depart from the benchmark in a dramatic and obvious manner, so that investigation and remedy follow. For example, a gear in an engine may fracture, causing the machine to seize. An example in health care is the employment of an unqualified fraud as a surgeon, followed by a sudden and massive increase in mortality and morbidity.
The detection of catastrophic departures from the benchmark is usually done with the Shewhart Chart, not covered on this site. Typically, some statistically improbable outcome, such as two consecutive measurements outside 3 Standard Deviations, or 3 consecutive measurements outside 2 Standard Deviations, is used to trigger an alarm that all is not well.
In many instances, however, the departures from the outcome benchmark are gradual and small in scale, and these are difficult to detect. Examples are changes in the size and shape of products caused by the progressive wearing out of machinery parts, reduced success rates over time as experienced staff are gradually replaced by novices in a work team, and increases in client complaints to a service department following a loss of adequate supervision.
CUSUM is a statistical process of sampling the outcome and summing departures from the benchmark. When the situation is in control, the departures caused by random variations cancel each other numerically. In the out of control situation, departures from the benchmark tend to be unidirectional, so that the sum of departures accumulates until it becomes statistically identifiable.
The mathematical process for CUSUM is in 2 parts. The common part is the summation of departures from the benchmark (the CUSUM) and its graphical presentation; a minimal sketch of this accumulation is shown after the list below. The unique part is the calculation of the decision interval, abbreviated as DI or h, and the reference value, abbreviated as k, which continuously adjusts the CUSUM and its variance. The two values h and k depend on the following parameters
- The in control values
- The out of control values
- The Type I Error or false positive rate, expressed as the Average Run Length, abbreviated as ARL: the number of samples expected before a false positive decision when the situation is in control. ARL is the inverse of the false positive rate; a false positive rate of 1% corresponds to ARL=100
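The accumulation described above can be illustrated with a minimal sketch. This is a generic one tail (upward) CUSUM, not the variance specific program presented later on this page, and the data, k, and h values are made up for illustration.
# A generic upward CUSUM sketch: sum the departures from the benchmark,
# resetting at zero, and raise an alarm when the sum exceeds h.
# All values here are made up for illustration.
x <- c(0.2, 0.5, 0.6, 0.7, 0.9)  # departures from the benchmark
k <- 0.25                        # reference value
h <- 1.0                         # decision interval
s <- 0
for (xi in x) {
  s <- max(0, s + xi - k)        # in control departures cancel out
  if (s > h) cat("Alarm at CUSUM =", s, "\n")
}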
CUSUM for Normally Distributed Variance
Quality control using variance is used under two circumstances. Firstly, it is argued that it should be used in conjunction with CUSUM for mean values, as any change in variance may affect the mean values. Secondly, in some situations the change in variation is itself important. An example is the quality control of packaged products such as food, where the amount should be controlled with discipline, as too much reduces profit and too little results in rejection. In this situation, both the mean and the variance require continuous quality control.
The terms variance and Standard Deviation are both used when considering CUSUM for variance. Standard Deviation is the square root of variance, and statisticians often use both interchangeably. In CUSUM for variance, the input parameters and data are in Standard Deviation units. All calculations and results however are in variance units, and users need to distinguish the two so as not to be confused.
The parameters required are
- The sample size (ssiz) used to calculate each Standard Deviation value
- The Standard Deviation of the measurement when the situation is in control.
- The Standard Deviation when the situation is out of control. This is not so much the expected value as the departure that is big enough to warrant investigation and intervention.
- The Average Run Length (ARL). This depends on a balance between the importance of detecting a deviation and the cost of disruption in case of a false positive. Please note: the algorithm on this page is intended for one tail monitoring, of either an increase or a decrease in the value. If the user intends two tail monitoring, to detect either an increase or a decrease, two one tail CUSUM charts should be created; because false alarms from the two charts combine, the pair has roughly half the ARL of each individual chart, so each chart should be designed with double the intended overall ARL. A sketch of this two chart design follows this list.
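The following is a minimal sketch of the two chart design, using the getH function from the CUSUMdesign package described later on this page. The downward out of control value of 9.5 is made up for illustration.
library(CUSUMdesign)
targetARL <- 100  # intended overall ARL of the two tail scheme
# upward chart, to detect an increase in Standard Deviation
up <- getH(distr=2, ICsd=10, OOCsd=10.5, samp.size=10, ARL=2*targetARL, type="F")
# downward chart, to detect a decrease (OOCsd=9.5 is a made up value)
down <- getH(distr=2, ICsd=10, OOCsd=9.5, samp.size=10, ARL=2*targetARL, type="F")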
Details of how the analysis is done, and of the results, are described in the panel Example Explained.
References
Hawkins DM, Olwell DH (1997) Cumulative sum charts and charting for quality improvement. Springer-Verlag, New York. ISBN 0-387-98365-1. p 47-74, 141-142.
Hawkins DM (1992) Evaluation of average run lengths of cumulative sum charts for an arbitrary data distribution. Communications in Statistics - Simulation and Computation 21(4):1001-1020.
https://cran.r-project.org/web/packages/CUSUMdesign/index.html
https://cran.r-project.org/web/packages/CUSUMdesign/CUSUMdesign.pdf
This panel presents the R code in full, so that it can be copied and pasted into RStudio as a template. Detailed explanations of each step are in the next panel.
For those unfamiliar with R, basic information on setting up R and common R procedures can be found in the R Explained Page
# Normal Variance
# Step 1: parameters and data
inControlSD = 10
outOfControlSD = 10.5
ssiz = 10
arl = 100
theModel = "F" #F for FIR, Z for zero, S for steady state
dat=c(
10.3,9.6,8.7,12.4,10.5,11.5,7.2,10.2,7.9,10.3,11.4,10.7,11.1,10.5,8,10.4,9.6,
9.5,12.8,12,7.2,10.9,9,8.8,9.4,10.5,8,9.7,12.4,9.3,7.2,8.2,8,6.6,13.5,9.2,
9.8,11.5,9.6,8.6,8.6,9.9,12.6,10.7,10.6,10.5,10.1,10.3,7.8,9.9,10.5,12.7,9.5,
9.6,10.6,12.7,8.8,8.9,7.9,10.9,10.2,12.3,7.5,13.6,9,9,11.6,13.1,7.6,9.3,10,
11.2,8.9,8.8,10.2,10.7,12.3,8.7,9.4,9.6,11.3,12.3,10.2,10.9,10.9,10.3,10.6,10.7,
13.6,9.7,12.3,12.8,9,11.3,9,10.2,11.6,8.2,10.8,9.9
)
# Step 2: Calculate k and h
#install.packages("CUSUMdesign") # if not already installed
library(CUSUMdesign)
result <- getH(distr=2, ICsd=inControlSD, OOCsd=outOfControlSD, samp.size=ssiz, ARL=arl, type=theModel)
k = result$ref
h = result$DI
if(outOfControlSD<inControlSD)
{
h = -h
}
cat("Reference Value k=",k,"\tDecision Interval h=", h, "\n")
# Step 3 Calculate and plot CUSUM
# Step 3a: Calculate CUSUM values
cusum <- vector()
cusumValue = 0
if(theModel=="F")
{
cusumValue = h / 2
}
for(i in 1 : length(dat))
{
cusumValue = cusumValue + dat[i]^2 - k # SD->variance
if(outOfControlSD>inControlSD) # Up
{
if(cusumValue<0)
{
cusumValue = 0
}
}
else # down
{
if(cusumValue>0)
{
cusumValue = 0
}
}
cusum[i] = cusumValue
}
# Step 3b: Plot CUSUM
plot(cusum,type="l")
abline(h=h)
# Step 4: Optional export of results
#myDataFrame <- data.frame(dat,cusum) #combine dat and cusum to dataframe
#myDataFrame #display dataframe
#write.csv(myDataFrame, "CusumVariance.csv") # write dataframe to .csv file
The example is a made up one to demonstrate the numerical process, and the data were generated by computer. It purports to be from a quality control exercise in the manufacture of ball bearings.
- When everything is working normally, we expect our ball bearings to weigh 100g, with a Standard Deviation of 10g.
- The machine making the ball bearings, however, wears out, and when this happens, the variation in the weight of the ball bearings progressively widens. We would like to trigger an alarm and service the machinery when the Standard Deviation of the weights becomes 10.5 or more, an increase of 0.5 from the in control Standard Deviation of 10.
- We estimate the Standard Deviation 5 times a day, using 10 measurements each time, and we do not want a false alarm more frequently than every 20 days, so our Average Run Length is ARL = 5x20 = 100. A sketch of how each Standard Deviation value might be calculated follows this list.
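As a minimal sketch with made up numbers, each value in the program's data vector dat is the Standard Deviation of one such sample of 10 weights:
# One sample of 10 ball bearing weights in grams (made up values)
weights <- c(101.2, 95.4, 108.8, 99.1, 102.6, 90.3, 97.5, 104.9, 110.2, 93.7)
sd(weights)  # a value of this kind is entered into dat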
The data are entered in Step 1. This is the only part of the program that needs any editing.
# Step 1: parameters and data
inControlSD = 10
outOfControlSD = 10.5
ssiz = 10
arl = 100
theModel = "F" #F for FIR, Z for zero, S for steady state
dat=c(
10.3,9.6,8.7,12.4,10.5,11.5,7.2,10.2,7.9,10.3,11.4,10.7,11.1,10.5,8,10.4,9.6,
9.5,12.8,12,7.2,10.9,9,8.8,9.4,10.5,8,9.7,12.4,9.3,7.2,8.2,8,6.6,13.5,9.2,
9.8,11.5,9.6,8.6,8.6,9.9,12.6,10.7,10.6,10.5,10.1,10.3,7.8,9.9,10.5,12.7,9.5,
9.6,10.6,12.7,8.8,8.9,7.9,10.9,10.2,12.3,7.5,13.6,9,9,11.6,13.1,7.6,9.3,10,
11.2,8.9,8.8,10.2,10.7,12.3,8.7,9.4,9.6,11.3,12.3,10.2,10.9,10.9,10.3,10.6,10.7,
13.6,9.7,12.3,12.8,9,11.3,9,10.2,11.6,8.2,10.8,9.9
)
Step 1 contains the parameters and the data. This is the part the user can edit, changing the values to those required for his/her own analysis.
The first 4 lines set the parameters required for the analysis.
Please note: although the CUSUM is for variance, the parameters entered here are the in control and out of control Standard Deviations.
The 5th line sets the model, which has 3 options determining the first value of the CUSUM
- F means Fast Initial Response (FIR), where the initial CUSUM value is set at half of the Decision Interval h. The rationale is that, if the situation is in control, the CUSUM will gradually drift towards zero, but if the situation is already out of control, an alarm will be triggered early. The downside is that a false alarm is slightly more likely early on in the monitoring. As FIR is recommended by Hawkins, it is set as the default option.
- Z is for zero, and the CUSUM starts at the baseline value of 0. This lowers the risk of a false alarm in the early stages of monitoring, but detects the out of control situation more slowly if it already exists at the beginning.
- S is for steady state, intended for when monitoring is already ongoing and a new plot is being constructed. The CUSUM starts at the value where the previous chart ended.
- Each model makes minor changes to the value of the decision interval h. The setting of the initial value mostly determines how quickly an alarm can be triggered if the out of control situation exists from the beginning. A sketch of how the model choice sets the starting value is shown below.
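As a sketch only, the starting value implied by each model can be written as follows. The variable prevEnd is hypothetical, standing for the final CUSUM value of a previous chart; note that the program on this page only implements the F and Z starts.
prevEnd <- 12.3  # hypothetical final value of a previous chart
startValue <- switch(theModel,
  "F" = h / 2,   # Fast Initial Response: start at half the decision interval
  "Z" = 0,       # zero: start at the baseline
  "S" = prevEnd) # steady state: resume where the previous chart ended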
The last part, c(), is a function that creates a vector (array) containing the comma separated values within the brackets. Here, dat is the name of the vector, which contains 100 measurements of ball bearing Standard Deviations, each calculated from the weights of 10 ball bearings. A quick check of the count is shown below.
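The built in length() function confirms how many values were entered:
length(dat)  # returns 100, the number of Standard Deviation values entered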
The remainder of the program does not require any editing or change by the user, unless he/she wishes to alter the program for specific purposes.
# Step 2: Calculate k and h
#install.packages("CUSUMdesign") # if not already installed
library(CUSUMdesign)
result <- getH(distr=2, ICsd=inControlSD, OOCsd=outOfControlSD, samp.size=ssiz, ARL=arl, type=theModel)
k = result$ref
h = result$DI
if(outOfControlSD<inControlSD)
{
h = -h
}
cat("Reference Value k=",k,"\tDecision Interval h=", h, "\n")
Step 2 performs the statistical calculations using the parameters entered. The package CUSUMdesign needs to be already installed, and the library activated each time the program is used.
result is the object that contains the results of the analysis. The results required by this program are the reference value (k) and the decision interval (h). Please note that h is calculated as a positive value. If the CUSUM is designed to detect a decrease from the in control value, then h needs to be changed to a negative value.
The last line displays the results we need:
Reference Value k= 104.9584 Decision Interval h= 87.97868
Please Note: although the input parameters were in Standard Deviations, k and h are now in variance units, and variance is the square of the Standard Deviation. All subsequent output is in terms of variance, not Standard Deviation. A quick numerical check is shown below.
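As a quick numerical check that k is on the variance scale, it should lie between the in control and out of control variances:
inControlSD^2     # 100, the in control variance
outOfControlSD^2  # 110.25, the out of control variance
# k = 104.9584 lies between the two, confirming it is in variance units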
Step 3 is divided into 2 parts. Step 3a calculates the cusum vector, and 3b plots the vector and h in a graph.
# Step 3a: Calculate CUSUM values
cusum <- vector()
cusumValue = 0
if(theModel=="F")
{
cusumValue = h / 2
}
for(i in 1 : length(dat))
{
cusumValue = cusumValue + dat[i]^2 - k # SD->variance
if(outOfControlSD>inControlSD) # Up
{
if(cusumValue<0)
{
cusumValue = 0
}
}
else # down
{
if(cusumValue>0)
{
cusumValue = 0
}
}
cusum[i] = cusumValue
}
The first 6 lines of code in step 3a create the empty cusum vector and set the initial CUSUM value. The remaining code calculates the CUSUM value for each measurement and places it in the cusum vector.
Please Note: the data entered are Standard Deviations; each value is squared to become a variance before the CUSUM value is calculated. An equivalent, more compact formulation is sketched below.
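The same accumulation can also be written as a reusable function. This is a sketch only, equivalent in behaviour to the loop above; the function name runCusum is made up.
runCusum <- function(x, k, h, up = TRUE, fir = FALSE) {
  s <- if (fir) h / 2 else 0               # FIR starts at half the decision interval
  out <- numeric(length(x))
  for (i in seq_along(x)) {
    s <- s + x[i]^2 - k                    # square the SD to get variance
    s <- if (up) max(s, 0) else min(s, 0)  # reset at the baseline
    out[i] <- s
  }
  out
}
# gives the same result as the cusum vector above:
# runCusum(dat, k, h, up = outOfControlSD > inControlSD, fir = theModel == "F")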
# Step 3b: Plot the cusum vector and h
plot(cusum,type="l")
abline(h=h)
In step 3b, the first line plots the cusum vector, and the second line draws the decision interval h as a horizontal line. An alarm is indicated wherever the CUSUM plot crosses this line; a sketch for highlighting these crossing points follows.
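As an optional extension, not part of the original program, the samples where the CUSUM crosses the decision interval can be highlighted on the existing plot:
# Optional: mark the samples where the CUSUM crosses the decision interval
alarms <- if (h > 0) which(cusum > h) else which(cusum < h)
if (length(alarms) > 0) points(alarms, cusum[alarms], col = "red", pch = 19)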
# Step 4: Optional export of results
#myDataFrame <- data.frame(dat,cusum) #combine dat and cusum to dataframe
#myDataFrame #display dataframe
#write.csv(myDataFrame, "CusumVariance.csv") # write dataframe to .csv file
Step 4 is optional, and in fact commented out; it is included as a template only. Each line can be activated by removing the leading #.
The first line places the two vectors, dat and cusum, together into a dataframe.
The second line displays the dataframe, along with row numbers, in the console, from which it can be copied and pasted into other applications for further processing.
The third line saves the dataframe as a comma delimited .csv file. This is needed if the data is too large to handle by copy and paste from the console.
Please note: RStudio writes files to the User/Document/ folder by default. The path needs to be reset if the user wishes to save files to a specific folder; this is discussed in the file I/O panel of the R Explained Page. A sketch is shown below.
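As a hypothetical sketch (the folder path below is illustrative only), the working folder can be inspected and changed before writing the file:
# getwd() shows the current working folder, setwd() changes it
# (the path below is hypothetical - replace it with a real folder)
# getwd()
# setwd("C:/MyQCProject")
# write.csv(myDataFrame, "CusumVariance.csv")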