Quality engineering requires systematic
experimentation with carefully developed prototypes whose performance is
tested under actual field conditions. The objective of quality engineering is
to discover the optimum values of various design parameters that ensure
consistent performance of the product or process in actual use. Over the years,
the concept of quality has undergone several paradigm shifts to accommodate
various end-use characteristics; the focus of all these concepts, however,
has remained customer satisfaction.
Measuring the fraction of products falling outside
the specified limits, a measure of the quality loss due to scrap, fails
miserably as a predictor of customer satisfaction. Tightening the specification
limits, improving on-line quality control to bring units
closer to the target, and inspecting more samples in order to catch
defective products before they reach the customers have all been widely
practised in controlling the quality of various products. But such exercises
are often not considered good options, since they address the symptoms
rather than the root cause of the problems.
Many experimental designs have been
demonstrated in the field of experimental statistics for the characterisation
and optimisation of various design factors. Typically, the performance of
any process or product is affected by a multitude of factors, and over 70% of
malfunctions are traceable to the design factors. Since not every aspect of
performance can be predicted by theory, experimentation or prototyping
is resorted to for the empirical optimisation of processes and products.
Quality in Taguchi’s View
Genichi Taguchi, a Japanese engineer,
studied various methods of Design of Experiments at the Indian Statistical
Institute in the 1950s and later applied them in a very creative manner to
improve product and process designs. In 1982, the American Supplier Institute
first introduced Taguchi methods in the USA, and his methods were popularised by
Madhav Phadke and Raghu Kackar of Bell Laboratories. Taguchi defines
quality as “the losses a product imparts to the society from the time
it is supplied”. Taguchi’s contributions to quality engineering include the
loss function associated with a product or process, robust design, and
simplified statistical experiments using orthogonal arrays.
Taguchi methodology states that even the
best available manufacturing technology is by itself no assurance that
the final product will actually function as desired in the hands of its users,
and so it strongly advocates engineering products for robust
performance [1, 2, 3]. Taguchi condensed the entire concept into two basic ideas:
quality should be measured by the deviation from a specified target
value rather than by conformance to preset tolerance limits, and quality
cannot be ensured through inspection and rework but must be built in
through appropriate design of the process and product. The first idea
underlines the basic difference between Taguchi methods and statistical
process control (SPC) methodology: while SPC emphasises keeping an
attribute within the tolerance range, Taguchi methods emphasise attaining
the specified target value and eliminating variation.
The essence of the loss function advocated
by Taguchi can be stated as follows: deviation of a product from its target
performance generates a loss to society that varies with the
extent of the deviation. The loss is minimum when the performance
coincides with the target and increases gradually as it deviates from the
target. Taguchi’s loss function establishes a financial measure of the
user dissatisfaction with a product’s performance as it deviates from the
target value, an aspect often overlooked by other experimental designs [3].
Taguchi believed that the customer
becomes increasingly dissatisfied as the performance departs farther
from the target. According to Taguchi, the cost of quality in relation to
the deviation from the target is not linear, because the customer’s
frustration increases at a faster rate as more defects are found in a
particular product, and he proposed a quadratic curve to represent the
customer’s dissatisfaction with a product’s performance (Figure 1). Based on a
Taylor series approximation, the loss function increases as the quality
characteristic deviates on either side of the target value [4, 5].
Figure 1. Taguchi’s Loss Function
If y is the performance characteristic
measured on a continuous scale with target performance level t, then the
loss caused, L(y), is given by a quadratic function:

L(y) = k (y − t)², with k = Δ / m²

where Δ is the cost of a defective product and m is the half-tolerance,
T − LSL or USL − T. The average loss is proportional to the mean squared
error of y about its target value t; for n products it is given by

L = k [ (µ − t)² + s² ]

where µ and s² are the mean and variance of the n observations.
Thus the average loss caused by
variability has two components:
1. The average performance µ being different from the target t
contributes the loss k(µ − t)².
2. The loss ks² results from the performance yi of the individual items
differing from their own average µ.
The ideal condition is µ = t and s² = 0. The quadratic loss function
provides the necessary information (S/N) to achieve effective quality control.
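The loss-function arithmetic above can be sketched in a few lines of Python. The target, half-tolerance and defect cost are illustrative numbers, not values from the article:

```python
# A minimal sketch of Taguchi's quadratic loss function. The target t,
# half-tolerance m and defect cost delta are illustrative assumptions.

def loss_constant(delta, m):
    # k = delta / m**2, so that L(t +/- m) equals the defect cost delta
    return delta / m ** 2

def loss(y, t, k):
    # Quadratic loss L(y) = k * (y - t)**2 for a single unit
    return k * (y - t) ** 2

def average_loss(ys, t, k):
    # Average loss k * [(mu - t)**2 + s**2] over n units (population s**2)
    n = len(ys)
    mu = sum(ys) / n
    s2 = sum((y - mu) ** 2 for y in ys) / n
    return k * ((mu - t) ** 2 + s2)

k = loss_constant(50.0, 0.5)        # cost 50 when a unit sits at m = 0.5
print(loss(10.0, 10.0, k))          # on target: zero loss
print(loss(10.5, 10.0, k))          # at the tolerance limit: loss = 50.0
print(average_loss([9.8, 10.1, 10.2, 9.9], 10.0, k))
```

Note that the average loss equals the mean of the individual losses: the decomposition into the k(µ − t)² and ks² components is exact.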
Design and Robust Design
Basic scientific knowledge allows
the designer to guide the design with suitable design parameters that
ensure good performance [1, 5, 6, 7, 8]. The design of a product or process
essentially involves stages such as concept design, parameter design
and tolerance design, each addressing a different set of aspects (Table 1).
Taguchi developed a strategy for quality
engineering that can be used in product design and its manufacturing
process. In conventional DoE, variation between experimental replications is
a nuisance the experimenter would like to eliminate, whereas in the Taguchi
method it is the central object of investigation. Taguchi’s method
replicates each experiment with the aid of an outer array that deliberately
includes the sources of variation a product would encounter while in
service. Such a design is called a minimum sensitivity design or a robust
design, and the robust design method is called the Taguchi method.
Table 1. Scope and Significance of the Design Stages

Concept design: Consists of choosing the product or service to be
produced and defining its structural design and the production process
that will be used to generate it. It determines the intended use of the
product, its basic functions, the materials needed to produce the selected
product and the process required; the production of a product starts with
the concept.

Parameter design: Involves selecting the best combination of control
factors that optimises the quality level of the product by reducing the
product’s sensitivity to noise factors. The combination of the control
factors must be optimal while the effect of the noise factors must be so
minimal that they have no negative impact on the functionality of the
product. The experiment that leads to the optimal result requires the
identification of the noise factors, because they are part of the process
and their effects need to be controlled.

Tolerance design: Must be applied to all parts of a product to limit the
possibility of producing defective products; tolerances are set after
testing and experimentation. It overcomes the limitations of parameter
design, which may not completely eliminate variation from the target. The
tolerance around the target, usually set by the design engineers, is
defined as the range within which variation may take place; the process of
balancing cost against tolerance is called tolerance design.
Robust design can greatly
reduce off-target performance caused by poorly controlled manufacturing
conditions, temperature or humidity shifts, wider component tolerances used
during fabrication, and also field abuse that might occur under adverse
service conditions [1, 5, 9].
Table 2. Approaches for the Taguchi Method

Four-step approach:
1. Planning: identifying the problem, developing the P-diagram, defining
the ideal function and S/N ratio, and planning the experiments. The
experiment involves changing the control, noise and signal factors
systematically using OAs.
2. Data collection / simulation: the experiments may be conducted in
hardware or through simulation. It is sufficient, and more desirable, to
have an essential model of the product that adequately captures the design
concept.
3. Factor effects analysis: the effects of the control factors are
calculated in this step and the results analysed to select the optimum
settings of the control factors.
4. Prediction / confirmation: to validate the optimum conditions, the
performance of the product design is predicted under the baseline and
optimum settings of the control factors. Confirmation experiments are then
performed under these conditions and the results compared with the
predictions. If they agree, the results are implemented; otherwise, the
process is reiterated.

Eight-step approach:
Step 1: Identify the main function, side effects and failure modes.
Step 2: Identify the noise factors, testing conditions and quality characteristics.
Step 3: Identify the function to be optimised.
Step 4: Identify the control factors and their levels.
Step 5: Select the orthogonal array for the matrix experiment.
Step 6: Conduct the matrix experiment.
Step 7: Analyse the data, predict the optimum levels and performance.
Step 8: Perform the verification experiment and plan future action.
To achieve the optimum design-factor
settings, Taguchi advocated a two-stage process in which the
first stage selects the robustness-seeking factors and the
second stage selects the adjustment factors needed to achieve the desired
target performance. The various stages in the experiment typically involve
the four-step or elaborated eight-step process shown in Table 2. Through
this systematic approach (parameter design), robust design results in a
product or process that is insensitive to the effects of sources of
variability even when the sources themselves have not been eliminated. It
requires the evaluation of controllable product or process factors in the
noisy environment from which classical design-of-experiment methods seek
to shield the experiment.
Orthogonal Arrays and Response Tables
The design arrays used by Taguchi have their
basis in other designs of experiments, such as fractional factorial,
Plackett-Burman, Latin square and mixed designs, and consist of
symmetrical subsets of all combinations of treatments [1, 4, 5, 10]. The
methodology minimises performance or quality problems arising from
non-identical operating or environmental conditions using a simplified
method known as the orthogonal array experiment, which helps to conduct a
multifactor experiment towards establishing the best product or process
design.
The orthogonal arrays that Taguchi
advocated are saturated sets of experiments allowing no scope for estimating
interactions between control factors (also known as inner-array factors),
but they do allow interaction with an outer array consisting of noise
factors. Out of all possible combinations of the levels of the variables,
the orthogonal array uses a special, balanced subset from which
the missing combinations can also be predicted. A manual procedure is
available that quickly completes the calculation of effects from
orthogonally designed experimental observations, using a special format
known as the “response table” for recording and manipulating the observed
data. This response table also includes a random-order column with which
randomisation can be carried out.
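The response-table calculation described above can be sketched as follows. The L4(2³) array is a standard orthogonal array, but the responses are illustrative numbers, not data from the article:

```python
# A minimal response-table sketch for an L4(2^3) orthogonal array.
# Rows are experiments; columns are factors A, B, C at levels 1 and 2.

L4 = [
    (1, 1, 1),
    (1, 2, 2),
    (2, 1, 2),
    (2, 2, 1),
]
responses = [20.0, 24.0, 26.0, 30.0]   # assumed observations
factors = "ABC"

def response_table(array, ys):
    # Mean response at each level of each factor (column)
    table = {}
    for col, name in enumerate(factors):
        for level in (1, 2):
            vals = [y for row, y in zip(array, ys) if row[col] == level]
            table[(name, level)] = sum(vals) / len(vals)
    return table

table = response_table(L4, responses)
for (name, level), mean in sorted(table.items()):
    print(f"{name}{level}: {mean:.1f}")

# For a larger-is-better response, pick the level with the higher mean:
best = {name: max((1, 2), key=lambda lv: table[(name, lv)]) for name in factors}
print(best)
```

Each level mean averages over the two rows where that level occurs; because the array is balanced, the other factors are averaged out evenly.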
The major characteristics of orthogonal
arrays include the following [10, 11]:
--Orthogonal arrays (OAs) are special matrix experiments that allow the
experimenter to study the main effects of several design parameters
at once and efficiently.
--An OA is a valid representation of the cause-effect relationship of the
process under study.
--The crux of the OA method lies in choosing the level combinations of the
input design variables for each experiment.
--The total number of rows in an OA determines the total number of
experiments to be run in the investigation.
--In any pair of columns of an OA, all combinations of the treatments occur
an equal number of times.
--Any treatment pair occurs once and only once between any two columns, a
property known as the “balancing property”.
--Any two columns of an OA are mutually orthogonal.
--The experiments guided by an OA may not use all its columns, but must use
every row of the array.
--Orthogonality implies that the entries in the array satisfy a special
mathematical condition: the sum of all weighting factors involved in the
equation that represents a function is zero. When the additivity assumption
holds, it is possible to estimate the main factor effects using a single set
of experiments based on an orthogonal design.
--Within an OA, a further sub-set can be analysed for a particular factor.
--Use of OAs to plan matrix experiments also ensures that, if the errors in
each experiment are independent with zero mean and equal variance, the
estimated factor effects are mutually uncorrelated.
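The balancing property listed above is easy to check mechanically. A small sketch using the standard L4(2³) array:

```python
# Checking the "balancing property" of an orthogonal array: in every pair
# of columns, each combination of levels must appear equally often.

from itertools import combinations
from collections import Counter

L4 = [
    (1, 1, 1),
    (1, 2, 2),
    (2, 1, 2),
    (2, 2, 1),
]

def is_orthogonal(array):
    n_cols = len(array[0])
    for c1, c2 in combinations(range(n_cols), 2):
        counts = Counter((row[c1], row[c2]) for row in array)
        # Balanced means every observed pair combination has the same count
        if len(set(counts.values())) != 1:
            return False
    return True

print(is_orthogonal(L4))        # True: each pair combination occurs once

# Altering one entry destroys the balance:
broken = list(L4)
broken[3] = (2, 2, 2)
print(is_orthogonal(broken))    # False
```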
Two factors are said to interact when
the influence of one on a response depends on the setting of the
other factor. When other factors are also involved, the
factor estimates can be far from their true values; the estimates can be
improved by replicating the trials, since the averages found by replication
have less variability and improve the precision. Replication of orthogonal
experiments can also help to distinguish the factors that affect the average
performance from those that affect the variability of performance. In many
cases, the dependence between the process performance y and the influencing
parameters is restricted to the main effects, giving an additive
cause-effect model of the form
y = µ + p_i + q_j + r_k + s_l + e

where µ represents the overall mean value of y in the region of
experimentation. The terms p_i, q_j, r_k and s_l are the deviations of
y from µ caused by the factor settings (levels), each factor having its own
positive and negative effects. The factor effects are assumed to be
additive and separable from each other; for a three-level experiment,
p1 + p2 + p3 = 0, with similar relationships for q, r and s. If the variance
of the error e in a single experiment is σe², then an average error such as
(e7 + e8 + e9)/3 will have variance σe²/3. If the additivity
assumption is not valid, the error terms will not be independent of each
other and will not behave as random variables with zero mean and constant
variance.
Additivity is often checked by a verification
experiment, with the treatments set at known (usually the optimum) values and
the outcome observed. A close agreement between the observed and predicted
responses suggests that the additivity assumption is reasonable.
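The additive main-effects prediction used in such a verification step can be sketched as below. The array is the standard L4(2³); the responses are illustrative and chosen to be exactly additive, so the prediction reproduces the observed value:

```python
# Sketch of the additive (main-effects) prediction: the response at a
# chosen setting is the overall mean plus the deviation of each factor's
# level mean from that overall mean. Responses are illustrative.

L4 = [(1, 1, 1), (1, 2, 2), (2, 1, 2), (2, 2, 1)]
responses = [20.0, 24.0, 26.0, 30.0]

overall = sum(responses) / len(responses)

def level_mean(col, level):
    vals = [y for row, y in zip(L4, responses) if row[col] == level]
    return sum(vals) / len(vals)

def predict(setting):
    # y_hat = mu + (p_i + q_j + r_k), each term a deviation from mu
    return overall + sum(level_mean(c, lv) - overall
                         for c, lv in enumerate(setting))

y_hat = predict((2, 2, 1))
print(y_hat)   # matches the observed 30.0 because these data are additive
```

With real data, a large gap between the prediction and the confirmation run signals that interactions are present and the additive model is inadequate.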
Experts often say that good experiments
do not always make good products, but good experiments do provide important
information. The Taguchi method states that whenever one does not completely
know the effects of the different factors, one should identify the optimum
settings of the design parameters empirically by conducting special
experiments. Such experiments are carried out by judiciously exploiting the
DP-noise interactions, after completing the function design. While screening
the various design parameters, a suitable cause-effect diagram is very
useful for eliminating several factors from consideration; it can also
facilitate experimentation on the important factors at multiple levels, which
in turn has a good chance of showing up their influences on the observations.
Also, at the initial stages, keeping wider level settings would reveal
off-specification products, which show the sensitivity of a parameter.
The following guidelines are useful
for selecting the ‘right’ quality characteristics and maximising the chances
of additivity:
1. Quality characteristics (y) should be directly related to the basic
mechanism of the process or product.
2. Characteristics should be easy to measure.
3. As far as possible, the measured quality characteristic should be a
continuous variable.
While other experimental designs keep
uncontrollable factors under observation during
experimentation without including them in the purview of the experiments,
the Taguchi method provides the means to
include the effect of these factors in the experiments so as to make the
performance of the product or process robust. Taguchi called these
uncontrollable factors noise factors, a term derived from the communication
industry. Noise factors are either too hard or uneconomical to control, even
though they cause unwanted variation in the performance of the product or
process [7, 8, 9, 10]. Taguchi reduced all noise factors to three
typical categories (Table 3), namely inner noise, outer noise and product
noise. These noise factors, at distinct levels, are included in a noise OA
as an outer array, while the levels of the main factors are kept in the inner
array. Under certain conditions, noise factors are studied at several
levels to improve the detection and exploitation of DP-noise interactions.
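The inner/outer-array crossing can be sketched as follows. The arrays and observations are illustrative, and the nominal-is-best S/N ratio is used to summarise the robustness of each control setting:

```python
# Sketch of crossing an inner (control-factor) array with an outer (noise)
# array: each control setting is tested under every noise condition, and an
# S/N ratio summarises its robustness. All numbers are illustrative.

import math

inner = [(1, 1), (1, 2), (2, 1), (2, 2)]   # control factors A, B
outer = ["N1", "N2", "N3"]                 # three noise conditions

# Assumed responses: one row per inner run, one column per noise condition
obs = [
    [9.8, 10.3, 9.6],
    [10.1, 10.0, 9.9],
    [9.0, 10.8, 9.3],
    [10.2, 10.1, 10.0],
]

def sn_nominal(ys):
    # Nominal-is-best S/N = 10*log10(mean^2 / s^2), in decibels
    n = len(ys)
    mu = sum(ys) / n
    s2 = sum((y - mu) ** 2 for y in ys) / (n - 1)
    return 10 * math.log10(mu ** 2 / s2)

for setting, ys in zip(inner, obs):
    print(setting, round(sn_nominal(ys), 1))
```

The control setting with the highest S/N across the noise runs is the most robust; settings whose responses swing widely under noise (like the third row above) score low.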
Customers are satisfied when products
perform on target; products that fall within the ± δ tolerances still
cause a quality loss. Based on experience, the loss due to inadequate
quality of a product or process has been found to follow a quadratic loss
function, i.e., L(y) = k(y − target)², and one may try to maximise
performance by minimising this loss function. Instead of using the loss
function directly, Taguchi also formulated a simple statistic, the S/N
ratio: a logarithmic function of the ratio of mean performance to the
variation in performance due to uncontrollable factors, which serves as a
concurrent statistic and a special kind of data summary. The S/N ratio is an
ideal measurement for deciding the best values or levels of the control
factors. It is the primary measurement used for product or process
optimisation, represents the ratio of sensitivity to variability, and is
used to optimise the robustness of a product or process.
In a set of statistical experiments, let the
average quality characteristic and the standard deviation (caused by noise
factors) be µ and s, with desired performance µ0. One must then make an
adjustment in the design to bring the performance onto target, and the loss
after adjustment is expressed as

Q = k µ0² (s² / µ²)

in which µ² represents the signal component and s² the variance, or noise,
of the signal component. Maximising the ratio µ²/s², or the S/N ratio, thus
becomes equivalent to minimising the loss after adjustment. For improving
additivity, the function is converted into a logarithmic function and
expressed in decibels as S/N = 10 log10 (µ²/s²). Maximising the S/N ratio
by a suitable selection of the DPs makes the design robust. An appropriate
S/N ratio needs to be selected for each problem.
The S/N ratio is a predictor of the
quality loss that isolates the sensitivity of a product to the noise factors.
In robust design, one minimises the sensitivity to noise by seeking
combinations of the DP settings that maximise the S/N ratio. In the most
appropriate S/N ratio, the additivity of the DP effects also becomes
maximum. One can also select the most appropriate S/N ratio among several
S/N ratios, for both scaling factors and adjustment factors. Table 4 shows
the typical S/N ratios used in various situations.
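The three static S/N ratios typically tabulated for such situations (smaller-is-better, larger-is-better, nominal-is-best) can be sketched as follows; the sample data are illustrative:

```python
# The three standard static S/N ratios, in decibels. In each case a larger
# S/N value indicates a more desirable, more robust outcome.

import math

def sn_smaller_is_better(ys):
    # S/N = -10*log10(mean(y^2)); used when zero is ideal (e.g. defects)
    return -10 * math.log10(sum(y * y for y in ys) / len(ys))

def sn_larger_is_better(ys):
    # S/N = -10*log10(mean(1/y^2)); used when bigger is better (e.g. strength)
    return -10 * math.log10(sum(1 / (y * y) for y in ys) / len(ys))

def sn_nominal_is_best(ys):
    # S/N = 10*log10(mean^2 / variance); used when a target value is ideal
    n = len(ys)
    mu = sum(ys) / n
    s2 = sum((y - mu) ** 2 for y in ys) / (n - 1)
    return 10 * math.log10(mu * mu / s2)

print(round(sn_smaller_is_better([0.1, 0.2, 0.15]), 2))
print(round(sn_larger_is_better([95.0, 102.0, 99.0]), 2))
print(round(sn_nominal_is_best([10.1, 9.9, 10.0]), 2))
```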
A close agreement between the calculated
maximum value of the S/N ratio and the actual ratio (by the graphical method)
suggests that the additivity assumption is reasonable, a prime requirement
for the main-effects model and its predictability for any treatment
combination within the influence space. If this verification fails, the
experiment must be repeated with higher-order factor interactions using a
larger OA or some other experimental design. Graphical evaluation methods
for main effects convey rapidly the relative magnitudes of the different
factor effects and allow quick identification of the optimum setting for
each factor under experiment; they also display visually the relative
effects of each individual design factor.
Taguchi methods may also use ANOVA to
determine the effect of a particular factor on the response or its
variability, with F tests on the S/N ratios in robust design studies. Before
attempting any regression exercise, the cause-effect relationship between
the variables in question is established by ANOVA or some similar method.
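A minimal F-ratio calculation of the kind used in such an ANOVA can be sketched as follows, for one factor at two levels with replicated observations (illustrative numbers, which could equally be S/N values):

```python
# One-way ANOVA sketch: does a factor's effect on the response (or its
# S/N ratio) stand out from the replication noise? Groups are illustrative
# observations at levels 1 and 2 of one factor.

level1 = [20.1, 19.8, 20.3, 20.0]
level2 = [24.2, 23.9, 24.4, 24.1]

def f_ratio(g1, g2):
    n1, n2 = len(g1), len(g2)
    m1, m2 = sum(g1) / n1, sum(g2) / n2
    grand = (sum(g1) + sum(g2)) / (n1 + n2)
    # Between-group (factor) and within-group (error) sums of squares
    ss_between = n1 * (m1 - grand) ** 2 + n2 * (m2 - grand) ** 2
    ss_within = (sum((y - m1) ** 2 for y in g1)
                 + sum((y - m2) ** 2 for y in g2))
    ms_between = ss_between / 1               # df = groups - 1
    ms_within = ss_within / (n1 + n2 - 2)     # df = N - groups
    return ms_between / ms_within

print(round(f_ratio(level1, level2), 1))
```

A large F value (here several hundred) indicates the factor effect dwarfs the experimental error; in practice it would be compared against a tabulated F critical value at the chosen significance level.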
Taguchi Methods – Relevance in Textiles
Optimisation of textile processes is
often cumbersome, since they are easily affected by a number of controllable
and uncontrollable factors. The Taguchi method treats optimisation problems
in two categories. Static problems generally involve batch-process
optimisation to attain one fixed performance level; for example, a static
application for an injection-moulding machine finds the best operating
condition for a single mould design. Dynamic problems are related to
technology development and to situations such as contingency planning for
unknown future requirements [4].
In some engineering problems the signal
factor is absent or takes a fixed value; these are called static problems,
and the corresponding S/N ratios are called static S/N ratios. In
dynamic applications, a signal factor moves the performance to some value
and an adjustment factor modifies the design’s sensitivity to this factor;
if the signal is plotted on the horizontal axis and the response on the
vertical axis, the adjustment factor changes the slope of the line. The
adjustment factor adjusts the magnitude of change at a given setting.
Problems in which the signal and response must follow a function called the
“ideal function”, e.g., a linear relationship, are called dynamic problems,
and the corresponding S/N ratios are called dynamic S/N ratios [9].
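A dynamic S/N ratio of the kind described above can be sketched by fitting the ideal linear function y = βM through the origin and measuring the scatter about it; the signal levels and responses below are illustrative:

```python
# Sketch of a dynamic-problem S/N ratio: the response should follow an
# ideal linear function y = beta * M of the signal M, and robustness is
# 10*log10(beta^2 / sigma^2). Signal and response values are illustrative.

import math

signal = [1.0, 2.0, 3.0, 1.0, 2.0, 3.0]      # signal factor M
response = [2.1, 3.9, 6.2, 1.8, 4.1, 5.9]    # observed y under noise

def dynamic_sn(ms, ys):
    # Least-squares slope through the origin: beta = sum(M*y) / sum(M^2)
    beta = sum(m * y for m, y in zip(ms, ys)) / sum(m * m for m in ms)
    # Error variance about the ideal line y = beta * M
    n = len(ys)
    s2 = sum((y - beta * m) ** 2 for m, y in zip(ms, ys)) / (n - 1)
    return beta, 10 * math.log10(beta ** 2 / s2)

beta, sn = dynamic_sn(signal, response)
print(round(beta, 3), round(sn, 1))
```

Here β is the sensitivity (the slope an adjustment factor would change), and the S/N ratio penalises deviation from the ideal straight line across the noise replicates.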
The Taguchi method has been used as a
screening tool to determine robust settings in the saw ginning of seed
cotton, using design parameters including the paddle roll, saw and
seed-finger roller components, with the field cleaner as the noise
component, to maximise fibre quality measurements. The effects of various
process parameters on drafted strands, in terms of relative fibre
parallelisation, modified coefficient of relative fibre parallelisation,
fibre straightness index and tenacities, have also been analysed in the
past under the response type ‘larger-is-better’. Table 5 lists certain
prominent applications of Taguchi’s optimisation method in the field of
textiles.
Taguchi Methods vs Other Techniques
Statistical process control allows
faults and defects to be eliminated after manufacture (if detected), whereas
Taguchi methods provide an effective solution that prevents their occurrence.
The one-at-a-time approach is inefficient when the number of variables is
large, and it can miss critical interactions among the design variables.
There are various ways to optimise the effects of controllable variables
using design-of-experiments techniques such as factorial, central composite
and Box-Behnken designs; the major drawback of all these techniques is their
inability to include the effect of uncontrollable factors such as
environmental conditions.
The Taguchi method is a simple technique
compared with more sophisticated experimental techniques such as the
response surface method, which is a combination of statistical experimental
design fundamentals, regression modelling techniques and optimisation
methods [11, 12, 20]. The Taguchi method has been considered more effective
than recently developed algorithms such as genetic algorithm methods, which
on many occasions require less human effort. The Taguchi method is a
scientifically disciplined mechanism for evaluating and implementing
improvements in products, processes, materials and facilities.
The method is applicable over a wide
range of engineering fields, including processes that manufacture raw
materials, the tuning of sub-systems in engineering operations, and the
service sector. The Taguchi method calculates separately the individual
(main) effects of the independent variables on the performance parameters,
whereas other designs give the collective effect of the variables in terms
of equations, three-dimensional curves or contour diagrams, which are often
difficult to understand and interpret.
The main drawback of the Taguchi method is
that it may not always determine interaction effects, unlike some other
design-of-experiment techniques. It also assumes that the effect of each
process variable on the response is additive in nature, which is not always
true in practical situations. However, the use of ANOVA and regression
models in conjunction with the Taguchi method helps to quantify the
contribution of each process variable in changing the response, and
therefore helps in ascertaining the additivity of the method.
References
1. Total Quality Management, Volume 1, Indira Gandhi Open University, New Delhi, 2001, 225 – 231.
2. http://www.wter.org/loyola/polymers/c7_s6.htm.
3. Bass I: Introduction to Taguchi Method – Part I, through http://www.sixsigmafirst.com/intro2
5. Holt G, Laird W: Screening for Optimal Operating Parameters for the
Powered Roll Gin Stand using Taguchi’s Robust Design, The Journal of
Cotton Science, 11 (2007) 79 – 90.
7. Bass I: Introduction to Taguchi Method – Part II, through http://www.sixsigmafirst.com/intro2
8. Salhotra K R, Ishtiaque S M, and Kumar A: Analysis of Spinning Process
using the Taguchi Method – Part I: Effect of Spinning Process Variables on
Fibre Orientation and Tenacities of Sliver and Roving, Journal of the
Textile Institute, 97 (4) (2007) 271 – 283.
9. Phadke M S: Introduction to Robust Design, through http://www.isixsigma.com/library/content/c020311a.asp.
10. Bagchi T P: Taguchi Methods Explained, Prentice Hall of India, New Delhi,
11. Yeniay O: A Comparison of the Performance Between a Genetic Algorithm and
the Taguchi Method over Artificial Problems, Turkish Journal of Engineering
and Environmental Science, 25 (2001) 561 – 568.
12. McMillan A, Boyce G: Processing of Large Surface Area Components from
13. Karbhari V M: Product and Process Development Methods, http://www.wtec.org/loyola/polymers/c7_s1.htm.
14. Salhotra K R, Ishtiaque S M, Kumar A: Analysis of Spinning Process using
the Taguchi Method: Part II – Effect of Spinning Process Variables on
Fibre Extent and Fibre Overlap of Ring, Rotor and Air-jet Yarns, Journal of
the Textile Institute, 97 (4) 2006 285 – 293.
15. Kumar A, Ishtiaque S M, and Salhotra K R: Analysis of Spinning Process
using the Taguchi Method: Part IV – Effect of Spinning Process Variables
on Tensile Properties of Ring, Rotor and Air-jet Yarns, Journal of the
Textile Institute, 97 (5) 2006 385 – 390.
16. Kumar A, Ishtiaque S M, and Salhotra K R: Analysis of Spinning Process
using the Taguchi Method: Part III – Effect of Spinning Process Variables on
Migration Parameters of Ring, Rotor and Air-jet Yarn, Journal of the Textile
Institute, 97 (5) 2006 377 – 384.
17. Webb C J, Waters G T, Thomas A J, and Liu G P: The Use of the Taguchi
Design of Experiment Method in Optimising Splicing Conditions for a Nylon 66
Yarn, Journal of the Textile Institute, 98 (4) 2007, 327 – 336.
18. Farsani R E, Raissi S, Shokuhfar A, and Sedghi A: Optimisation of Carbon
Fibres Made from Commercial Polyacrylonitrile Fibre using Screening Design
19. Fung C P, and Kang P C: Multi-response Optimisation in Friction
Properties of PBT Composite using Taguchi Method and Principal Component
Analysis, Journal of Materials Processing Technology, 170 (3) 2005 602 –
20. Ray S: A Statistical Tool for Process Optimisation, Indian Textile
Journal, 2006 (12) 24 – 30.
Note: For a detailed version of this
article, please refer to the print version of The Indian Textile Journal, June
Dr T Ramachandran
Department of Textile Technology,
PSG College of Technology,
Coimbatore, Tamil Nadu.
Department of Textile Technology,
Bannari Amman Institute of Technology,
Sathyamangalam, Tamil Nadu.