Posted in Operations & IT Articles, Published on 20 April 2014
“Change is the only constant.”
This age-old adage perfectly captures the dynamic nature of the realm we reside in, and the world of operations and its standards is no exception.
Case in point: the Six Sigma methodology. Pioneered by Motorola in 1986, Six Sigma is a well-structured, data-driven methodology for minimizing defects in any process throughout its life cycle.
It follows two project methodologies, DMAIC (Define-Measure-Analyze-Improve-Control) and DMADV (Define-Measure-Analyze-Design-Verify), whose names are largely self-explanatory.
Statistically speaking, the defect rate of a process strictly conformant to Six Sigma standards should not exceed 3.4 parts per million, which translates to an infinitesimally small (0.00034%) probability of producing a defective product. However technologically infeasible that might sound, there are companies that have implemented this methodology successfully and benefitted immensely from it. A prime example is General Electric, which, after adopting Six Sigma, realized estimated benefits to the tune of $10 billion. That figure overshadows the GDP of many nations!
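The 3.4-per-million figure can be verified with the standard normal distribution, under the conventional assumption that the process mean drifts 1.5 sigma off target (a point revisited among the criticisms further on). A minimal sketch in Python, using only the standard library; the function names are illustrative, not part of any Six Sigma toolkit:

```python
from math import erfc, sqrt

def normal_cdf(x: float) -> float:
    """CDF of the standard normal distribution, via the complementary error function."""
    return 0.5 * erfc(-x / sqrt(2))

def dpmo(sigma_level: float, shift: float = 1.5) -> float:
    """Defects per million opportunities for a process at the given sigma level,
    assuming the mean has drifted `shift` sigmas toward the nearer spec limit."""
    return normal_cdf(-(sigma_level - shift)) * 1_000_000

print(round(dpmo(6), 1))  # -> 3.4, i.e. about 0.00034%
```

With the drift, a "six sigma" process effectively leaves only 4.5 sigma of headroom to the nearer specification limit, which is exactly where the famous 3.4 DPMO comes from.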
All things considered, if Six Sigma really is the panacea for all ills, the gateway to fortune, why is there growing apprehension about its relevance in the modern-day scenario? The answer is manifold.
First, let us look at Six Sigma from a QMS (Quality Management System) perspective. Six Sigma was originally conceived as a subset of QMS (ISO 9001), meant to continually improve the QMS. Over time, however, it has essentially supplanted QMS, as companies that applied QMS improperly never realized its intended benefits; this led them to underrate its instrumentality and consequently bypass essential quality standards altogether. Some smaller companies even replaced economical, efficient and highly customized internal quality standards with the expensive and convoluted Six Sigma methodology they could ill afford, and ended up suffering irrecoverable losses. This, however, is primarily due to the hype generated around Six Sigma, and can be avoided by properly assessing the nature of the process improvement required.
Let us now have a look at the technical aspects of Six Sigma.
The first of the criticisms targets the very foundation of Six Sigma, the normal distribution model, whose ability to accurately represent real processes is suspect. However, when set against alternative process models (the Student's t-distribution, among others), which erroneously compute a much higher probability of a defective product, Six Sigma emerges triumphant.
Another criticism on the technical front concerns the assumed 1.5-sigma drift of the process mean, which critics regard as arbitrary. In all practicality, however, it is unrealistic to expect the mean to stay centered on target.
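The practical impact of this drift assumption is easy to quantify: a perfectly centered six-sigma process would produce only about 0.002 defects per million, whereas the same process with a 1.5-sigma drift produces the familiar 3.4. A small self-contained illustration in Python (standard library only; the variable and function names here are my own):

```python
from math import erfc, sqrt

def tail_prob(z: float) -> float:
    """P(Z > z) for a standard normal variable Z."""
    return 0.5 * erfc(z / sqrt(2))

# Centered process: defects can fall outside either six-sigma limit.
centered_dpmo = 2 * tail_prob(6.0) * 1_000_000

# Drifted process: the mean sits 1.5 sigma closer to one limit,
# so nearly all defects escape past the nearer limit at 4.5 sigma.
drifted_dpmo = tail_prob(6.0 - 1.5) * 1_000_000

print(f"{centered_dpmo:.4f}")  # -> 0.0020 DPMO
print(f"{drifted_dpmo:.1f}")   # -> 3.4 DPMO
```

The drift assumption thus inflates the quoted defect rate by roughly three orders of magnitude, which is precisely why critics who find the 1.5-sigma figure arbitrary consider it worth arguing about.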
Another major caveat of Six Sigma, often blown out of proportion, concerns the "all or nothing" nature of its implementation. Adopting it requires many organizations to restructure and completely overhaul their business operations, as Six Sigma cannot be localized to a particular department and must encompass the organization in its entire length and breadth.
There are also complaints regarding the disruption to an organization's operations that inevitably follows when companies devote upwards of 80% of their employees to acquiring Six Sigma certifications such as Black Belt and Green Belt. Others doubt the very credibility of these certifications, which can be acquired with a meager 20 days of training and may not be indicative of one's proficiency with the methodology.
Both are quite valid concerns, but a simple cost-benefit analysis would show that the benefits provided by Six Sigma greatly offset the effort involved in its effective implementation. Some work is, however, required to restore faith in the certifications.
Looking at it from another perspective, Six Sigma was born of an economic boom, at a time when customer needs were difficult to understand and implement, and a radical methodology like Six Sigma was seen as divine providence. In this day and age, however, with an economy ravaged by multiple recessions, where even state-of-the-art products fade into obsolescence the very next day, Six Sigma should not be the only cynosure.
There has been a paradigm shift, and the need of the hour is a methodology that dynamically responds to consumer needs and analyzes their behavior to provide the desired solution. As the late Steve Jobs observed, consumers often have no idea what they want; we should probably shift our fixation from "evolution" to "revolution" by focusing on innovation, without completely sidelining Six Sigma. And we have several business process management approaches, such as Agile and Scrum, at our disposal to achieve exactly that.
So, coming back to the question: is Six Sigma really growing redundant and irrelevant with each passing second?
In a nutshell: no. Even though the benefits derived from Six Sigma are becoming progressively less sufficient, they are still plentiful, and the basic thought process behind it will persist in any successor methodology.
Abandoning the Six Sigma school of thought and trying to implement a new methodology from scratch would be analogous to constructing a building without the supporting metalwork in place. In other words, a recipe for disaster!
This article has been authored by Anant Nawalgaria from D.O.M.S. IIT Madras