November 10, 2010
Deming's basic quality philosophy is that "productivity improves as variability decreases".
There are two types of variation: chance (common cause) and assignable (special cause). But the most difficult problem is how to distinguish between them.
Common cause variability is a source of variation caused by unknown factors that result in a steady but random distribution of output around the average of the data. Common cause variation is a measure of the process's potential, or how well the process can perform when special cause variation is removed. Therefore, it is a measure of the process technology. Common cause variation is also called random variation, noise, noncontrollable variation, within-group variation, or inherent variation. Example: Many X's with a small impact.
Unlike common cause variability, special cause variation is caused by known factors that result in a non-random distribution of output. Also referred to as "exceptional" or "assignable" variation. Example: Few X's with big impact. Special cause variation is a shift in output caused by a specific factor such as environmental conditions or process input parameters. It can be accounted for directly and potentially removed and is a measure of process control.
One of the most effective quality tools for distinguishing them is the control chart.
But as we know, two types of mistakes exist in statistics: the error of the first kind (Type I) and the error of the second kind (Type II).
The first is when the process is actually fine, but after analysing the data we conclude that a special cause exists (right → wrong).
The second is when a special cause really exists, but we treat it as common cause variation and cannot find the problem (wrong → right).
So these problems always exist. When a data point falls outside the limits, we have to remember it is only a signal, not necessarily the true nature of the situation.
After the initial analysis, take action on the assumed special cause to test whether it is really special or not. If it is not, the first kind of error has occurred. Our aim is to minimise the combined probability of these two errors; the 3-sigma level is the conventional compromise between them, which is why ±3σ is used as the natural limits on the control chart.
If a data point falls outside the upper or lower control limits, a special cause may exist.
If the data are not distributed randomly within the limits (e.g. runs or trends), a special cause may also exist.
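The first rule can be sketched in a few lines of Python. This is a minimal illustration, not a full Shewhart chart: it assumes the limits are estimated from a period when the process was known to be stable, and it uses the plain sample standard deviation as the sigma estimate (a real chart would usually estimate sigma from subgroup ranges).

```python
import statistics

def control_limits(baseline):
    """Natural limits at mean +/- 3 sigma, estimated from in-control data.

    Uses the plain sample standard deviation as the sigma estimate;
    a real Shewhart chart typically estimates sigma from subgroup
    ranges (R-bar / d2) instead.
    """
    mean = statistics.mean(baseline)
    sigma = statistics.stdev(baseline)
    return mean - 3 * sigma, mean + 3 * sigma

# Measurements from a period when the process was known to be stable.
baseline = [10.1, 9.9, 10.0, 10.2, 9.8, 10.1, 10.0, 9.9, 10.1, 10.0]
lcl, ucl = control_limits(baseline)

# New points: the last one falls outside the limits, suggesting an
# assignable (special) cause worth investigating.
new_points = [10.0, 10.1, 9.9, 11.2]
flagged = [x for x in new_points if x < lcl or x > ucl]
print(flagged)  # -> [11.2]
```

Note that a point outside the limits is only a signal to investigate; as discussed above, it may still be a Type I false alarm.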
Some industries use 2σ as the natural limits. This increases the probability of the first kind of error: some time is wasted chasing false alarms, but more special causes are found.
Other industries use 4σ as the natural limits, which decreases the first kind of error but increases the second, so more special causes will be missed.
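The trade-off can be made concrete by computing the false-alarm (Type I) rate for each choice of limits. This sketch assumes the process output is normally distributed, so the two-sided tail probability beyond ±kσ is erfc(k/√2).

```python
import math

def false_alarm_rate(k):
    """P(|Z| > k) for a normally distributed in-control process:
    the chance a perfectly stable point falls outside mean +/- k*sigma,
    i.e. the Type I error risk per plotted point."""
    return math.erfc(k / math.sqrt(2))

for k in (2, 3, 4):
    print(f"{k}-sigma limits: alpha = {false_alarm_rate(k):.5f}")
# 2-sigma: ~4.6% false alarms; 3-sigma: ~0.27%; 4-sigma: ~0.006%
```

So 2σ limits trigger a false alarm roughly one point in 22, while 3σ limits keep it to about one in 370, at the cost of some missed special causes.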
If the process is not stable, many special causes exist and the system is out of control; we need to find and remove those causes to improve the process.
Before, I had only heard of SIX SIGMA as a management method; after the lecture I started to understand it more deeply.
From my understanding, Six Sigma is a quality management approach that uses statistical methods to pursue near-perfect manufacturing, minimising costs, increasing productivity, and putting customer satisfaction in first place.
As the sigma level goes up, variation goes down relative to the specification limits, and so does the defect rate.
Traditional quality management always focuses on results: quality is ensured by inspecting the final product and by customer service. Six Sigma instead focuses on the actual causes of variation. Quality is maintained by improving the process and saving cost at the source, rather than by strictly checking the final product. By the time waste products are caught during production they have already caused a loss, and after-sales service adds yet another cost.