
gage R & R
Message posted by Roy on June 1, 2000 at 12:00 AM (ET)

Can someone explain the basics of a gage R&R study and what the results mean? Two pieces of similar equipment were recently studied; one had an R&R of >30%, the other 22.8%. What do those numbers mean?


READERS RESPOND:
(In chronological order. Most recent at the bottom.)

Re: gage R & R
Message posted by nancy diehl on June 2, 2000 at 12:00 AM (ET)

Gage R&R stands for gage Repeatability and operator Reproducibility. The purpose of the study is to determine how much of your allowable part tolerance is chewed up by your measurement system. The industry standard says that the two sources of variation, the gage and the operator, should not consume more than 30% of your total part tolerance. The percentage is computed with your part tolerance in the denominator and the standard deviation calculated from a study set up to measure the operator and gage variation in the numerator. There are two methods for calculating the gage R&R variation: the range method and the ANOVA method. The range method was developed for ease of calculation - "R-bar divided by d2" is used as an estimate of the standard deviation. I prefer the more rigorous ANOVA method, especially if you have a computer to do the calculations for you.
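
A minimal Python sketch of the range-method estimate mentioned above, using made-up measurements and an assumed 0.010 part tolerance (the 5.15 multiplier is explained in the next paragraph):

    # Range-method estimate of gage repeatability: sigma is roughly R-bar / d2.
    # All numbers here are hypothetical illustration values.
    first_trial  = [1.002, 0.998, 1.005, 1.001, 0.997]   # one operator, trial 1
    second_trial = [1.003, 0.999, 1.004, 1.001, 0.998]   # same operator, trial 2
    tolerance = 0.010            # assumed total part tolerance (USL - LSL)

    ranges = [abs(a - b) for a, b in zip(first_trial, second_trial)]
    r_bar = sum(ranges) / len(ranges)

    d2 = 1.128                   # control-chart constant for 2 trials per part
    sigma_gage = r_bar / d2      # estimated repeatability standard deviation

    pct_of_tolerance = 100 * (5.15 * sigma_gage) / tolerance
    print(f"Range-method gage repeatability: {pct_of_tolerance:.1f}% of tolerance")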

To simplify this even further, let's remove the operator from the study and just address gage repeatability. To run this study you would take 10 to 15 parts, measure each one using a single operator/inspector, and then measure them a second time with the same operator (you can do a third trial too if you wish). Then plug the measurements into a software package that can do a one-way ANOVA. The parts become the "between" or "treatment" source of variation, and the error term, or "within" component, is actually a measure of the gage repeatability. Take the square root of the mean square error value; this is the standard deviation of the gage repeatability. Multiply this value by 5.15 (5.15 standard deviations cover about 99% of the area under the normal curve), divide by the total part tolerance, and multiply by 100. The result is the percentage of the total tolerance that is chewed up by the gage, and it tells you whether you have an acceptable measurement system. You could be rejecting parts that are actually good if the problem lies with an unacceptable measurement system.

P.S. - the mathematics become a little more complicated when you add the second
source of variation to this analysis - i.e. the operator. Better off letting
the computer do the calculations.
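
A minimal Python sketch of the single-operator repeatability calculation described above, assuming made-up measurements (five parts, two trials each) and a hypothetical 0.010 part tolerance; the within-part mean square from a one-way ANOVA estimates the repeatability variance:

    import numpy as np

    # Two repeat measurements per part by the same operator (hypothetical data).
    measurements = {
        "part_01": [1.002, 1.003],
        "part_02": [0.998, 0.999],
        "part_03": [1.005, 1.004],
        "part_04": [1.001, 1.001],
        "part_05": [0.997, 0.998],
    }
    tolerance = 0.010  # assumed total part tolerance (USL - LSL)

    # Within-part (error) sum of squares and degrees of freedom.
    ss_within = sum(
        sum((x - np.mean(trials)) ** 2 for x in trials)
        for trials in measurements.values()
    )
    df_within = sum(len(trials) - 1 for trials in measurements.values())

    mse = ss_within / df_within      # mean square error from the one-way ANOVA
    sigma_gage = np.sqrt(mse)        # repeatability standard deviation

    # 5.15 standard deviations span about 99% of a normal distribution.
    pct_of_tolerance = 100 * (5.15 * sigma_gage) / tolerance
    print(f"Gage repeatability uses {pct_of_tolerance:.1f}% of the tolerance")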


Re: gage R & R
Message posted by Phil on June 3, 2000 at 12:00 AM (ET)

The previous post is very good. The "percentage" produced is sometimes called the P/T ratio (precision-to-tolerance ratio). Sometimes people use 6 standard deviations in computing it rather than 5.15, depending on the way they learned to compute it.

Also, for non-"high precision" situations (e.g., with micrometers), some people try to keep the P/T ratio under 10%. That is hard to do for higher-precision processes, so 25%-30% is considered acceptable there.
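
A small Python sketch, not part of the post above, showing how the choice of 5.15 versus 6 standard deviations changes the P/T ratio for the same (made-up) repeatability standard deviation and tolerance:

    sigma_gage = 0.0004   # hypothetical repeatability standard deviation
    tolerance = 0.010     # hypothetical total part tolerance

    pt_515 = 100 * (5.15 * sigma_gage) / tolerance   # ~99% spread convention
    pt_6   = 100 * (6.0  * sigma_gage) / tolerance   # 6-sigma convention

    print(f"P/T (5.15 sigma): {pt_515:.1f}%")
    print(f"P/T (6 sigma):    {pt_6:.1f}%")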

On a side note: the very first consulting job I ever did in SQC was doing process capability studies in a large CNC facility. They were making oil drilling bits. It turned out their measurement system was so bad that the data were useless (P/T was 56%). When we showed the QC manager the R&R study, he fired us because it made him look bad. They went out of business not too long after that.


Re: Gage R & R
Message posted by Curt (via 208.28.201.2) on June 28, 2001 at 12:58 PM (ET)

Question: A customer has asked for a Gage R&R on a 95 x 95 sample. Is he asking for the 95% confidence interval with a 5% error? I have reviewed my literature on sampling and cannot find a specific definition of "95 x 95".


Re: gage R & R
Message posted by Phil (via 216.175.112.5) on June 29, 2001 at 3:50 AM (ET)

Never heard of that terminology before. I am anxious to see if anyone else has.


Re: gage R & R
Message posted by JG (via 128.8.22.71) on June 29, 2001 at 9:24 PM (ET)

The terminology is new to me also, but the ideas seem reasonable. Quality control seems to have become a separate profession with its own terminology.


Re: gage R & R
Message posted by Ravi Prasad.P (via 202.144.86.99) on August 25, 2001 at 6:12 AM (ET)

How do I carry out Gage R&R studies on test and measuring equipment?


Re: gage R & R
Message posted by Craig (via 12.33.176.99) on August 27, 2001 at 9:14 AM (ET)

Can anyone tell me the difference between an ANOVA R&R and the Average and Range method?


Re: gage R & R
Message posted by Dan A (via 151.205.1.15) on September 4, 2001 at 5:43 AM (ET)

Can an R&R study be performed on equipment that does destructive testing? If so, how? If not, what alternatives do I have?


