Measurement Systems Analysis (MSA)

© 2001 Six Sigma Academy

1
Why Measure?
• To understand a decision:
• Meet standards & specifications
• Detection/reaction oriented
• Short-term results
• Stimulate continuous improvement:
• Where to improve?
• How much to improve?
• Is improvement cost effective?
• Prevention oriented
• Long-term strategy

“If you cannot measure, you cannot improve!”
– Taguchi
© 2001 Six Sigma Academy

2
Measurement System As A Process
[Cause-and-effect (fishbone) diagram: sources of Measurement Error grouped under Material, Method, Machine, Environment, and People. Example causes on the diagram include cleanliness, sequence, temperature, timing, dimension, design, positioning, weight, precision, corrosion, calibration, location, hardness, resolution, set-up, conductivity, stability, density, preparation, wear, compliance with procedure, fatigue, vibration, attention, calculation error, atmospheric pressure, interpretation, speed, lighting, coordination, knowledge of instrument, dexterity, vision, and humidity.]

© 2001 Six Sigma Academy

3
What Is An MSA?
Scientific and objective method of analyzing the validity of a
measurement system
• A “tool” which quantifies:
1. Equipment Variation
2. Appraiser (Operator) Variation
3. The Total Variation of a Measurement System
• MSA is NOT just Calibration
• MSA is NOT just Gage Repeatability & Reproducibility (R&R)

Measurement System Analysis is often a “project within a project”
© 2001 Six Sigma Academy

4
MSA Relationship To DMAIC
Define → Measure → Analyze → Improve → Control

Measurement Systems Analysis
• Quantitative evaluation of tools and processes used in making
discrete or variable observations
Define → Measure → Analyze → Improve → Control

Measurement Systems Control
• Established, documented, and continuously carried out
• Ensures measurement system maintains an acceptable status
• Often referred to as “Long Term Gage Plan”

© 2001 Six Sigma Academy

5
MSA - A Starting Point
Before you…
• Make adjustments
• Implement solutions
• Run an experiment
• Perform a complex statistical analysis
You should…
• Validate your measurement systems
• Validate data and data collection systems

MSA quantifies a major source of process variation
© 2001 Six Sigma Academy

6
Measurement Systems
• Examples
• Precision gage
• Data collection form
• Survey
• School entrance exam
• Customer satisfaction
• On-time delivery report

What is your system ?
© 2001 Six Sigma Academy

7
Types of Measurement System Analysis

• Operational Definitions
• Walking the Process
• Gage R&R
• Variable Data
• Attribute Data

© 2001 Six Sigma Academy

8
MSA – Operational Definitions

The Measurement System can be
validated using Operational Definitions
constructed by the Project Team to
ensure that all measurement takers completely
understand what is expected during the data
collection phase.

© 2001 Six Sigma Academy

9
Developing Operational Definition

• Operational definitions are descriptions written in
a way that ensures consistent interpretation by
different people
• The operational definition method of description
will be used throughout the DMAIC process

© 2001 Six Sigma Academy

10
• Operational Definition
• The technique of defining an item, process or characteristic using
Operational Definitions is an effective way to communicate between
Team Members and other people involved in the project. Because
Operational Definitions are so effective, the technique is used in a
number of locations within the DMAIC process. Remember, to be
effective, an Operational Definition must be written in a way that
ensures consistent interpretation by different people.

© 2001 Six Sigma Academy

11
General Example – Operational Definitions

• Examples of Operational Definitions for data collection:
• Record the date that the lease company written notification arrives
in the dealership using an MM/DD/YY format.
• List any cosmetic preparation in excess of the standard pre-delivery process
required to render the vehicle acceptable for retail consumer sale.
• Record the weight of each package of coffee in ounces by pouring
the coffee into the filter and placing the filter and coffee on the
scale tray.
• Record the length of time that coffee remains in the urn by
recording the actual time of day each time the Brew button is
pressed to recharge the urn. Use 24-hour clock and round to the
nearest minute.
© 2001 Six Sigma Academy

12
MSA – Walking the Process

“Walking the Process” is a method
of conducting MSA when it is not possible
to perform a Gage R&R.

© 2001 Six Sigma Academy

13
How to “Walk the Process”

• Develop Operational Definitions for each of the measures to be
collected
• Train data collectors prior to beginning the data collection activity
• Follow the process from beginning to end and monitor the data
collection activities to determine if data is being collected properly
• Continue walking the process until the data compiled accurately
reflects the existing process

© 2001 Six Sigma Academy

14
Components Of Measurement Error

© 2001 Six Sigma Academy

15
Components Of Measurement Error
• Resolution/Discrimination
• Accuracy (bias effects)
• Linearity
• Stability (consistency)
• Repeatability-test-retest (Precision)
• Reproducibility (Precision)

Each component of measurement error can contribute to variation,
causing wrong decisions to be made
© 2001 Six Sigma Academy

16
Categories Of Measurement Error Which
Affect Location
Accuracy/Bias · Linearity · Stability

© 2001 Six Sigma Academy

17
Categories Of Measurement Error Which
Affect Spread

Precision: Repeatability · Reproducibility

© 2001 Six Sigma Academy

18
Resolution/Discrimination

Resolution? (Can change be detected?) → OK → Accuracy/Bias? → OK → Linearity? → OK → Stability? → OK → Precision (R&R)?

© 2001 Six Sigma Academy

19
Resolution
• Simplest measurement system problem
• Poor resolution is a common issue
• Impact is rarely recognized and/or addressed
• Easily detected
• No special studies are necessary
• No “known standards” are needed

© 2001 Six Sigma Academy

20
Definitions:
• Resolution/Discrimination
• Capability to detect the smallest tolerable changes
• Inadequate Measurement Units
• Measurement units too large to detect variation present
• Guideline: “10 Bucket Rule”
• Increments in the measurement system should be one-tenth the
product specification or process variation

© 2001 Six Sigma Academy

21
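The “10 Bucket Rule” above can be turned into a quick computational check. The sketch below is illustrative only and is not part of the original course material; the function name and the example inputs are assumptions.

```python
# Hypothetical sketch of the "10 Bucket Rule" described on the previous slide.
# Assumed inputs: the gage's smallest increment, the spec limits, and an
# estimate of the process standard deviation (names are illustrative only).

def ten_bucket_check(increment, usl, lsl, process_sigma):
    """Return True if the gage increment is fine enough for the 10 Bucket Rule."""
    tolerance = usl - lsl
    process_spread = 6.0 * process_sigma          # common convention for process variation
    # The increment should be no larger than one-tenth of the smaller of the
    # product specification width or the process variation.
    limit = min(tolerance, process_spread) / 10.0
    return increment <= limit

# Example: a gage reading to 0.1 against a 0.5-wide tolerance fails the rule
# (0.5 / 10 = 0.05 < 0.1), while a gage reading to 0.01 passes.
print(ten_bucket_check(increment=0.1, usl=1.0, lsl=0.5, process_sigma=0.1))   # False
print(ten_bucket_check(increment=0.01, usl=1.0, lsl=0.5, process_sigma=0.1))  # True
```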
Resolution/Discrimination
Poor Discrimination vs. Better Discrimination: the same process output is being measured in both cases.

[Figure: two scales graduated 1 to 5. On the coarse scale the output can only be read as “1”; on the finer scale the same output reads “1.3”.]

© 2001 Six Sigma Academy

22
Resolution Actions
• Measure to as many decimal places as possible
• Use a device that can measure smaller units
• Live with it, but document that the problem exists
• Larger sample size may overcome problem
• Priorities may need to involve other considerations:
• Engineering tolerance
• Process Capability
• Cost and difficulty in replacing device

© 2001 Six Sigma Academy

23
Accuracy/Bias
Resolution? → OK → Accuracy/Bias? → OK → Linearity? → OK → Stability? → OK → Precision (R&R)?
Accuracy/Bias: measurements are “shifted” from the “true” value

© 2001 Six Sigma Academy

24
Accuracy/Bias
Difference between the observed average value of measurements and
the master value
[Figure: a distribution of measurements whose average value is offset from the master value (reference standard).]
The master value is an accepted, traceable reference standard.

© 2001 Six Sigma Academy

25
Accuracy/Bias

[Figure: two clusters of repeated readings around a target value; the cluster centered on the target is more accurate, the offset cluster is less accurate.]

© 2001 Six Sigma Academy

26
Accuracy/Bias Actions
• Calibrate when needed/scheduled
• Use operations instructions
• Review specifications
• Review software logic
• Create Operational Definitions

© 2001 Six Sigma Academy

27
Linearity
Resolution? → OK → Accuracy/Bias? → OK → Linearity? → OK → Stability? → OK → Precision (R&R)?
Linearity: measurement is not “true” and/or consistent across the range of the “gage”

© 2001 Six Sigma Academy

28
Linearity

[Figure: observed average value plotted against reference value over the full range of the gage; the gage shows no bias at one end of the range but increasing bias toward the other.]

© 2001 Six Sigma Academy

29
Linearity Actions
• Use only in restricted range
• Rebuild
• Use with correction factor/table/curve
• Sophisticated study required and will not be discussed in this course

© 2001 Six Sigma Academy

30
Stability
Resolution? → OK → Accuracy/Bias? → OK → Linearity? → OK → Stability? → OK → Precision (R&R)?
Stability: measurement drifts over time

© 2001 Six Sigma Academy

31
Stability
• Measurements remain constant and predictable over time
• For both mean and standard deviation
• No drifts, sudden shifts, cycles, etc.
• Evaluated using control charts (see the sketch after this slide)
[Figure: the same distribution centered on the master value (reference standard) at Time 1 and at Time 2.]

© 2001 Six Sigma Academy

32
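One common way to evaluate stability with control charts, as the slide above suggests, is to measure a reference standard repeatedly over time and plot the readings. The sketch below is a minimal illustration and not from the course: the readings are made up, and 2.66 and 3.267 are the standard Individuals & Moving Range chart factors.

```python
# Minimal stability-monitoring sketch (assumption, not the course method):
# repeated readings of one reference standard placed on an I-MR control chart.
readings = [10.01, 10.02, 9.99, 10.00, 10.03, 9.98, 10.01, 10.02, 10.00, 9.99]

mean = sum(readings) / len(readings)
moving_ranges = [abs(b - a) for a, b in zip(readings, readings[1:])]
mr_bar = sum(moving_ranges) / len(moving_ranges)

ucl_x, lcl_x = mean + 2.66 * mr_bar, mean - 2.66 * mr_bar   # individuals limits
ucl_mr = 3.267 * mr_bar                                     # moving-range limit

out_of_control = [x for x in readings if not (lcl_x <= x <= ucl_x)]
print(f"mean={mean:.3f}  limits=({lcl_x:.3f}, {ucl_x:.3f})  out-of-control points={out_of_control}")
```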
Stability Actions
• Change/adjust components
• Establish “life” timeframe
• Use control charts
• Use/update current SOP

© 2001 Six Sigma Academy

33
Precision
Resolution? → OK → Accuracy/Bias? → OK → Linearity? → OK → Stability? → OK → Precision (R&R)?
Precision: Repeatability and Reproducibility

© 2001 Six Sigma Academy

34
Precision
σ²total = σ²product/process + σ²repeatability + σ²reproducibility

[Figure: measurements of a master value by two gages, A and B; A shows good precision (narrow spread) and B shows poor precision (wide spread).]

Also known as Gage R&R. (A small simulation of this variance identity follows this slide.)

© 2001 Six Sigma Academy

35
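The variance identity on the slide above can be illustrated with a small simulation. This sketch is an assumption-laden illustration, not course material: the part values, operator biases, and repeatability noise are drawn from made-up normal distributions, and with only 10 parts and 3 operators the empirical total variance only roughly matches the sum of the components.

```python
# Illustrative simulation (not from the course) of:
# total observed variance = product/process variance + repeatability + reproducibility.
import random

random.seed(1)
sigma_part, sigma_repeat, sigma_reprod = 0.20, 0.05, 0.03   # made-up component sigmas

parts = [random.gauss(1.0, sigma_part) for _ in range(10)]          # 10 process outputs
operator_bias = [random.gauss(0.0, sigma_reprod) for _ in range(3)] # 3 operators

# Each operator measures each part 20 times; repeatability is the within-cell noise.
obs = [p + b + random.gauss(0.0, sigma_repeat)
       for p in parts for b in operator_bias for _ in range(20)]

mean = sum(obs) / len(obs)
var_total = sum((x - mean) ** 2 for x in obs) / (len(obs) - 1)

# With so few parts and operators the agreement is only approximate.
print(f"observed total variance ≈ {var_total:.4f}")
print(f"sum of the 3 components = {sigma_part**2 + sigma_repeat**2 + sigma_reprod**2:.4f}")
```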
Repeatability (A Component Of Precision)
• Variation that occurs when repeated measurements are made of the
same item under absolutely identical conditions
• Same:
• Operator
• Set-up
• Units
• Environmental conditions
• Short-term

© 2001 Six Sigma Academy

36
Reproducibility (A Component Of Precision)
The variation that results when different conditions are used to make the
measurements
• Different:
• Operators
• Set-ups
• Test units
• Environmental conditions
• Locations
• Companies
• Long-term

© 2001 Six Sigma Academy

37
R&R Actions
Repeatability
• Repair, replace, adjust equipment
• SOP
Reproducibility
• Training
• SOP

© 2001 Six Sigma Academy

38
Attribute Measurement Studies

© 2001 Six Sigma Academy

39
Purpose Of Attribute MSA
• Assess standards against customers’ requirements
• Determine if all appraisers use the same criteria
• Quantify repeatability and reproducibility of operators
• Identify how well measurement system conforms to a “known master”
• Discover areas where:
• Training is needed
• Procedures are lacking
• Standards are not defined

© 2001 Six Sigma Academy

40
Attribute MSA - Excel Method
• Allows for R&R analysis within and between appraisers
• Test for effectiveness against standard
• Limited to nominal data at two levels

© 2001 Six Sigma Academy

41
Attribute MSA Example
Attribute Legend (used in computations): 1 = Pass, 2 = Fail
Open file MSA-Attribute.xls
DATE: 1/4/2001    NAME: Acme Employee    PRODUCT: Widgets    BUSINESS: Earth Products

[Microsoft Excel worksheet: 30 samples, each with a known-population attribute (Pass/Fail). Each of three operators (Operator #1, #2, #3) assessed every sample on two tries (Try #1 and Try #2).]

© 2001 Six Sigma Academy
42
Scoring Example
% APPRAISER SCORE →  100.00%   78.57%   100.00%
% SCORE VS. ATTRIBUTE →  78.57%   64.29%   71.43%
SCREEN % EFFECTIVE SCORE →  57.14%
SCREEN % EFFECTIVE SCORE vs. ATTRIBUTE →  42.86%

• 100% is target for all scores
• <100% indicates training required
• % Appraiser score = repeatability
• Screen % Effectiveness Score = reproducibility
• % Score vs. Attribute
• individual error against a known population
• Screen % Effective vs. Attribute
• Total error against a known population

© 2001 Six Sigma Academy

43
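For readers who want the scoring rules spelled out, the following is a hedged sketch of how the appraiser and screen scores above could be computed. The data structure and function logic are illustrative assumptions, not the course's Excel template (which handles 30 samples and three operators).

```python
# Hedged sketch of attribute MSA scoring; the sample data are made up.
known = ["Pass", "Pass", "Fail", "Pass", "Fail", "Pass"]   # known population
trials = {
    "Op1": (["Pass", "Pass", "Fail", "Pass", "Fail", "Pass"],
            ["Pass", "Fail", "Fail", "Pass", "Fail", "Pass"]),
    "Op2": (["Pass", "Pass", "Fail", "Pass", "Pass", "Pass"],
            ["Pass", "Pass", "Fail", "Pass", "Pass", "Pass"]),
}

n = len(known)
for op, (t1, t2) in trials.items():
    appraiser = sum(a == b for a, b in zip(t1, t2)) / n               # repeatability
    vs_attr = sum(a == b == k for a, b, k in zip(t1, t2, known)) / n  # vs. known population
    print(f"{op}: % appraiser score = {appraiser:.0%}, % score vs. attribute = {vs_attr:.0%}")

# Screen scores: all trials of all appraisers must agree (and also match the known).
all_trials = [t for pair in trials.values() for t in pair]
screen = sum(len({t[i] for t in all_trials}) == 1 for i in range(n)) / n
screen_vs_attr = sum(len({t[i] for t in all_trials} | {known[i]}) == 1 for i in range(n)) / n
print(f"screen % effective = {screen:.0%}, screen % effective vs. attribute = {screen_vs_attr:.0%}")
```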
Statistical Report

© 2001 Six Sigma Academy

44
Statistical Report

© 2001 Six Sigma Academy

45
Statistical Report

Continued

© 2001 Six Sigma Academy

46
Attribute MSA – MINITAB™ Method
• Allows for R&R analysis within and between appraisers
• Tests for effectiveness against a standard
• Allows nominal data with two levels
• Allows for ordinal data with more than two levels

© 2001 Six Sigma Academy

47
MINITAB Method - Data Entry
• Same data as Excel example
• Arranged in multiple columns
• Data can also be stacked in single column

© 2001 Six Sigma Academy

48
Attribute Study - MINITAB Analysis
Attribute MSA.MPJ

Tool Bar Menu > Stat > Quality Tools > Attribute Gage R&R Study

© 2001 Six Sigma Academy

49
Attribute Study - MINITAB Analysis

Continued

1. Select “Single Column” if data is stacked, or “Multiple Columns” if data is un-stacked
2. Enter number of appraisers and trials
3. Enter name of column with “Known”
4. Select OK

© 2001 Six Sigma Academy

50
Attribute MSA - MINITAB Graphical Output
Date of study: 1/03/2001    Reported by: Jose    Name of product: XYZ Report

[Assessment Agreement plots with 95.0% CIs for appraisers Bob, Sue, and Tom: “Within Appraiser” percent agreement and “Appraiser vs Standard” percent agreement (roughly 60–100% on the y-axis). Lower points indicate higher variation within an appraiser or versus the standard. The “Appraiser vs Standard” panel is not included if no “Known” standard column is supplied.]

© 2001 Six Sigma Academy
51
Attribute MSA – MINITAB Session Window
Results

Each Appraiser vs. Standard (individual vs. standard)
Assessment Agreement
Appraiser   # Inspected   # Matched   Percent (%)   95.0% CI
Bob         30            28          93.3          (77.9, 99.2)
Sue         30            29          96.7          (82.8, 99.9)
Tom         30            24          80.0          (61.4, 92.3)
# Matched: Appraiser's assessment across trials agrees with standard.

Assessment Disagreement (repeatability)
Appraiser   # Pass/Fail   Percent (%)   # Fail/Pass   Percent (%)   # Mixed   Percent (%)
Bob         1             3.3           1             3.3           0         0.0
Sue         1             3.3           0             0.0           0         0.0
Tom         1             3.3           0             0.0           5         16.7
# Pass/Fail: Assessments across trials = Pass / standard = Fail.
# Fail/Pass: Assessments across trials = Fail / standard = Pass.
# Mixed: Assessments across trials are not identical.

Between Appraisers (reproducibility)
Assessment Agreement
# Inspected   # Matched   Percent (%)   95.0% CI
30            24          80.0          (61.4, 92.3)
# Matched: All appraisers' assessments agree with each other.

All Appraisers vs. Standard (total agreement against the known)
Assessment Agreement
# Inspected   # Matched   Percent (%)   95.0% CI
30            23          76.7          (57.7, 90.1)
# Matched: All appraisers' assessments agree with standard.

© 2001 Six Sigma Academy
52
MINITAB Method - Ordinal Data Entry
Ordinal MSA.mtw
• Survey data rated on a 1 to 5 scale
• Arranged in multiple columns

© 2001 Six Sigma Academy

Minitab Worksheet

53
Attribute Study - Ordinal

Select “categories of the
attribute data are
ordered”

Analysis is same as 2 level data
© 2001 Six Sigma Academy

54
Industrial Attribute MSA Exercise
• Evaluate samples supplied by instructor
• Determine the screen and appraiser scores
• Interpret the results
• Recommend actions

iGrafx Professional Document

attributecircles.MPJ

© 2001 Six Sigma Academy

55
Variables Measurement Studies

© 2001 Six Sigma Academy

56
Six Step Variables MSA
1. Conduct initial gage calibration (or verification)
2. Perform trials and data collection
3. Obtain statistics via MINITAB
4. Analyze, interpret results
5. Check for inadequate measurement units
6. On-going evaluation
   • What would be your long-term gage plan?

© 2001 Six Sigma Academy

57
Trials And Data Collection
• Generally two to three operators
• Generally 5-10 process outputs to measure
• Each process output is measured 2-3 times (replicated) by each
operator
[Layout: Operator 1, Operator 2, and Operator 3 each measure process outputs P1–P5, with trials 1, 2, 3 on every output.]

Randomization is Critical
© 2001 Six Sigma Academy

58
Randomization, Repeats, Replicates
Randomization
• Runs are made in an arbitrary vs. patterned order
• Average out effects of noise or unknown factors
• Tradeoff - Invalid results versus slight inconvenience (if any)
Repeats
• Running more than one sample of a single run
• Results are averaged
Replication
• Running entire experiment in a time sequence
• MSA allows for repeatability study

© 2001 Six Sigma Academy

59
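A randomized run order like the one these slides call for can be generated ahead of time. The sketch below is only an illustration (the operator names and random seed are assumptions): it shuffles the part order separately for every operator within every replicate.

```python
# Minimal sketch (assumption, not the course procedure) of building a randomized
# run order for a crossed study: 10 process outputs x 3 operators x 2 replicates.
import random

random.seed(42)
parts, operators, replicates = range(1, 11), ["Op1", "Op2", "Op3"], [1, 2]

# Within each replicate and operator, present the parts in a fresh random order
# so neither the operator nor the gage "learns" a pattern.
run_order = []
for rep in replicates:
    for op in operators:
        order = list(parts)
        random.shuffle(order)
        run_order.extend((rep, op, part) for part in order)

for run in run_order[:5]:                 # first few runs only
    print(run)                            # e.g. (1, 'Op1', 7)
print(f"total runs: {len(run_order)}")    # 60
```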
Variables MSA - MINITAB Example
Variable MSA.MTW

[Worksheet: Replicate 1 and Replicate 2 (randomized order) measurement columns, with the spec limits USL and LSL marked.]

© 2001 Six Sigma Academy
60
MSA Using MINITAB
10 Process Outputs
3 Operators
2 Replicates

[Worksheet: Replicate 1 and Replicate 2 (randomized order) columns, with the spec limits USL and LSL shown.]

• Have Operator 1 measure all
samples once (as shown in
the outlined block)
• Then, have Operator 2
measure all samples once
• Continue until all operators
have measured samples
once (this is Replicate 1)
• Repeat these steps for the
required number of
Replicates
• Enter data into MINITAB in 3
columns as shown
© 2001 Six Sigma Academy

61
Manipulate The Data

Your data in MINITAB should initially
look like this. You will need to STACK
your data so that all like data is in one
column only
Use the commands
> Manip
> Stack
> Stack Blocks of Columns
(Stack all Process Outputs,
Operators, and Responses so
that they are in one column only)

Now you are ready to run the
macro for data analysis

© 2001 Six Sigma Academy

62
Stacked And Ready For Analysis

Note:
c10, c11, c12 are the columns in
which the respective data are
found IN OUR EXAMPLE. You
must have ALL data STACKED in
these columns
Enter titles

© 2001 Six Sigma Academy

63
Prepare The Analysis
Use the commands
> Stat > Quality Tools
> Gage R&R Study (Crossed)
Each process output
measured by each
operator
OR
> Gage R&R Study (Nested)
For “destructive tests”
where each process
output is measured
uniquely by each
operator
© 2001 Six Sigma Academy

64
Choose Method Of Analysis

Enter Gage
Info and
Options

ANOVA method is preferred
• Gives more information

© 2001 Six Sigma Academy

65
Adding Tolerance (Optional)
Tolerance = Upper Specification Limit (USL) minus Lower Specification Limit (LSL)
For this example: USL = 1.0, LSL = 0.5, so USL - LSL = 0.50

© 2001 Six Sigma Academy

66
MSA Output: Session Window

Gage R&R (ANOVA) for Response
Gage name:    Date of study:    Reported by:    Tolerance:    Misc:

Two-Way ANOVA Table With Interaction
Source          DF        SS        MS         F         P
Part             9   2.05871  0.228745   39.7178   0.00000
Operator         2   0.04800  0.024000    4.1672   0.03256
Operator*Part   18   0.10367  0.005759    4.4588   0.00016
Repeatability   30   0.03875  0.001292
Total           59   2.24912

Gage R&R
Source              VarComp   %Contribution (of VarComp)
Total Gage R&R     0.004437    10.67
  Repeatability    0.001292     3.10
  Reproducibility  0.003146     7.56
    Operator       0.000912     2.19
    Operator*Part  0.002234     5.37
Part-To-Part       0.037164    89.33
Total Variation    0.041602   100.00

Source              StdDev (SD)   Study Var (5.15*SD)   %Study Var (%SV)   %Tolerance (SV/Toler)
Total Gage R&R        0.066615        0.34306               32.66               68.61
  Repeatability       0.035940        0.18509               17.62               37.02
  Reproducibility     0.056088        0.28885               27.50               57.77
    Operator          0.030200        0.15553               14.81               31.11
    Operator*Part     0.047263        0.24340               23.17               48.68
Part-To-Part          0.192781        0.99282               94.52              198.56
Total Variation       0.203965        1.05042              100.00              210.08

Number of Distinct Categories = 4

[Graphs: Components of Variation bar chart (%Contribution, %Study Var, %Tolerance for Gage R&R, Repeat, Reprod, Part-to-Part); R Chart by Operator (UCL = 0.1252, R-bar = 0.03833, LCL = 0); Xbar Chart by Operator (UCL = 0.8796, mean = 0.8075, LCL = 0.7354); By Part; By Operator; and Operator*Part Interaction plots for parts 1–10 and operators 1–3.]

What does all this mean? (A sketch of the variance-component arithmetic follows this slide.)

© 2001 Six Sigma Academy

67
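To demystify the session window above, the sketch below reproduces the variance components from the printed mean squares using the standard expected-mean-squares relations for a crossed Gage R&R study. The mean squares are copied from the ANOVA table; only the arithmetic is shown, and it is not a substitute for running the study in MINITAB.

```python
# Standard crossed Gage R&R variance-component arithmetic, using the mean
# squares printed in the session window (10 parts, 3 operators, 2 replicates).
p, o, r = 10, 3, 2
ms_part, ms_oper, ms_interact, ms_repeat = 0.228745, 0.024000, 0.005759, 0.001292

var_repeat   = ms_repeat                             # repeatability
var_interact = (ms_interact - ms_repeat) / r         # operator*part
var_oper     = (ms_oper - ms_interact) / (p * r)     # operator
var_part     = (ms_part - ms_interact) / (o * r)     # part-to-part
var_grr      = var_repeat + var_interact + var_oper  # total Gage R&R
var_total    = var_grr + var_part

for name, v in [("Repeatability", var_repeat), ("Operator", var_oper),
                ("Operator*Part", var_interact), ("Total Gage R&R", var_grr),
                ("Part-To-Part", var_part), ("Total Variation", var_total)]:
    print(f"{name:16s} VarComp = {v:.6f}  ({100 * v / var_total:.2f}% contribution)")
# Matches the VarComp column above to rounding:
# 0.001292, 0.000912, 0.002234, 0.004438, 0.037164, 0.041602.
```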
Graphical Output - 6 Graphs In All
[The six Gage R&R (ANOVA) graphs again, annotated as the “MSA health side” and the “MSA troubleshoot side”: Components of Variation, R Chart by Operator, Xbar Chart by Operator, By Part, By Operator, and Operator*Part Interaction.]

• “If only 1 operator, you won’t get these graphs”
• “If nested study, you won’t get this graph” (the Operator*Part Interaction)

© 2001 Six Sigma Academy
68
Destructive Test
[Gage R&R (Nested) for Response: Components of Variation bar chart, By Part (Operator) plot, By Operator plot, R Chart by Operator (UCL = 4.290, R-bar = 1.313, LCL = 0), and Xbar Chart by Operator (UCL = 17.62, mean = 15.15, LCL = 12.68) for operators Billie, Nathan, and Steve.]

Operator by process output interaction is not applicable.

© 2001 Six Sigma Academy
69
Graphical Output Metrics
Chart Output
• Xbar Chart: Shows sampled process output variety
• Reproducibility/bias/sensitivity
• R Chart: Helps identify unusual measurements
• Resolution/repeatability
• Bar Chart: Distinguishes R&R from Process Output to Process
Output
• Components of variation

These are your leading graphical indicators
© 2001 Six Sigma Academy

70
Bar Charts For Components

[Two “Components of Variation” bar charts (%Contribution and %Study Var for Gage R&R, Repeatability, Reproducibility, and Part-to-Part). In the first, the Gage R&R bars dominate (“needs help”); in the second, Part-to-Part dominates (“much better”).]

Answers: “Where is the variation?”

© 2001 Six Sigma Academy
71
Closer Look At The Xbar & R Charts
R Chart: exposes gage repeatability, resolution & stability
Xbar Chart: tests sensitivity, bias, & population variety

Xbar chart: at least 50% of points outside the limits; R chart: in control
© 2001 Six Sigma Academy

72
More R Chart Indicators
[Two R charts. First: an R chart whose R-bar looks too small relative to the plotted ranges (UCL = 0.001416, R-bar = 4.33E-04, LCL = 0): “Rbar too small?”. Second: an R Chart by Operator whose points sit on a few repeated levels, or “plateaus” (UCL = 0.1252, R-bar = 0.03833, LCL = 0).]

Both may indicate poor gage resolution

© 2001 Six Sigma Academy
73
Tabular Output Metrics
• %Contribution
• %Study
• %Tolerance
• Number of Distinct Categories

© 2001 Six Sigma Academy

74
% Contribution

% Contribution = (σ²R&R / σ²Total) × 100

• Measurement System Variation (R&R) as a percentage of Total Observed Process Variation
• Includes both repeatability and reproducibility

% Contribution guideline: 9% (marginal) / 1% (desirable)

© 2001 Six Sigma Academy

75
% Study Variation

% Study Variation = (σR&R / σTotal) × 100

• Looks at standard deviations instead of variance
• Measurement System Standard Deviation (R&R) as a percentage of Total Observed Process Standard Deviation
• Includes both repeatability and reproducibility

% Study Variation guideline: 30% (marginal) / 10% (desirable)

© 2001 Six Sigma Academy

76
% Tolerance
Precision to Tolerance (P/T)

% Tolerance = (5.15 × σR&R / Tolerance) × 100

• Measurement error as a percent of tolerance
• Includes both repeatability and reproducibility
• 5.15 standard deviations span 99% of the study variation

Acceptance criteria: 30% (marginal) / 10% (desirable)

© 2001 Six Sigma Academy

77
Distinct Categories
 2
σ Process Output 

Number of Distinct Categories = 2 * 
2

σ R &R 


• Number of divisions that the Measurement System can accurately
measure across the process variation
• How well a measurement process can detect process output variationprocess shifts and improvement
Number of Distinct
• Less than 5 indicates Attribute conditions
Categories
5
10
© 2001 Six Sigma Academy

78
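The four tabular metrics from the preceding slides can all be computed from the same two variance components. The sketch below is illustrative only: it reuses the Gage R&R and part-to-part variance components and the 0.5 tolerance from the module's worked example; the variable names are assumptions.

```python
# Hedged sketch tying the four tabular metrics together (values copied from the
# worked Gage R&R example earlier in the module; 5.15 sigma spans ~99%).
import math

var_rr, var_part = 0.004437, 0.037164        # Gage R&R and part-to-part variance
var_total = var_rr + var_part
tolerance = 0.5                              # USL - LSL for the example

pct_contribution = 100 * var_rr / var_total
pct_study        = 100 * math.sqrt(var_rr) / math.sqrt(var_total)
pct_tolerance    = 100 * 5.15 * math.sqrt(var_rr) / tolerance
ndc              = math.floor(math.sqrt(2 * var_part / var_rr))

print(f"%Contribution = {pct_contribution:.1f}")   # ~10.7 -> marginal
print(f"%Study Var    = {pct_study:.1f}")          # ~32.7 -> too high
print(f"%Tolerance    = {pct_tolerance:.1f}")      # ~68.6 -> too high
print(f"Distinct categories = {ndc}")              # 4     -> below 5
```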
Acceptability Summary
Tabular Method

Indicator    % Contribution   % Study Variation (Process Control)   % Tolerance (Product Control)   Number of Distinct Categories
Marginal           9%                      30%                                  30%                               5
Desirable          1%                      10%                                  10%                              10

Desirable to Have All 4 Indicators Say “Go”
© 2001 Six Sigma Academy

79
Keys To Successful MSA
• Define and validate measurement process
• Identify known elements of the measurement process (operators,
gages, SOP, setup, etc.)
• Clarify purpose and strategy for evaluation
• Set acceptance criteria
• Implement preventive/corrective action procedures
• Establish on-going assessment criteria and schedules

© 2001 Six Sigma Academy

80
Gage R&R - Which % Gage R&R Do I Use?
Depending on how variable your process is as compared to tolerance, your
% Gage R&R values as a percent of Study variation, Tolerance and Process
Variation will be quite different.
For example:
Consider a very stable process with low variability. Percent Tolerance will
indicate that your gauge is very good (low % GRR) with high
discrimination. On the other hand, when compared to process variation, the
GRR will be poor (High % GRR).
As your process improves, you will need to move to more precise gauges if
you wish to “see” decreases in variation due to the measuring system. On
the other hand, if you truly only want to be able to tell when production is
becoming less capable, then you are only interested in the precision of the
gauge as it relates to your customer’s specification. See the Appendix at the
end of this module for further examples
© 2001 Six Sigma Academy

81
Gage R&R, Graphical Output:

[Three panels from Gage R&R (ANOVA) for Measure (Gage #020371, study date 01/01/1998, reported by Six Sigma BB, tolerance 1.5 mm, Buffalo, NY plant): Operator*Part Interaction (operators 1–3 across Part IDs 1–10), By Operator, and By Part.]

• Operator * Part Interaction:
  • Shows if any given part(s) was hard to measure for any given operator(s)
  • Appears as though at least two of the operators had trouble measuring part #10
  • What would the ideal graph look like?
• By Operator:
  • Shows if any operator(s) had higher or lower readings (on average) than the others
  • What would the ideal graph look like?
• By Part:
  • Shows the ability of all of our operators to obtain the same readings for each part
  • Also shows the ability of our measurement system to distinguish between parts (amount of overlap)
  • What would the ideal graph look like?

© 2001 Six Sigma Academy
82
Gage R&R, Xbar & R:

• How do we evaluate the X-bar & R-chart?
• Why are the data points out of control on the X-bar and R chart?

[Xbar Chart by Operator (3.0SL = 0.8796, X-bar = 0.8075, -3.0SL = 0.7354) and R Chart by Operator (3.0SL = 0.1252, R-bar = 0.03833, -3.0SL = 0) from Gage R&R (ANOVA) for Measure: Gage #020371, 01/01/1998, Six Sigma BB, tolerance 1.5 mm, Buffalo, NY plant.]
© 2001 Six Sigma Academy

83
Minitab, Gage Run Chart:
• Generates a run chart of measurements by operator and part id
• Allows us to visualize repeatability and reproducibility within and between
operator and part
• The center line is the overall average of the parts

• STAT > Quality Tools > Gage Run Chart
[Run chart of Measure by Part Number (1–10) and Operator (1, 2, 3); the center line is the overall average of the parts.]

© 2001 Six Sigma Academy

84
P/T Ratio Effect on Capability

[Chart: Actual Cp (y-axis, 0 to 6) versus Observed Cp (x-axis, 0.5 to 2.0) for P/T ratios from 0% to 70%. The larger the P/T ratio, the more the observed Cp understates the actual Cp. A sketch of this relationship follows this slide.]
© 2001 Six Sigma Academy

85
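The chart above can be reproduced from the fact that measurement variance adds to true process variance. The sketch below is an illustration of that relationship, not a recreation of the slide's exact curves; the function name and example values are assumptions.

```python
# Illustrative sketch of why a large P/T ratio shrinks observed Cp:
# measurement variance adds to the true process variance.
import math

def observed_cp(actual_cp, pt_ratio, tolerance=1.0):
    sigma_true = tolerance / (6 * actual_cp)             # from Cp = T / (6*sigma)
    sigma_ms   = pt_ratio * tolerance / 5.15             # from P/T = 5.15*sigma_ms / T
    sigma_obs  = math.sqrt(sigma_true**2 + sigma_ms**2)  # variances add
    return tolerance / (6 * sigma_obs)

for pt in (0.0, 0.10, 0.30, 0.70):
    print(f"P/T = {pt:.0%}: actual Cp 2.0 is observed as {observed_cp(2.0, pt):.2f}")
```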
% R&R Vs. Capability

Which Might Need The Most Attention?
Measurement System or Process Capability

Process   %R&R   Obs. Cp   Decision?
1         10%    0.5       ?
2         60%    1.4       ?
3         60%    0.5       ?
4         70%    6.5       ?

© 2001 Six Sigma Academy

86
% R&R Vs. Capability
Which Might Need The Most Attention?
Measurement System or Process Capability

Process   %R&R   Obs. Cp   Decision?
1         10%    0.5       Capability
2         60%    1.4       Measurement
3         60%    0.5       Maybe Both
4         70%    6.5       Measurement

*Note: Process Step 4
Would improving %R&R really be worth the effort ?
© 2001 Six Sigma Academy

87
Handling Poor Gage Capability:
• If a dominant source of variation is repeatability (equipment), you need to replace, repair, or otherwise adjust the equipment.
• If, in consultation with the equipment vendor or upon searches of industry literature, you find that the gage technology that you are using is “state-of-the-art” and it is performing to its specifications, you should still fix the gage. One temporary solution to this problem is to use signal averaging (see next page).
• If a dominant source of variation is operator (reproducibility), you must address this via training and definition of the standard operating procedure. You should look for differences between operators to give you some indication as to whether it is a training, skill, and/or procedure problem.
• Evaluate the specifications. Are they reasonable?
• If the gage capability is marginal (as high as 30% of study variation) and the process is operating at a high capability (Ppk greater than 2), then the gage is probably not hindering you and you can continue to use it.

© 2001 Six Sigma Academy

88
Controlling Repeatability:
• Note: if you want to decrease your gage error, take advantage of the standard error of the mean, which shrinks with the square root of the sample size.
• The signal averaging technique uses 1/√n, where:
  • n = the number of repeat measures taken on the same part
  • the measurement = the average of the n readings
  • Example: a gage error of 50% can be cut in half if your point estimate is an average of 4 repeat measurements, since 1/√4 = 1/2
• This technique should be used as a short-term approach to perform a study, but you must fix the gage. (A small sketch follows this slide.)

[Figure: the wide distribution of individual readings versus the narrower distribution of means.]

© 2001 Six Sigma Academy

89
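The 1/√n effect described above is easy to tabulate. This small sketch is illustrative only; the 50% starting gage error is taken from the example on the slide.

```python
# Sketch (assumption, not course material) of the signal-averaging effect:
# averaging n repeat readings shrinks gage error by a factor of 1/sqrt(n).
import math

gage_error_pct = 50.0                       # e.g. 50% Gage R&R, as in the slide's example
for n in (1, 2, 4, 9):
    reduced = gage_error_pct / math.sqrt(n)
    print(f"average of {n} repeat reading(s): effective gage error ≈ {reduced:.0f}%")
# n = 4 cuts the 50% error in half, matching the example. Remember this is only
# a short-term workaround; the gage itself still needs to be fixed.
```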
Other Statistical Indexes
The Signal-to-Noise Ratio (S/N Ratio) relates the product variation to the measurement system variation. The S/N Ratio should be as large as possible.

S/N Ratio = σP / σMS

The Discrimination Index provides the number of divisions that the Measurement System can accurately measure across the part (sample) variation. If this index is less than 4, then it is inadequate to provide data for a study. If the index is 4, then it is equivalent to a go/no-go gage. We would like to see a value of 5 or greater.

Discrim = (σp / σms) × 1.41

(A sketch computing both indexes follows this slide.)
© 2001 Six Sigma Academy

90
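Both indexes can be computed directly from the part-to-part and measurement-system standard deviations. The sketch below is illustrative; it plugs in the sigma values from the module's worked Gage R&R example, and the variable names are assumptions.

```python
# Illustrative calculation of the two indexes defined above, using sigmas from
# the module's worked Gage R&R example.
sigma_p, sigma_ms = 0.192781, 0.066615     # part-to-part and measurement-system sigma

sn_ratio = sigma_p / sigma_ms              # signal-to-noise: bigger is better
discrim  = (sigma_p / sigma_ms) * 1.41     # discrimination index

print(f"S/N ratio = {sn_ratio:.2f}")                                # ~2.9
print(f"Discrimination index = {discrim:.2f}  (want 5 or more)")    # ~4.1
```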
Effects of P/T and S/N Ratios
• The effect of P/T on Cpk

• Large P/T reduces the process Cpk from the true value to
some smaller observed value.
• The effect of P/T on part assessment

• Large P/T increases the probability that we will misclassify
product as defective when it’s really good and vice versa.
• The effect of S/N ratio on control chart sensitivity

• small S/N increases the time before an out-of-control
process is detected by a control chart (refer to X-bar &
range)
• The Effect of the Discrimination Index

• If the Index = 2, only attribute data is available and sample
sizes must be larger.
• If the Index is 5 to 10, then discrimination is finer and
sample sizes can be smaller.
© 2001 Six Sigma Academy

91
Calibration Steps
• Determine if the measurement system needs to be recalibrated

• Determine the minimum number of measurements
needed to make this decision
• Take data and make decision
• If yes, recalibrate system
• Why don’t we just recalibrate?

• Normal variation causes the measurement to be slightly
different each time it is used
• Recalibration should be done only when the
measurements are off by more than the normal variation
• Recalibrating a system when it is not needed can increase
the variability in the measurements

© 2001 Six Sigma Academy

92
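One hedged way to make the "recalibrate or not" decision described above is a one-sample t-test of repeated readings of a reference standard against the master value. The sketch below is an illustration, not the course's procedure; the readings and the 0.05 threshold are made-up assumptions.

```python
# Decide whether the gage's bias exceeds normal variation before recalibrating.
from scipy import stats

master_value = 10.00
readings = [10.02, 9.99, 10.03, 10.01, 10.00, 10.02, 9.98, 10.01]

t_stat, p_value = stats.ttest_1samp(readings, master_value)
if p_value < 0.05:
    print(f"p = {p_value:.3f}: bias is larger than normal variation - recalibrate")
else:
    print(f"p = {p_value:.3f}: no evidence of bias - leave the gage alone")
```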
Appendix

© 2001 Six Sigma Academy

93
Interpreting Variables GR&R Results
Presented on the following slides are four Variable Gage R&R results - %
Study (P/TV - Precision to Total Variation) and % Tolerance (P/T - Precision
to Tolerance) along with a representative graphical illustration to help
visualize the results and any required action to improve the Measurement
System. Also discussed is the effect of the GR&R on Cp.
– There are an infinite number of GR&R results (combinations of % Study and % Tolerance); use
these four relatively extreme scenarios to help you determine what actions you need to take
given your own results. Remember, we are looking for GR&R results of < 10%, although
anything less than 30% is considered barely acceptable (proceed with caution).
– These graphs are not drawn to scale, therefore, when reviewing this information do not
compare the relative size of the histograms between the scenarios, rather, compare the
histograms within the scenario to the Spec Limits. Actual data was not used to create these
histograms.
– These examples assume 10 parts were selected that represent the long-term capability of the
process being investigated. Three operators, 2 trials.
– No assumptions have been made as to the problem with the Measurement System.
– Actual data was not used to calculate the Cp indices. They were visually estimated, but are
assumed reasonable.
© 2001 Six Sigma Academy

94
Scenario #1
15% - % Study
15% - % Tolerance
[Figure: within the tolerance (LSL to USL), the observed total variation, the part contribution (part variation), and the gage contribution (precision) are shown as histograms of similar relative scale.]

In this example we observe a GR&R result that is acceptable, where the % Study Variation is the same as the % Tolerance Variation. The results are the same because the relative size of the Total Variation, TV (5.15 * σTotal), and the Tolerance, T (USL - LSL), are the same. Therefore, when we take the P/TV or P/T ratio, where P is the Precision of the Gage (5.15 * σMS), it is well below 30%.

This gage is deemed acceptable; no action is required. The only action is to improve the Process Capability. Furthermore, the observed Cp of this process is probably close to 1, as it appears 6 standard deviations of the process can fit inside the tolerance once.

Finally, as a result of the acceptable GR&R values, the observed Cp (what we measure) is considered to be the actual Cp.

© 2001 Six Sigma Academy

95
Scenario # 2
70% - % Study
70% - % Tolerance
[Figure: the observed total variation spans well beyond the tolerance (LSL to USL); the gage contribution (precision) is large relative to the part contribution (part variation).]

In this example we observe a GR&R where the % Study Variation is the same as the % Tolerance Variation, however the results are extremely unacceptable. The results are the same because the relative size of the Total Variation, TV (5.15 * σTotal), and the Tolerance, T (USL - LSL), are the same. Therefore, when we take the P/TV or P/T ratio, where P is the gage contribution (5.15 * σMS), it is very much above 30%, indicating the Measurement System cannot effectively discern part-to-part differences. The impact of a poor GR&R result is to inflate the variability of the product standard deviation.

In this example we absolutely need to fix the Measurement System!!!

Finally, the observed Cp of this process (using this poor gage) is probably close to 0.5, as it appears that only half of the 6 standard deviations of the process can fit inside the tolerance. The actual Cp is probably much higher, maybe closer to 1 or 1.5. If the measurement system were improved and deemed acceptable, the observed Cp would reflect the actual Cp.

© 2001 Six Sigma Academy

96
Scenario #3
70% - % Study
5% - % Tolerance
[Figure: the tolerance (LSL to USL) is much wider than the observed total variation; the gage contribution (precision) is large relative to the part contribution (part variation) but small relative to the tolerance.]

Here we observe a GR&R where the % Study Variation is extremely unacceptable and the % Tolerance Variation is very acceptable. How can this be? In this example the Gage Precision, P (5.15 * σMS), compared to the Total Variation, TV (5.15 * σTotal), gives a P/TV that is quite large: 70%. However, when we compare the Gage Precision to the Tolerance (USL - LSL), P/T, we observe a very acceptable GR&R of 5%.

Do we need to fix our Measurement System? Well, that depends: if we are still looking for process improvement, then we should fix the measurement system. If, however, we do not need to improve the process capability, then our measurement system is acceptable.

In this example our observed Cp is probably close to 2 (99.73% of our process variability can fit into our customer tolerance), whereas the actual Cp may be significantly higher. If for some reason the part variation began to increase to the size of the Tolerance, then we would observe our gage as acceptable.

© 2001 Six Sigma Academy

97
Scenario #4
5% - % Study
70% - % Tolerance
[Figure: the observed total variation is much wider than the tolerance (LSL to USL); the gage contribution (precision) is small relative to the part contribution (part variation) but large relative to the tolerance.]

Here we observe a GR&R where the % Study Variation is acceptable and the % Tolerance Variation is very unacceptable. How can this be? In this example the Gage Precision, P (5.15 * σMS), compared to the Total Variation, TV (5.15 * σTotal), gives a P/TV that is very small: 5%. However, when we compare the Gage Precision to the Tolerance (USL - LSL), P/T, we observe a very large GR&R of 70%.

Do we need to fix our Measurement System? Yes, we need to fix the measurement system. In this example, the observed Cp will be the actual Cp and it is probably about 0.2 to 0.4. However, as we work our Six Sigma project and reduce the variability of our KPOV to improve our Process Capability, our % Study Variation will become worse (% Tolerance will remain constant). When our Process Variation is the same size as the Tolerance, both GR&R's will be 70% and our observed Cp will not reflect the actual. Therefore, improvement of the measurement system is required.

© 2001 Six Sigma Academy

98

Más contenido relacionado

La actualidad más candente

La actualidad más candente (20)

Measurement System Analysis (MSA)
Measurement System Analysis (MSA)Measurement System Analysis (MSA)
Measurement System Analysis (MSA)
 
MSA
MSAMSA
MSA
 
MSA (GR&R)
MSA (GR&R)MSA (GR&R)
MSA (GR&R)
 
Measuremen Systems Analysis Training Module
Measuremen Systems Analysis Training ModuleMeasuremen Systems Analysis Training Module
Measuremen Systems Analysis Training Module
 
Spc
SpcSpc
Spc
 
Dmaic
DmaicDmaic
Dmaic
 
MSA presentation
MSA presentationMSA presentation
MSA presentation
 
Measurement System Analysis
Measurement System AnalysisMeasurement System Analysis
Measurement System Analysis
 
Measurement System Analysis
Measurement System AnalysisMeasurement System Analysis
Measurement System Analysis
 
Measurement system analysis
Measurement system analysisMeasurement system analysis
Measurement system analysis
 
Msa la
Msa laMsa la
Msa la
 
02trainingmaterialformsa (1) 111030062223-phpapp02
02trainingmaterialformsa (1) 111030062223-phpapp0202trainingmaterialformsa (1) 111030062223-phpapp02
02trainingmaterialformsa (1) 111030062223-phpapp02
 
Attribute MSA presentation
Attribute MSA presentationAttribute MSA presentation
Attribute MSA presentation
 
Msa training
Msa trainingMsa training
Msa training
 
6.2 msa-gauge-r&r
6.2 msa-gauge-r&r6.2 msa-gauge-r&r
6.2 msa-gauge-r&r
 
7 qc tools training material[1]
7 qc tools training material[1]7 qc tools training material[1]
7 qc tools training material[1]
 
WEBINAR: Introduction to DMAIC
WEBINAR: Introduction to DMAICWEBINAR: Introduction to DMAIC
WEBINAR: Introduction to DMAIC
 
7 new qc tools
7 new qc tools7 new qc tools
7 new qc tools
 
Spc training
Spc trainingSpc training
Spc training
 
Gage R&amp;R Measurement Systems Analysis Sample Slides
Gage R&amp;R Measurement Systems Analysis Sample SlidesGage R&amp;R Measurement Systems Analysis Sample Slides
Gage R&amp;R Measurement Systems Analysis Sample Slides
 

Similar a Measurement systems analysis v1.1

Ppt total quality management
Ppt total quality managementPpt total quality management
Ppt total quality managementAnitha Velusamy
 
Innovation day 2013 2.5 joris vanderschrick (verhaert) - embedded system de...
Innovation day 2013   2.5 joris vanderschrick (verhaert) - embedded system de...Innovation day 2013   2.5 joris vanderschrick (verhaert) - embedded system de...
Innovation day 2013 2.5 joris vanderschrick (verhaert) - embedded system de...Verhaert Masters in Innovation
 
Essential Statistical Methods for Process & Product Optimization
Essential Statistical Methods for Process & Product OptimizationEssential Statistical Methods for Process & Product Optimization
Essential Statistical Methods for Process & Product OptimizationSafetyChain Software
 
Proven Methods to Abnormality Management and Error Proofing
Proven Methods to Abnormality Management and Error ProofingProven Methods to Abnormality Management and Error Proofing
Proven Methods to Abnormality Management and Error ProofingSafetyChain Software
 
Introduction to Six Sigma.pptx
Introduction to Six Sigma.pptxIntroduction to Six Sigma.pptx
Introduction to Six Sigma.pptxAshweeniTiwari
 
Module 4_Session 5.pptx_Operations Management
Module 4_Session 5.pptx_Operations ManagementModule 4_Session 5.pptx_Operations Management
Module 4_Session 5.pptx_Operations ManagementAnushreeSingh49
 
APM Best Practices - Reliability Added Value
APM Best Practices - Reliability Added ValueAPM Best Practices - Reliability Added Value
APM Best Practices - Reliability Added ValueStork
 
[Pem Zhipeng Xie] project management: lean six sigma
[Pem Zhipeng Xie] project management: lean six sigma[Pem Zhipeng Xie] project management: lean six sigma
[Pem Zhipeng Xie] project management: lean six sigmaPem Zhipeng Xie
 
Applying Lean Sigma Into Validation
Applying Lean Sigma Into ValidationApplying Lean Sigma Into Validation
Applying Lean Sigma Into Validationtjcornish
 
Yellow belt training 68 s
Yellow belt training 68 sYellow belt training 68 s
Yellow belt training 68 sRachit Gaur
 
Six sigma in manufacturing industry
Six sigma in manufacturing industrySix sigma in manufacturing industry
Six sigma in manufacturing industryPrateek Chhajer
 
Lean Six Sigma overview Julian Kalac
Lean  Six Sigma overview Julian KalacLean  Six Sigma overview Julian Kalac
Lean Six Sigma overview Julian KalacJulian Kalac P.Eng
 

Similar a Measurement systems analysis v1.1 (20)

Ppt total quality management
Ppt total quality managementPpt total quality management
Ppt total quality management
 
Innovation day 2013 2.5 joris vanderschrick (verhaert) - embedded system de...
Innovation day 2013   2.5 joris vanderschrick (verhaert) - embedded system de...Innovation day 2013   2.5 joris vanderschrick (verhaert) - embedded system de...
Innovation day 2013 2.5 joris vanderschrick (verhaert) - embedded system de...
 
Essential Statistical Methods for Process & Product Optimization
Essential Statistical Methods for Process & Product OptimizationEssential Statistical Methods for Process & Product Optimization
Essential Statistical Methods for Process & Product Optimization
 
3 16-01 six-sigma
3 16-01 six-sigma3 16-01 six-sigma
3 16-01 six-sigma
 
Visual Management by Operational Excellence Consulting
Visual Management by Operational Excellence ConsultingVisual Management by Operational Excellence Consulting
Visual Management by Operational Excellence Consulting
 
Proven Methods to Abnormality Management and Error Proofing
Proven Methods to Abnormality Management and Error ProofingProven Methods to Abnormality Management and Error Proofing
Proven Methods to Abnormality Management and Error Proofing
 
Introduction to Six Sigma.pptx
Introduction to Six Sigma.pptxIntroduction to Six Sigma.pptx
Introduction to Six Sigma.pptx
 
Six sigma.ppt
Six sigma.pptSix sigma.ppt
Six sigma.ppt
 
Module 4_Session 5.pptx_Operations Management
Module 4_Session 5.pptx_Operations ManagementModule 4_Session 5.pptx_Operations Management
Module 4_Session 5.pptx_Operations Management
 
Tqm
TqmTqm
Tqm
 
APM Best Practices - Reliability Added Value
APM Best Practices - Reliability Added ValueAPM Best Practices - Reliability Added Value
APM Best Practices - Reliability Added Value
 
[Pem Zhipeng Xie] project management: lean six sigma
[Pem Zhipeng Xie] project management: lean six sigma[Pem Zhipeng Xie] project management: lean six sigma
[Pem Zhipeng Xie] project management: lean six sigma
 
Tqm
TqmTqm
Tqm
 
PMINYC Lean
PMINYC LeanPMINYC Lean
PMINYC Lean
 
Applying Lean Sigma Into Validation
Applying Lean Sigma Into ValidationApplying Lean Sigma Into Validation
Applying Lean Sigma Into Validation
 
Six Sigma Orntn
Six Sigma OrntnSix Sigma Orntn
Six Sigma Orntn
 
Yellow belt training 68 s
Yellow belt training 68 sYellow belt training 68 s
Yellow belt training 68 s
 
Six sigma in manufacturing industry
Six sigma in manufacturing industrySix sigma in manufacturing industry
Six sigma in manufacturing industry
 
Six sigma
Six sigmaSix sigma
Six sigma
 
Lean Six Sigma overview Julian Kalac
Lean  Six Sigma overview Julian KalacLean  Six Sigma overview Julian Kalac
Lean Six Sigma overview Julian Kalac
 

Último

ENGLISH 7_Q4_LESSON 2_ Employing a Variety of Strategies for Effective Interp...
ENGLISH 7_Q4_LESSON 2_ Employing a Variety of Strategies for Effective Interp...ENGLISH 7_Q4_LESSON 2_ Employing a Variety of Strategies for Effective Interp...
ENGLISH 7_Q4_LESSON 2_ Employing a Variety of Strategies for Effective Interp...JhezDiaz1
 
Choosing the Right CBSE School A Comprehensive Guide for Parents
Choosing the Right CBSE School A Comprehensive Guide for ParentsChoosing the Right CBSE School A Comprehensive Guide for Parents
Choosing the Right CBSE School A Comprehensive Guide for Parentsnavabharathschool99
 
Procuring digital preservation CAN be quick and painless with our new dynamic...
Procuring digital preservation CAN be quick and painless with our new dynamic...Procuring digital preservation CAN be quick and painless with our new dynamic...
Procuring digital preservation CAN be quick and painless with our new dynamic...Jisc
 
Virtual-Orientation-on-the-Administration-of-NATG12-NATG6-and-ELLNA.pdf
Virtual-Orientation-on-the-Administration-of-NATG12-NATG6-and-ELLNA.pdfVirtual-Orientation-on-the-Administration-of-NATG12-NATG6-and-ELLNA.pdf
Virtual-Orientation-on-the-Administration-of-NATG12-NATG6-and-ELLNA.pdfErwinPantujan2
 
Judging the Relevance and worth of ideas part 2.pptx
Judging the Relevance  and worth of ideas part 2.pptxJudging the Relevance  and worth of ideas part 2.pptx
Judging the Relevance and worth of ideas part 2.pptxSherlyMaeNeri
 
ECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptx
ECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptxECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptx
ECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptxiammrhaywood
 
AUDIENCE THEORY -CULTIVATION THEORY - GERBNER.pptx
AUDIENCE THEORY -CULTIVATION THEORY -  GERBNER.pptxAUDIENCE THEORY -CULTIVATION THEORY -  GERBNER.pptx
AUDIENCE THEORY -CULTIVATION THEORY - GERBNER.pptxiammrhaywood
 
FILIPINO PSYCHology sikolohiyang pilipino
FILIPINO PSYCHology sikolohiyang pilipinoFILIPINO PSYCHology sikolohiyang pilipino
FILIPINO PSYCHology sikolohiyang pilipinojohnmickonozaleda
 
ANG SEKTOR NG agrikultura.pptx QUARTER 4
ANG SEKTOR NG agrikultura.pptx QUARTER 4ANG SEKTOR NG agrikultura.pptx QUARTER 4
ANG SEKTOR NG agrikultura.pptx QUARTER 4MiaBumagat1
 
MULTIDISCIPLINRY NATURE OF THE ENVIRONMENTAL STUDIES.pptx
MULTIDISCIPLINRY NATURE OF THE ENVIRONMENTAL STUDIES.pptxMULTIDISCIPLINRY NATURE OF THE ENVIRONMENTAL STUDIES.pptx
MULTIDISCIPLINRY NATURE OF THE ENVIRONMENTAL STUDIES.pptxAnupkumar Sharma
 
HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...
HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...
HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...Nguyen Thanh Tu Collection
 
USPS® Forced Meter Migration - How to Know if Your Postage Meter Will Soon be...
USPS® Forced Meter Migration - How to Know if Your Postage Meter Will Soon be...USPS® Forced Meter Migration - How to Know if Your Postage Meter Will Soon be...
USPS® Forced Meter Migration - How to Know if Your Postage Meter Will Soon be...Postal Advocate Inc.
 
ACC 2024 Chronicles. Cardiology. Exam.pdf
ACC 2024 Chronicles. Cardiology. Exam.pdfACC 2024 Chronicles. Cardiology. Exam.pdf
ACC 2024 Chronicles. Cardiology. Exam.pdfSpandanaRallapalli
 
Transaction Management in Database Management System
Transaction Management in Database Management SystemTransaction Management in Database Management System
Transaction Management in Database Management SystemChristalin Nelson
 
What is Model Inheritance in Odoo 17 ERP
What is Model Inheritance in Odoo 17 ERPWhat is Model Inheritance in Odoo 17 ERP
What is Model Inheritance in Odoo 17 ERPCeline George
 
Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)
Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)
Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)lakshayb543
 
INTRODUCTION TO CATHOLIC CHRISTOLOGY.pptx
INTRODUCTION TO CATHOLIC CHRISTOLOGY.pptxINTRODUCTION TO CATHOLIC CHRISTOLOGY.pptx
INTRODUCTION TO CATHOLIC CHRISTOLOGY.pptxHumphrey A Beña
 

Último (20)

ENGLISH 7_Q4_LESSON 2_ Employing a Variety of Strategies for Effective Interp...
ENGLISH 7_Q4_LESSON 2_ Employing a Variety of Strategies for Effective Interp...ENGLISH 7_Q4_LESSON 2_ Employing a Variety of Strategies for Effective Interp...
ENGLISH 7_Q4_LESSON 2_ Employing a Variety of Strategies for Effective Interp...
 
Choosing the Right CBSE School A Comprehensive Guide for Parents
Choosing the Right CBSE School A Comprehensive Guide for ParentsChoosing the Right CBSE School A Comprehensive Guide for Parents
Choosing the Right CBSE School A Comprehensive Guide for Parents
 
YOUVE_GOT_EMAIL_PRELIMS_EL_DORADO_2024.pptx
YOUVE_GOT_EMAIL_PRELIMS_EL_DORADO_2024.pptxYOUVE_GOT_EMAIL_PRELIMS_EL_DORADO_2024.pptx
YOUVE_GOT_EMAIL_PRELIMS_EL_DORADO_2024.pptx
 
Procuring digital preservation CAN be quick and painless with our new dynamic...
Procuring digital preservation CAN be quick and painless with our new dynamic...Procuring digital preservation CAN be quick and painless with our new dynamic...
Procuring digital preservation CAN be quick and painless with our new dynamic...
 
Virtual-Orientation-on-the-Administration-of-NATG12-NATG6-and-ELLNA.pdf
Virtual-Orientation-on-the-Administration-of-NATG12-NATG6-and-ELLNA.pdfVirtual-Orientation-on-the-Administration-of-NATG12-NATG6-and-ELLNA.pdf
Virtual-Orientation-on-the-Administration-of-NATG12-NATG6-and-ELLNA.pdf
 
Judging the Relevance and worth of ideas part 2.pptx
Judging the Relevance  and worth of ideas part 2.pptxJudging the Relevance  and worth of ideas part 2.pptx
Judging the Relevance and worth of ideas part 2.pptx
 
ECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptx
ECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptxECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptx
ECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptx
 
AUDIENCE THEORY -CULTIVATION THEORY - GERBNER.pptx
AUDIENCE THEORY -CULTIVATION THEORY -  GERBNER.pptxAUDIENCE THEORY -CULTIVATION THEORY -  GERBNER.pptx
AUDIENCE THEORY -CULTIVATION THEORY - GERBNER.pptx
 
FILIPINO PSYCHology sikolohiyang pilipino
FILIPINO PSYCHology sikolohiyang pilipinoFILIPINO PSYCHology sikolohiyang pilipino
FILIPINO PSYCHology sikolohiyang pilipino
 
ANG SEKTOR NG agrikultura.pptx QUARTER 4
ANG SEKTOR NG agrikultura.pptx QUARTER 4ANG SEKTOR NG agrikultura.pptx QUARTER 4
ANG SEKTOR NG agrikultura.pptx QUARTER 4
 
MULTIDISCIPLINRY NATURE OF THE ENVIRONMENTAL STUDIES.pptx
MULTIDISCIPLINRY NATURE OF THE ENVIRONMENTAL STUDIES.pptxMULTIDISCIPLINRY NATURE OF THE ENVIRONMENTAL STUDIES.pptx
MULTIDISCIPLINRY NATURE OF THE ENVIRONMENTAL STUDIES.pptx
 
HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...
HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...
HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...
 
USPS® Forced Meter Migration - How to Know if Your Postage Meter Will Soon be...
USPS® Forced Meter Migration - How to Know if Your Postage Meter Will Soon be...USPS® Forced Meter Migration - How to Know if Your Postage Meter Will Soon be...
USPS® Forced Meter Migration - How to Know if Your Postage Meter Will Soon be...
 
ACC 2024 Chronicles. Cardiology. Exam.pdf
ACC 2024 Chronicles. Cardiology. Exam.pdfACC 2024 Chronicles. Cardiology. Exam.pdf
ACC 2024 Chronicles. Cardiology. Exam.pdf
 
YOUVE GOT EMAIL_FINALS_EL_DORADO_2024.pptx
YOUVE GOT EMAIL_FINALS_EL_DORADO_2024.pptxYOUVE GOT EMAIL_FINALS_EL_DORADO_2024.pptx
YOUVE GOT EMAIL_FINALS_EL_DORADO_2024.pptx
 
Transaction Management in Database Management System
Transaction Management in Database Management SystemTransaction Management in Database Management System
Transaction Management in Database Management System
 
What is Model Inheritance in Odoo 17 ERP
What is Model Inheritance in Odoo 17 ERPWhat is Model Inheritance in Odoo 17 ERP
What is Model Inheritance in Odoo 17 ERP
 
Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)
Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)
Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)
 
Model Call Girl in Tilak Nagar Delhi reach out to us at 🔝9953056974🔝
Model Call Girl in Tilak Nagar Delhi reach out to us at 🔝9953056974🔝Model Call Girl in Tilak Nagar Delhi reach out to us at 🔝9953056974🔝
Model Call Girl in Tilak Nagar Delhi reach out to us at 🔝9953056974🔝
 
INTRODUCTION TO CATHOLIC CHRISTOLOGY.pptx
INTRODUCTION TO CATHOLIC CHRISTOLOGY.pptxINTRODUCTION TO CATHOLIC CHRISTOLOGY.pptx
INTRODUCTION TO CATHOLIC CHRISTOLOGY.pptx
 

Measurement systems analysis v1.1

  • 1. Measurement Systems Analysis (MSA) © 2001 Six Sigma Academy 1
  • 2. Why Measure? • To understand a decision: • Meet standards & specifications • Detection/reaction oriented • Short-term results • Stimulate continuous improvement: • Where to improve? • How much to improve? • Is improvement cost effective? • Prevention oriented • Long-term strategy “If you cannot measure, you cannot improve!” – Taguchi © 2001 Six Sigma Academy 2
  • 3. Measurement System As A Process Material Method Machine Cleanliness Sequence Cleanliness Temperature Temperature Timing Dimension Design Positioning Weight Precision Corrosion Calibration Location Hardness Resolution Set-up Conductivity Stability Density Preparation Wear Compliance-procedure Fatigue Vibration Attention Calculation error Atmospheric pressure Interpretation Speed Lighting Coordination Knowledge-instrument Temperature Dexterity Vision Humidity Cleanliness Environment © 2001 Six Sigma Academy Measurement Error People 3
  • 4. What Is An MSA? Scientific and objective method of analyzing the validity of a measurement system • A “tool” which quantifies: 1. Equipment Variation 2. Appraiser (Operator) Variation 3. The Total Variation of a Measurement System • MSA is NOT just Calibration • MSA is NOT just Gage Repeatability & Reproducibility (R&R) Measurement System Analysis is often a “project within a project” © 2001 Six Sigma Academy 4
  • 5. MSA Relationship To DMAIC Define Measure Analyze Improve Control Measurement Systems Analysis • Quantitative evaluation of tools and processes used in making discrete or variable observations Define Measure Analyze Improve Control Measurement Systems Control • Established, documented, and continuously carried out • Ensures measurement system maintains an acceptable status • Often referred to as “Long Term Gage Plan” © 2001 Six Sigma Academy 5
  • 6. MSA - A Starting Point Before you… • Make adjustments • Implement solutions • Run an experiment • Perform a complex statistical analysis You should… • Validate your measurement systems • Validate data and data collection systems MSA quantifies a major source of process variation © 2001 Six Sigma Academy 6
  • 7. Measurement Systems • Examples • Precision gage • Data collection form • Survey • School entrance exam • Customer satisfaction • On-time delivery report What is your system ? © 2001 Six Sigma Academy 7
  • 8. Types of Measurement System Analysis • Operational Definitions • Walking the Process • Gage R&R • Variable Data • Attribute Data © 2001 Six Sigma Academy 8
  • 9. MSA – Operational Definitions The Measurement System can be validated using Operational Definitions constructed by the Project Team to ensure that all measurement takers completely understand what is expected during the data collection phase. © 2001 Six Sigma Academy 9
  • 10. Developing Operational Definition • Operational definitions are descriptions written in a way that ensures consistent interpretation by different people • The operational definition method of description will be used throughout the DMAIC process © 2001 Six Sigma Academy 10
  • 11. • Operational Definition • The technique of defining an item, process or characteristic using Operational Definitions is an effective way to communicate between Team Members and other people involved in the project. Because Operational Definitions are so effective, the technique is used in a number of locations within the DMAIC process. Remember, to be effective, an Operation Definition must be written in a way that ensures consistent interpretation by different people.CC © 2001 Six Sigma Academy 11
  • 12. General Example – Operational Definitions • Examples of Operational Definitions for data collection: • Record the date that the lease company written notification arrives in the dealership using an MM/DD/YY format. • List any cosmetic preparation in excess of the standard predelivery process required to render the vehicle acceptable for retail consumer sale. • Record the weight of each package of coffee in ounces by pouring the coffee into the filter and placing the filter and coffee on the scale tray. • Record the length of time that coffee remains in the urn by recording the actual time of day each time the Brew button is pressed to recharge the urn. Use 24-hour clock and round to the nearest minute. © 2001 Six Sigma Academy 12
  • 13. MSA – Walking the Process “Walking the Process” is a method of conducting MSA when it is not possible to perform a Gage R&R. © 2001 Six Sigma Academy 13
  • 14. How to “Walk the Process” • Develop Operational Definitions for each of the measures to be collected • Train data collectors prior to beginning the data collection activity • Follow the process from beginning to end and monitor the data collection activities to determine if data is being collected properly • Continue walking the process until the data compiled accurately reflects the existing process © 2001 Six Sigma Academy 14
  • 15. Components Of Measurement Error © 2001 Six Sigma Academy 15
  • 16. Components Of Measurement Error • Resolution/Discrimination • Accuracy (bias effects) • Linearity • Stability (consistency) • Repeatability – test-retest (Precision) • Reproducibility (Precision) Each component of measurement error can contribute to variation, causing wrong decisions to be made © 2001 Six Sigma Academy 16
  • 17. Categories Of Measurement Error Which Affect Location Accuracy/ Bias Linearity © 2001 Six Sigma Academy Stability 17
  • 18. Categories Of Measurement Error Which Affect Spread Precision Repeatability © 2001 Six Sigma Academy Reproducibility 18
  • 19. Resolution/Discrimination Resolution? Can change be detected? OK Accuracy/Bias? OK Linearity? OK Stability? OK Precision (R&R)? © 2001 Six Sigma Academy 19
  • 20. Resolution • Simplest measurement system problem • Poor resolution is a common issue • Impact is rarely recognized and/or addressed • Easily detected • No special studies are necessary • No “known standards” are needed © 2001 Six Sigma Academy 20
  • 21. Definitions: • Resolution/Discrimination • Capability to detect the smallest tolerable changes • Inadequate Measurement Units • Measurement units too large to detect variation present • Guideline: “10 Bucket Rule” • Increments in the measurement system should be one-tenth the product specification or process variation © 2001 Six Sigma Academy 21
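The “10 Bucket Rule” above lends itself to a quick automated check. Below is a minimal Python sketch, not part of the original course material and using illustrative names and numbers, that flags a gage whose measurement increment is too coarse relative to the reference width (tolerance or process variation).

    def resolution_adequate(increment, reference_width, buckets=10):
        """True if the gage increment divides the reference width
        (tolerance or process spread) into at least `buckets` steps."""
        return reference_width / increment >= buckets

    # Hypothetical example: a 0.01 mm gage against a 0.50 mm tolerance
    print(resolution_adequate(0.01, 0.50))  # True  (50 buckets)
    print(resolution_adequate(0.10, 0.50))  # False (only 5 buckets)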
  • 22. Resolution/Discrimination – the same process output is measured on two scales: with poor discrimination (coarse 1–5 divisions) it reads 1; with better discrimination (finer divisions on the same 1–5 scale) it reads 1.3. © 2001 Six Sigma Academy 22
  • 23. Resolution Actions • Measure to as many decimal places as possible • Use a device that can measure smaller units • Live with it, but document that the problem exists • Larger sample size may overcome problem • Priorities may need to involve other considerations: • Engineering tolerance • Process Capability • Cost and difficulty in replacing device © 2001 Six Sigma Academy 23
  • 24. Accuracy/Bias Resolution? OK Accuracy/Bias? Measurements are “shifted” from “true” value OK Linearity? OK Stability? OK Precision (R&R)? © 2001 Six Sigma Academy 24
  • 25. Accuracy/Bias Difference between the observed average value of measurements and the master value Master Value (Reference Standard) Master value is an accepted, traceable reference standard © 2001 Six Sigma Academy Average Value 25
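Because bias is simply the observed average minus the master value, it can be estimated directly from repeated readings of a reference standard. The Python sketch below is illustrative only (the readings and the 10.00 mm master value are invented); it reports the bias estimate together with its standard error so the shift can be judged against the gage's own noise.

    import math
    import statistics as stats

    def estimate_bias(measurements, master_value):
        """Bias = observed average minus the accepted (master) reference value."""
        mean = stats.mean(measurements)
        std_err = stats.stdev(measurements) / math.sqrt(len(measurements))
        return mean - master_value, std_err

    # Hypothetical repeated readings of a 10.00 mm reference standard
    readings = [10.02, 10.01, 10.03, 10.00, 10.02, 10.01]
    bias, se = estimate_bias(readings, 10.00)
    print(f"bias = {bias:+.3f}, standard error = {se:.3f}")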
  • 26. Accuracy/Bias – two dot plots of repeated readings illustrate the idea: one cluster labelled “More accurate” (centered near the master value) and one labelled “Less accurate” (shifted away from it). © 2001 Six Sigma Academy 26
  • 27. Accuracy/Bias Actions • Calibrate when needed/scheduled • Use operations instructions • Review specifications • Review software logic • Create Operational Definitions © 2001 Six Sigma Academy 27
  • 28. Linearity Resolution? OK Accuracy/Bias? OK Linearity? OK Measurement is not “true” and/or consistent across the range of the “gage” Stability? OK Precision (R&R)? © 2001 Six Sigma Academy 28
  • 29. Linearity – plot of Observed Average Value against Reference Value across the full range of the gage: bias is present at one end of the range and absent at the other. © 2001 Six Sigma Academy 29
  • 30. Linearity Actions • Use only in restricted range • Rebuild • Use with correction factor/table/curve • Sophisticated study required and will not be discussed in this course © 2001 Six Sigma Academy 30
  • 32. Stability • Measurements remain constant and predictable over time • For both mean and standard deviation Master Value (Reference Standard) • No drifts, sudden shifts, cycles, etc. • Evaluated using control charts Time 1 Time 2 © 2001 Six Sigma Academy 32
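Since the slide above says stability is evaluated with control charts, a simple way to monitor it is to measure the same reference standard in small subgroups over time and plot Xbar/R limits. The sketch below is a minimal illustration, not the course's procedure; the subgroup data are invented and the constants are the standard Shewhart values for subgroups of size 5.

    import statistics as stats

    A2, D3, D4 = 0.577, 0.0, 2.114  # Shewhart constants for subgroup size n = 5

    def xbar_r_limits(subgroups):
        """Control limits for repeated checks of one reference standard,
        where each subgroup is one check (e.g., one day's readings)."""
        xbars = [stats.mean(g) for g in subgroups]
        ranges = [max(g) - min(g) for g in subgroups]
        xbarbar, rbar = stats.mean(xbars), stats.mean(ranges)
        return {"xbar": (xbarbar - A2 * rbar, xbarbar, xbarbar + A2 * rbar),
                "range": (D3 * rbar, rbar, D4 * rbar)}

    # Hypothetical daily checks of the same master part (5 readings per day)
    daily = [[10.01, 10.02, 10.00, 10.01, 10.02],
             [10.00, 10.02, 10.01, 10.03, 10.01],
             [10.02, 10.01, 10.01, 10.00, 10.02]]
    print(xbar_r_limits(daily))

A drift, sudden shift, or cycle in the plotted averages or ranges relative to these limits is the signal the slide refers to.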
  • 33. Stability Actions • Change/adjust components • Establish “life” timeframe • Use control charts • Use/update current SOP © 2001 Six Sigma Academy 33
  • 35. Precision σ²total = σ²product/process + σ²repeatability + σ²reproducibility – illustrated by two distributions centered on the master value: A (good precision) and B (poor precision). Also known as Gage R&R © 2001 Six Sigma Academy 35
  • 36. Repeatability (A Component Of Precision) • Variation that occurs when repeated measurements are made of the same item under absolutely identical conditions • Same: • Operator • Set-up • Units • Environmental conditions • Short-term © 2001 Six Sigma Academy 36
  • 37. Reproducibility (A Component Of Precision) The variation that results when different conditions are used to make the measurements • Different: • Operators • Set-ups • Test units • Environmental conditions • Locations • Companies • Long-term © 2001 Six Sigma Academy 37
  • 38. R&R Actions Repeatability • Repair, replace, adjust equipment • SOP Reproducibility • Training • SOP © 2001 Six Sigma Academy 38
  • 39. Attribute Measurement Studies © 2001 Six Sigma Academy 39
  • 40. Purpose Of Attribute MSA • Assess standards against customers’ requirements • Determine if all appraisers use the same criteria • Quantify repeatability and reproducibility of operators • Identify how well measurement system conforms to a “known master” • Discover areas where: • Training is needed • Procedures are lacking • Standards are not defined © 2001 Six Sigma Academy 40
  • 41. Attribute MSA - Excel Method • Allows for R&R analysis within and between appraisers • Test for effectiveness against standard • Limited to nominal data at two levels © 2001 Six Sigma Academy 41
  • 42. Attribute MSA Example – Attribute Legend (used in computations): 1 = Pass, 2 = Fail. Open file MSA-Attribute.xls (DATE: 1/4/2001, NAME: Acme Employee, PRODUCT: Widgets, BUSINESS: Earth Products). The worksheet lists 30 samples, each with a known-population attribute (Pass/Fail) and the Pass/Fail calls recorded by Operator #1, Operator #2, and Operator #3 over two tries apiece. © 2001 Six Sigma Academy 42
  • 43. Scoring Example % APPRAISER SCORE - > 100.00% 78.57% 100.00% % SCORE VS. ATTRIBUTE - > 78.57% 64.29% 71.43% SCREEN % EFFECTIVE SCORE - > 57.14% SCREEN % EFFECTIVE SCORE vs. ATTRIBUTE - > 42.86% • 100% is target for all scores • <100% indicates training required • % Appraiser score = repeatability • Screen % Effectiveness Score = reproducibility • % Score vs. Attribute • individual error against a known population • Screen % Effective vs. Attribute • Total error against a known population © 2001 Six Sigma Academy 43
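The four scores in the Excel report boil down to counting agreement within each appraiser, between appraisers, and against the known population. The Python sketch below, which is not the course's Excel workbook and uses a tiny invented data set, mirrors those definitions so the arithmetic is explicit.

    def attribute_scores(known, appraisals):
        """known: list of reference results, one per sample.
        appraisals: {appraiser: list of per-sample trial lists}.
        Returns per-appraiser repeatability, per-appraiser score vs. the
        attribute, screen effectiveness, and screen effectiveness vs. attribute."""
        n = len(known)
        self_agree, std_agree = {}, {}
        for name, trials in appraisals.items():
            self_agree[name] = sum(len(set(t)) == 1 for t in trials) / n
            std_agree[name] = sum(len(set(t)) == 1 and t[0] == k
                                  for t, k in zip(trials, known)) / n
        everyone = list(appraisals.values())
        screen = sum(len({v for t in everyone for v in t[i]}) == 1
                     for i in range(n)) / n
        screen_vs_std = sum({v for t in everyone for v in t[i]} == {known[i]}
                            for i in range(n)) / n
        return self_agree, std_agree, screen, screen_vs_std

    # Tiny hypothetical example: 4 samples, 2 appraisers, 2 trials each
    known = ["Pass", "Fail", "Pass", "Pass"]
    appraisals = {"A": [["Pass", "Pass"], ["Fail", "Fail"], ["Pass", "Fail"], ["Pass", "Pass"]],
                  "B": [["Pass", "Pass"], ["Fail", "Pass"], ["Pass", "Pass"], ["Pass", "Pass"]]}
    print(attribute_scores(known, appraisals))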
  • 44. Statistical Report © 2001 Six Sigma Academy 44
  • 45. Statistical Report © 2001 Six Sigma Academy 45
  • 46. Statistical Report Continued © 2001 Six Sigma Academy 46
  • 47. Attribute MSA – MINITAB™ Method • Allows for R&R analysis within and between appraisers • Tests for effectiveness against standard • Allows nominal data with two levels • Allows for ordinal data with more than two levels © 2001 Six Sigma Academy 47
  • 48. MINITAB Method - Data Entry • Same data as Excel example • Arranged in multiple columns • Data can also be stacked in single column © 2001 Six Sigma Academy 48
  • 49. Attribute Study - MINITAB Analysis Attribute MSA.mpj Attribute MSA.MPJ Tool Bar Menu > Stat > Quality Tools > Attribute Gage R&R Study © 2001 Six Sigma Academy 49
  • 50. Attribute Study - MINITAB Analysis Continued 1. Select “Single Column” if data is stacked 1. Select “Multiple Columns” if data is un-stacked 2. Enter number of appraisers and trials 3. Enter name of column with “Known” © 2001 Six Sigma Academy 4. Select OK 50
  • 51. Attribute MSA - MINITAB Graphical Output – Assessment Agreement plot (Date of study: 1/03/2001, Reported by: Jose, Name of product: XYZ Report). Two panels with 95.0% confidence intervals for appraisers Bob, Sue, and Tom: “Within Appraiser” (lower vs. higher variation within appraiser) and “Appraiser vs Standard” (lower vs. higher variation of each appraiser against the standard). The second panel is not included if no “Known” standard is supplied. © 2001 Six Sigma Academy 51
  • 52. Attribute MSA – MINITAB Session Window Results
  Each Appraiser vs. Standard (individual vs. standard) – Assessment Agreement:
  Appraiser  # Inspected  # Matched  Percent (%)  95.0% CI
  Bob        30           28         93.3         (77.9, 99.2)
  Sue        30           29         96.7         (82.8, 99.9)
  Tom        30           24         80.0         (61.4, 92.3)
  # Matched: Appraiser’s assessment across trials agrees with standard.
  Assessment Disagreement (repeatability):
  Appraiser  # Pass/Fail  Percent (%)  # Fail/Pass  Percent (%)  # Mixed  Percent (%)
  Bob        1            3.3          1            3.3          0        0.0
  Sue        1            3.3          0            0.0          0        0.0
  Tom        1            3.3          0            0.0          5        16.7
  # Pass/Fail: Assessments across trials = Pass / standard = Fail. # Fail/Pass: Assessments across trials = Fail / standard = Pass. # Mixed: Assessments across trials are not identical.
  Between Appraisers (reproducibility) – Assessment Agreement: # Inspected 30, # Matched 24, Percent 80.0, 95.0% CI (61.4, 92.3). # Matched: All appraisers’ assessments agree with each other.
  All Appraisers vs. Standard (total agreement against the known) – Assessment Agreement: # Inspected 30, # Matched 23, Percent 76.7, 95.0% CI (57.7, 90.1). # Matched: All appraisers’ assessments agree with standard.
  © 2001 Six Sigma Academy 52
  • 53. MINITAB Method - Ordinal Data Entry Ordinal MSA.mtw • Survey data rated on a 1 to 5 scale • Arranged in multiple columns © 2001 Six Sigma Academy Minitab Worksheet 53
  • 54. Attribute Study - Ordinal Select “categories of the attribute data are ordered” Analysis is same as 2 level data © 2001 Six Sigma Academy 54
  • 55. Industrial Attribute MSA Exercise • Evaluate samples supplied by instructor • Determine the screen and appraiser scores • Interpret the results • Recommend actions (files: attributecircles.MPJ, iGrafx Professional Document) © 2001 Six Sigma Academy 55
  • 56. Variables Measurement Studies © 2001 Six Sigma Academy 56
  • 57. Six Step Variables MSA 1. Conduct initial gage calibration (or verification) 2. Perform trials and data collection 3. Obtain statistics via MINITAB 4. Analyze, interpret results 5. Check for inadequate measurement units 6. On-going evaluation • What would be your long-term gage plan? © 2001 Six Sigma Academy 57
  • 58. Trials And Data Collection • Generally two to three operators • Generally 5-10 process outputs to measure • Each process output is measured 2-3 times (replicated) by each operator – illustrated by a tree in which Oper1, Oper2, and Oper3 each measure process outputs P1…P5 on trials 1, 2, 3. Randomization is Critical © 2001 Six Sigma Academy 58
  • 59. Randomization, Repeats, Replicates Randomization • Runs are made in an arbitrary vs. patterned order • Average out effects of noise or unknown factors • Tradeoff - Invalid results versus slight inconvenience (if any) Repeats • Running more than one sample of a single run • Results are averaged Replication • Running entire experiment in a time sequence • MSA allows for repeatability study © 2001 Six Sigma Academy 59
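Because randomization of the run order is called critical above, it is worth generating the plan programmatically rather than by hand. The sketch below is one reasonable way to do it in Python (it is not from the original deck): within each replicate every operator measures every part once, with the part order re-shuffled for each operator.

    import random

    def randomized_run_order(operators, parts, replicates, seed=None):
        """Return (replicate, operator, part) tuples in measurement order."""
        rng = random.Random(seed)
        plan = []
        for rep in range(1, replicates + 1):
            for op in operators:
                order = list(parts)
                rng.shuffle(order)
                plan.extend((rep, op, part) for part in order)
        return plan

    # 3 operators, 10 process outputs, 2 replicates (as in the MINITAB example)
    for run in randomized_run_order(["Op1", "Op2", "Op3"], range(1, 11), 2, seed=1)[:5]:
        print(run)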
  • 60. Variables MSA - MINITAB Example – Variable MSA.mtw, USL=1.5, LSL=0.5; Replicate 1 and Replicate 2 shown in randomized order. © 2001 Six Sigma Academy 60
  • 61. MSA Using MINITAB – 10 Process Outputs, 3 Operators, 2 Replicates; USL=1.5, LSL=0.5; Replicate 1 and Replicate 2 (Randomized order) • Have Operator 1 measure all samples once (as shown in the outlined block) • Then, have Operator 2 measure all samples once • Continue until all operators have measured samples once (this is Replicate 1) • Repeat these steps for the required number of Replicates • Enter data into MINITAB in 3 columns as shown © 2001 Six Sigma Academy 61
  • 62. Manipulate The Data Your data in MINITAB should initially look like this. You will need to STACK your data so that all like data is in one column only Use the commands > Manip > Stack > Stack Blocks of Columns (Stack all Process Outputs, Operators, and Responses so that they are in one column only) Now you are ready to run the macro for data analysis © 2001 Six Sigma Academy 62
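For readers working outside MINITAB, the same stacking step (wide layout to one observation per row) can be done in Python with pandas. The column names below are hypothetical, invented for illustration; they are not the columns of the course data file.

    import pandas as pd

    # Hypothetical wide layout: one row per part, one column per operator/replicate
    wide = pd.DataFrame({
        "Part": [1, 2, 3],
        "Op1_Rep1": [0.65, 0.95, 0.80], "Op1_Rep2": [0.66, 0.97, 0.79],
        "Op2_Rep1": [0.64, 0.94, 0.81], "Op2_Rep2": [0.67, 0.96, 0.80],
    })

    # Stack so every measurement is one row: Part, Operator, Replicate, Response
    long = wide.melt(id_vars="Part", var_name="OpRep", value_name="Response")
    long[["Operator", "Replicate"]] = long["OpRep"].str.split("_", expand=True)
    long = long.drop(columns="OpRep")
    print(long.head())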
  • 63. Stacked And Ready For Analysis Note: c10, c11, c12 are the columns in which the respective data are found IN OUR EXAMPLE. You must have ALL data STACKED in these columns Enter titles © 2001 Six Sigma Academy 63
  • 64. Prepare The Analysis Use the commands > Stat > Quality Tools > Gage R&R Study (Crossed) Each process output measured by each operator OR > Gage R&R Study (Nested) For “destructive tests” where each process output is measured uniquely by each operator © 2001 Six Sigma Academy 64
  • 65. Choose Method Of Analysis Enter Gage Info and Options ANOVA method is preferred • Gives more information © 2001 Six Sigma Academy 65
  • 66. Adding Tolerance (Optional) – Tolerance = Upper Specification Limit (USL) minus Lower Specification Limit (LSL). For this example: USL = 1.0, LSL = 0.5, so USL - LSL = 0.50. © 2001 Six Sigma Academy 66
  • 67. MSA Output: Session Window and Graphs – Gage R&R (ANOVA) for Response. What does all this mean?
  Two-Way ANOVA Table With Interaction:
  Source         DF  SS       MS        F        P
  Part            9  2.05871  0.228745  39.7178  0.00000
  Operator        2  0.04800  0.024000   4.1672  0.03256
  Operator*Part  18  0.10367  0.005759   4.4588  0.00016
  Repeatability  30  0.03875  0.001292
  Total          59  2.24912
  Gage R&R variance components:
  Source            VarComp   %Contribution (of VarComp)
  Total Gage R&R    0.004437   10.67
    Repeatability   0.001292    3.10
    Reproducibility 0.003146    7.56
      Operator      0.000912    2.19
      Operator*Part 0.002234    5.37
  Part-To-Part      0.037164   89.33
  Total Variation   0.041602  100.00
  Source            StdDev (SD)  Study Var (5.15*SD)  %Study Var (%SV)  %Tolerance (SV/Toler)
  Total Gage R&R    0.066615     0.34306               32.66             68.61
    Repeatability   0.035940     0.18509               17.62             37.02
    Reproducibility 0.056088     0.28885               27.50             57.77
      Operator      0.030200     0.15553               14.81             31.11
      Operator*Part 0.047263     0.24340               23.17             48.68
  Part-To-Part      0.192781     0.99282               94.52            198.56
  Total Variation   0.203965     1.05042              100.00            210.08
  Number of Distinct Categories = 4
  The slide also shows the six accompanying graphs: Components of Variation, R Chart by Operator (UCL=0.1252, R=0.03833, LCL=0), Xbar Chart by Operator (UCL=0.8796, Mean=0.8075, LCL=0.7354), By Part, By Operator, and Operator*Part Interaction.
  © 2001 Six Sigma Academy 67
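The variance components in the table above follow mechanically from the ANOVA mean squares via the standard expected-mean-square relations for a crossed study. The Python sketch below is not MINITAB output but a hand calculation under the usual crossed-design assumptions; it reproduces the slide's VarComp and %Contribution figures from the four mean squares (10 parts, 3 operators, 2 replicates).

    # Mean squares from the two-way ANOVA table on this slide
    MS_part, MS_oper, MS_op_part, MS_repeat = 0.228745, 0.024000, 0.005759, 0.001292
    p, o, r = 10, 3, 2  # parts, operators, replicates

    var_repeat  = MS_repeat
    var_op_part = max((MS_op_part - MS_repeat) / r, 0.0)
    var_oper    = max((MS_oper - MS_op_part) / (p * r), 0.0)
    var_part    = max((MS_part - MS_op_part) / (o * r), 0.0)

    var_reprod = var_oper + var_op_part
    var_grr    = var_repeat + var_reprod
    var_total  = var_grr + var_part

    rows = [("Total Gage R&R", var_grr), ("  Repeatability", var_repeat),
            ("  Reproducibility", var_reprod), ("Part-To-Part", var_part),
            ("Total Variation", var_total)]
    for name, v in rows:
        print(f"{name:18s} VarComp={v:.6f}  %Contribution={100 * v / var_total:6.2f}")
    # Matches the slide: 10.67, 3.10, 7.56, 89.33, 100.00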
  • 68. Graphical Output - 6 Graphs In All – Gage R&R (ANOVA) for Response. The six panels are Components of Variation (%Contribution, %Study Var, %Tolerance bars for Gage R&R, Repeat, Reprod, Part-to-Part), By Part, R Chart by Operator (UCL=0.1252, R=0.03833, LCL=0), Xbar Chart by Operator (UCL=0.8796, Mean=0.8075, LCL=0.7354), By Operator, and Operator*Part Interaction; the slide labels them as the “MSA Health Side” and the “MSA Troubleshoot Side.” If only 1 operator, you won’t get the operator-based graphs; if a nested study, you won’t get the interaction graph. © 2001 Six Sigma Academy 68
  • 69. Destructive Test – Gage R&R (Nested) for Response. Panels: Components of Variation (%Contribution, %Study Var for Gage R&R, Repeat, Reprod, Part, Operator, Part-to-Part), By Part (Operator) with parts 1–15 nested under operators Billie, Nathan, and Steve, By Operator, R Chart by Operator (UCL=4.290, R=1.313, LCL=0), and Xbar Chart by Operator (UCL=17.62, Mean=15.15, LCL=12.68). Operator by process output interaction is not applicable. © 2001 Six Sigma Academy 69
  • 70. Graphical Output Metrics Chart Output • Xbar Chart: Shows sampled process output variety • Reproducibility/bias/sensitivity • R Chart: Helps identify unusual measurements • Resolution/repeatability • Bar Chart: Distinguishes R&R from Process Output to Process Output • Components of variation These are your leading graphical indicators © 2001 Six Sigma Academy 70
  • 71. Bar Charts For Components – Gage R&R (ANOVA) for Response. Two Components of Variation bar charts (%Contribution and %Study Var for Gage R&R, Repeat, Reprod, Part-to-Part) are contrasted: one labeled “Needs help” and one labeled “Much better.” The bar chart answers: “Where is the variation?” © 2001 Six Sigma Academy 71
  • 72. Closer Look At The Xbar & R Charts R Chart: Exposes gage Repeatability, resolution & stability Xbar Chart: Test of sensitivity, bias, & population variety Xbar: at least 50% outside limits; R chart: in control © 2001 Six Sigma Academy 72
  • 73. More R Chart Indicators – two R charts by operator illustrate warning signs: one where R-bar looks too small (R=4.33E-04, UCL=0.001416, LCL=0) and one showing plateaus in the ranges (R=0.03833, UCL=0.1252, LCL=0). Both may indicate poor gage resolution. © 2001 Six Sigma Academy 73
  • 74. Tabular Output Metrics %Contribution %Study %Tolerance Number of Distinct Categories © 2001 Six Sigma Academy 74
  • 75. % Contribution – % Contribution = (σ²R&R / σ²TOTAL) * 100 • Measurement System Variation (R&R) as a percentage of Total Observed Process Variation • Includes both repeatability and reproducibility • Guideline values: 9% / 1% © 2001 Six Sigma Academy 75
  • 76. % Study Variation – % Study Variation = (σR&R / σTOTAL) * 100 • Looks at standard deviations instead of variance • Measurement System Standard Deviation (R&R) as a percentage of Total Observed Process Standard Deviation • Includes both repeatability and reproducibility • Guideline values: 30% / 10% © 2001 Six Sigma Academy 76
  • 77. % Tolerance (Precision to Tolerance, P/T) – % Tolerance = (5.15 * σR&R / Tolerance) * 100 • Measurement error as a percent of tolerance • Includes both repeatability and reproducibility • 5.15 standard deviations span 99% of the study variation • Acceptance Criteria guideline: 30% / 10% © 2001 Six Sigma Academy 77
  • 78. Distinct Categories – Number of Distinct Categories = √(2 * σ²Process Output / σ²R&R) = 1.41 * (σProcess Output / σR&R) • Number of divisions that the Measurement System can accurately measure across the process variation • How well a measurement process can detect process output variation, process shifts, and improvement • Less than 5 indicates Attribute conditions • Guideline values: 5 / 10 © 2001 Six Sigma Academy 78
  • 79. Acceptability Summary – Tabular Method
  Indicator:          % Contribution | % Study Variation (Process Control) | % Tolerance (Product Control) | Number of Distinct Categories
  Barely acceptable:  9%             | 30%                                 | 30%                           | 5
  Desirable:          1%             | 10%                                 | 10%                           | 10
  Desirable to Have All 4 Indicators Say “Go” © 2001 Six Sigma Academy 79
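Given the gage and total standard deviations plus the tolerance, all four tabular indicators can be computed in a few lines. The Python sketch below is illustrative rather than official course material; it uses the 5.15 multiplier and the 1.41 distinct-categories factor quoted elsewhere in this module, and plugs in the numbers from the ANOVA example so the results can be checked against the session window.

    import math

    def gage_metrics(sigma_rr, sigma_total, tolerance, k=5.15):
        """The four acceptability indicators from the table above."""
        sigma_part = math.sqrt(sigma_total ** 2 - sigma_rr ** 2)
        return {
            "%Contribution":      100 * sigma_rr ** 2 / sigma_total ** 2,
            "%StudyVariation":    100 * sigma_rr / sigma_total,
            "%Tolerance":         100 * k * sigma_rr / tolerance,
            "DistinctCategories": int(1.41 * sigma_part / sigma_rr),
        }

    # Values from the ANOVA example earlier in this module (tolerance = 0.5)
    print(gage_metrics(sigma_rr=0.066615, sigma_total=0.203965, tolerance=0.5))
    # Roughly: 10.7% contribution, 32.7% study, 68.6% tolerance, 4 categories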
  • 80. Keys To Successful MSA • Define and validate measurement process • Identify known elements of the measurement process (operators, gages, SOP, setup, etc.) • Clarify purpose and strategy for evaluation • Set acceptance criteria • Implement preventive/corrective action procedures • Establish on-going assessment criteria and schedules © 2001 Six Sigma Academy 80
  • 81. Gage R&R - Which % Gage R&R Do I Use? Depending on how variable your process is as compared to tolerance, your % Gage R&R values as a percent of Study variation, Tolerance and Process Variation will be quite different. For example: Consider a very stable process with low variability. Percent Tolerance will indicate that your gauge is very good (low % GRR) with high discrimination. On the other hand, when compared to process variation, the GRR will be poor (High % GRR). As your process improves, you will need to move to more precise gauges if you wish to “see” decreases in variation due to the measuring system. On the other hand, if you truly only want to be able to tell when production is becoming less capable, then you are only interested in the precision of the gauge as it relates to your customer’s specification. See the Appendix at the end of this module for further examples © 2001 Six Sigma Academy 81
  • 82. Gage R&R, Graphical Output (Gage name: Gage #020371, Date of study: 01/01/1998, Reported by: Six Sigma BB, Tolerance: 1.5 mm, Misc: Buffalo, NY Plant) – the Operator*Part Interaction, By Operator, and By Part panels for operators 1–3 and parts 1–10. • Operator * Part Interaction: • Shows if any given part(s) was hard to manage for any given operator(s) • Appears as though at least two of the operators had trouble measuring part #10 • What would the ideal graph look like? • By Operator: • Shows if any operator(s) had higher or lower readings (on average) than the others • What would the ideal graph look like? • By Part: • Shows the ability of all of our operators to obtain the same readings for each part • Also shows the ability of our measurement system to distinguish between parts (amount of overlap) • What would the ideal graph look like? © 2001 Six Sigma Academy 82
  • 83. Gage R&R, Xbar & R (Gage #020371, 01/01/1998, Six Sigma BB, Tolerance: 1.5 mm, Buffalo, NY Plant): • How do we evaluate the X-bar & R-chart? • Why are the data points out of control on the X-bar and R chart? Xbar Chart by Operator (operators 1–3): 3.0SL=0.8796, X=0.8075, -3.0SL=0.7354. R Chart by Operator: 3.0SL=0.1252, R=0.03833, -3.0SL=0.000. © 2001 Six Sigma Academy 83
  • 84. Minitab, Gage Run Chart: • Generates a run chart of measurements by operator and part id • Allows us to visualize repeatability and reproducibility within and between operator and part • The center line is the overall average of the parts • STAT > Quality Tools > Gage Run Chart – the example shows the run chart of Measure by Part (Part Num 1–10) and Operator (1–3). © 2001 Six Sigma Academy 84
  • 85. P/T Ratio Effect on Capability – chart of Actual Cp (0.0–6.0) against Observed Cp (0.5–2.0), with one curve per P/T Ratio (0%, 10%, 20%, 30%, 40%, 50%, 60%, 70%): the larger the P/T ratio, the more the observed Cp understates the actual Cp. © 2001 Six Sigma Academy 85
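The curve family on this slide can be reproduced from the relationship that observed variation is the sum of the process and measurement variances. The Python sketch below illustrates that relationship; it is not the chart's source data, and it treats the tolerance as 1 so everything is scale free.

    import math

    def observed_cp(actual_cp, pt_ratio, k=5.15):
        """Observed Cp once measurement error (given as a P/T ratio) is added."""
        sigma_process = 1.0 / (6.0 * actual_cp)   # from Cp = Tol / (6 * sigma), Tol = 1
        sigma_ms = pt_ratio / k                   # from P/T = k * sigma_ms / Tol
        sigma_obs = math.sqrt(sigma_process ** 2 + sigma_ms ** 2)
        return 1.0 / (6.0 * sigma_obs)

    for pt in (0.10, 0.30, 0.50, 0.70):
        print(f"P/T = {pt:.0%}: actual Cp 2.0 looks like Cp {observed_cp(2.0, pt):.2f}")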
  • 86. % R&R Vs. Capability – Which Might Need The Most Attention, the Measurement System or Process Capability?
  Process  %R&R  Obs. Cp  Decision
  1        10%   0.5      ?
  2        60%   1.4      ?
  3        60%   0.5      ?
  4        70%   6.5      ?
  © 2001 Six Sigma Academy 86
  • 87. % R&R Vs. Capability – Which Might Need The Most Attention, the Measurement System or Process Capability?
  Process  %R&R  Obs. Cp  Decision
  1        10%   0.5      Capability
  2        60%   1.4      Measurement
  3        60%   0.5      Maybe Both
  4        70%   6.5      Measurement*
  *Note: Process Step 4 – would improving %R&R really be worth the effort? © 2001 Six Sigma Academy 87
  • 88. Handling Poor Gage Capability: • If a dominant source of variation is repeatability (equipment), you need to replace, repair, or otherwise adjust the equipment. • If, in consultation with the equipment vendor or upon searches of industry literature, you find that the gage technology that you are using is “state-of-the-art” and it is performing to its specifications, you should still fix the gage. One temporary solution to this problem is to use signal averaging (see next page). • If a dominant source of variation is operator (reproducibility), you must address this via training and definition of the standard operating procedure. You should look for differences between operators to give you some indication as to whether it is a training, skill, and/or procedure problem. • Evaluate the specifications. Are they reasonable? • If the gage capability is marginal (as high as 30% of study variation) and the process is operating at a high capability (Ppk greater than 2), then the gage is probably not hindering you and you can continue to use it. © 2001 Six Sigma Academy 88
  • 89. Controlling Repeatability: • Note: If you want to decrease your gage error, take advantage of the standard error, which shrinks with the square root of the sample size. • The signal averaging technique uses the factor 1/√n • n = the number of repeat measures taken on the same part • the measurement = the average of “n” readings • Example: a gage error of 50% can be cut in half if your point estimate is an average of 4 repeat measurements, since 1/√4 = 1/2 • This technique should be used as a short-term approach to perform a study, but you must fix the gage. The slide contrasts the wide Distribution of Individuals with the narrower Distribution of Means. © 2001 Six Sigma Academy 89
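The 1/√n shrinkage is worth stating as a one-liner so the trade-off between extra readings and remaining gage error is explicit. This tiny Python sketch simply restates the slide's arithmetic; it is not a substitute for fixing the gage.

    import math

    def averaged_gage_error(gage_error, n):
        """Effective gage error when each reported value averages n repeat readings."""
        return gage_error / math.sqrt(n)

    print(averaged_gage_error(50.0, 4))   # 25.0: a 50% gage error is cut in half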
  • 90. Other Statistical Indexes – The Signal-to-Noise Ratio (S/N Ratio) relates the product variation to the measurement system variation and should be as large as possible: S/N Ratio = σP / σMS. The Discrimination Index provides the number of divisions that the Measurement System can accurately measure across the part (sample) variation. If this index is less than 4, then it is inadequate to provide data for a study. If the index is 4, then it is equivalent to a go/no-go gage. We would like to see a value of 5 or greater: Discrim = (σp / σms) * 1.41 © 2001 Six Sigma Academy 90
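Both indexes are simple ratios of the product and measurement-system standard deviations, so they can be computed directly from a Gage R&R output. The Python lines below are a sketch, not course material, using the part-to-part and total gage R&R standard deviations from the earlier ANOVA example.

    def signal_to_noise(sigma_product, sigma_ms):
        return sigma_product / sigma_ms

    def discrimination_index(sigma_product, sigma_ms):
        return 1.41 * sigma_product / sigma_ms

    # Standard deviations from the ANOVA example earlier in this module
    print(signal_to_noise(0.192781, 0.066615))       # about 2.9
    print(discrimination_index(0.192781, 0.066615))  # about 4.1 -> marginal gage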
  • 91. Effects of P/T and S/N Ratios • The effect of P/T on Cpk • Large P/T reduces the process Cpk from the true value to some smaller observed value. • The effect of P/T on part assessment • Large P/T increases the probability that we will misclassify product as defective when it’s really good and vice versa. • The effect of S/N ratio on control chart sensitivity • small S/N increases the time before an out-of-control process is detected by a control chart (refer to X-bar & range) • The Effect of the Discrimination Index • If the Index = 2, only attribute data is available and sample sizes must be larger. • If the Index is 5 to 10, then discrimination is finer and sample sizes can be smaller. © 2001 Six Sigma Academy 91
  • 92. Calibration Steps • Determine if the measurement system needs to be recalibrated • Determine the minimum number of measurements needed to make this decision • Take data and make decision • If yes, recalibrate system • Why don’t we just recalibrate? • Normal variation causes the measurement to be slightly different each time it is used • Recalibration should be done only when the measurements are off by more than the normal variation • Recalibrating a system when it is not needed can increase the variability in the measurements © 2001 Six Sigma Academy 92
  • 93. Appendix © 2001 Six Sigma Academy 93
  • 94. Interpreting Variables GR&R Results – Presented on the following slides are four Variable Gage R&R results - % Study (P/TV - Precision to Total Variation) and % Tolerance (P/T - Precision to Tolerance) - along with a representative graphical illustration to help visualize the results and any required action to improve the Measurement System. Also discussed is the effect of the GR&R on Cp. – There are an infinite number of GR&R results (combinations of % Study and % Tolerance); use these four relatively extreme scenarios to help you determine what actions you need to take given your own results. Remember, we are looking for GR&R results of < 10%, although anything less than 30% is considered barely acceptable (proceed with caution). – These graphs are not drawn to scale; therefore, when reviewing this information do not compare the relative size of the histograms between the scenarios, rather, compare the histograms within each scenario to the Spec Limits. Actual data was not used to create these histograms. – These examples assume 10 parts were selected that represent the long-term capability of the process being investigated, with three operators and 2 trials. – No assumptions have been made as to the problem with the Measurement System. – Actual data was not used to calculate the Cp indices. They were visually estimated, but are assumed reasonable. © 2001 Six Sigma Academy 94
  • 95. Scenario #1 – 15% % Study, 15% % Tolerance. The illustration shows the LSL and USL, the Tolerance, the Observed (Total Variation) histogram, the Part Contribution (Part Variation) histogram, and the Gage Contribution (Precision). In this example we observe a GR&R result that is acceptable, where the % Study Variation is the same as the % Tolerance Variation. The results are the same because the relative size of the Total Variation - TV (5.15*σTotal) and the Tolerance - T (USL - LSL) are the same. Therefore, when we take the P/TV or P/T ratio, where P is the Precision of the Gage (5.15*σMS), it is well below 30%. This gage is deemed acceptable; no action is required. The only action is to improve the Process Capability. Furthermore, the observed Cp of this process is probably close to 1, as it appears 6 standard deviations of the process can fit inside the tolerance once. Finally, as a result of the acceptable GR&R values, the observed Cp (what we measure) is considered to be the actual Cp. © 2001 Six Sigma Academy 95
  • 96. Scenario #2 – 70% % Study, 70% % Tolerance. The illustration again shows the Tolerance, Observed (Total Variation), Part Contribution (Part Variation), and Gage Contribution (Precision) histograms. In this example we observe a GR&R where the % Study Variation is the same as the % Tolerance Variation, however the results are extremely unacceptable. The results are the same because the relative size of the Total Variation - TV (5.15*σTotal) and the Tolerance - T (USL - LSL) are the same. Therefore, when we take the P/TV or P/T ratio, where P is the gage contribution (5.15*σMS), it is very much above 30%, indicating the Measurement System can not effectively discern part-to-part differences. The impact of a poor GR&R result is to inflate the variability of the product standard deviation. In this example we absolutely need to fix the Measurement System! Finally, the observed Cp of this process (using this poor gage) is probably close to 0.5, as it appears that only half of the 6 standard deviations of the process can fit inside the tolerance. The actual Cp is probably much higher, maybe closer to 1 or 1.5. If the measurement system were improved and deemed acceptable, the observed Cp would reflect the actual Cp. © 2001 Six Sigma Academy 96
  • 97. Scenario #3 – 70% % Study, 5% % Tolerance. Here we observe a GR&R where the % Study Variation is extremely unacceptable and the % Tolerance Variation is very acceptable. How can this be? In this example the Gage Precision - P (5.15*σMS) compared to the Total Variation - TV (5.15*σTotal), P/TV, is quite large - 70%. However, when we compare the Gage Precision with the Tolerance (USL - LSL), P/T, we observe a very acceptable GR&R - 5%. Do we need to fix our Measurement System? Well, that depends: if we are still looking for process improvement then we should fix the measurement system. If, however, we do not need to improve the process capability, then our measurement system is acceptable. In this example our observed Cp is probably close to 2 (99.73% of our process variability can fit into our customer tolerance), whereas the actual Cp may be significantly higher. If for some reason the PV began to increase to the size of the Tolerance, then we would observe our gage as acceptable. © 2001 Six Sigma Academy 97
  • 98. Scenario #4 – 5% % Study, 70% % Tolerance. Here we observe a GR&R where the % Study Variation is acceptable and the % Tolerance Variation is very unacceptable. How can this be? In this example the Gage Precision - P (5.15*σMS) compared to the Total Variation - TV (5.15*σTotal), P/TV, is very small - 5%. However, when we compare the Gage Precision with the Tolerance (USL - LSL), P/T, we observe a very large GR&R - 70%. Do we need to fix our Measurement System? Yes, we need to fix the measurement system. In this example, the observed Cp will be the actual Cp, and it is probably about 0.2 to 0.4. However, as we work our Six Sigma project and reduce the variability of our KPOV to improve our Process Capability, our % Study Variation will become worse (% Tolerance will remain constant). When our Process Variation is the same size as the Tolerance, both GR&R’s will be 70% and our observed Cp will not reflect the actual. Therefore, improvement of the measurement system is required. © 2001 Six Sigma Academy 98
