Item 2. Verification and Validation of Analytical Methods
1. Verification and Validation of Methods
Mr Graham Lancaster
Manager/Director
Environmental Analysis Laboratory
Southern Cross University, Lismore, NSW
2. • Verification is confirming that the documented performance of an
unchanged standard procedure can be achieved.
• Validation is demonstrating the same performance where
modifications to the standard method, or an alternative method,
have been implemented.
• Only verified/validated test methods should be approved
for use in the laboratory.
Verification/ Validation of methods
3. • Test methods selected by the laboratory should be
those that have been published internationally or
nationally, and are deemed acceptable to the
relevant technical bodies, whenever such suitable
methods are available (e.g. Standard Methods).
• Non-standard methods should be based on
published methods, and modified only to improve
laboratory practices (e.g. safe sample digestion) or
efficacy of the method.
Selection of methods
5. • Published and standard test methods need to be verified
to ensure that the documented performance can be met.
• Validation and verification of test methods should follow
‘NATA General Accreditation Guidance – Validation and
verification of quantitative and qualitative test methods’.
• The accuracy of the method should be determined using
suitable certified reference materials.
Validation/ Verification of methods
6. • Method verification studies are typically less extensive
than those required for method validation.
• If methods have been previously validated via
collaborative studies (e.g. ISO, Australian Standards, or published
in the 'Green Book') then reduced verification is typically
required.
• Validation is always a balance between costs, risks and
technical possibilities.
Validation/ Verification of methods
7. • The extent of validation required will depend on the status
of the method under consideration and the needs relating
to its intended application.
• Verification under conditions of use is demonstrated by
meeting system suitability specifications based on the following:
• blanks, or un-inoculated media (e.g. in microbiology),
to assess contamination;
• laboratory control samples (e.g. spiked samples for
chemistry or positive culture controls for microbiology)
to assess accuracy;
HOW- Validation/ Verification
8. • duplicates to assess precision;
• calibration check standards analysed periodically in
the analytical batch for quantitative analyses;
• monitoring quality control samples, usually through
the use of control charts; and
• interlaboratory participation (e.g. ASPAC) in a
proficiency testing program, provided that the tested
material is representative of the method in terms of
matrix, analytical parameters, concentration level(s),
etc.
HOW- Continued…
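As a sketch of the duplicate and calibration-check criteria above: the helpers below compute a relative percent difference (RPD) for a duplicate pair and a recovery check for a periodic calibration check standard. The 10% calibration tolerance is an illustrative assumption, not a value from these slides.

```python
def relative_percent_difference(a, b):
    """RPD between a sample and its duplicate, as a percentage."""
    return abs(a - b) / ((a + b) / 2) * 100

def calibration_check_ok(measured, expected, tolerance_pct=10.0):
    """True if a periodic calibration check standard recovers
    within +/- tolerance_pct of 100% (tolerance is illustrative)."""
    recovery = measured / expected * 100
    return abs(recovery - 100.0) <= tolerance_pct

# Example batch checks with illustrative concentrations (mg/L)
rpd = relative_percent_difference(0.105, 0.095)
print(round(rpd, 1))                       # 10.0
print(calibration_check_ok(0.098, 0.100))  # True
```

Each laboratory sets its own acceptance limits for these checks; the functions only encode the comparison, not any particular criterion.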
12. Validation of Metals and Salts Analysis in
Freshwater, Effluent and Wastewater by ICPMS
Environmental Analysis Laboratory
Southern Cross University, Lismore, NSW
Validation Example
15. Steps of Method Validation
• Selectivity
• Linearity
• Matrix Effects
• Sensitivity
• Accuracy
• Precision
• Trueness
• Limit of Detection & Quantitation
• Range
• Ruggedness
• Measurement Uncertainty
16. The following validation data applies to the determination of
13 metals and 6 salts by ICPMS for the following waters:
• Fresh River Water
• Wastewater
• Effluent
The specific elements are:
• Metals - Silver, Arsenic, Lead, Cadmium, Chromium, Copper,
Manganese, Nickel, Selenium, Zinc, Mercury, Iron and Aluminium
• Salts - Calcium, Magnesium, Potassium, Sodium, Sulfur and
Phosphorus
Validation specific details…
17. • Selectivity refers to the ability of a method to discriminate a particular
analyte in a complex mixture without interference from other
components.
The selectivity for ICPMS analysis is based on the choice of isotope. The isotopes
chosen are derived from Standard Methods for the Examination of Water &
Wastewater, 23rd Ed, 2017 (Table 3125:I) and are the same isotopes as those
recommended by the manufacturer.
Three types of interference are common to ICPMS:
• Physical interferences are generally related to viscosity effects and
aerosol transport from the nebulizer.
• Matrix interferences relate to how the matrix affects the energy of the
plasma and what happens when the sample enters the vacuum system
(space charge effect).
• Spectral interferences can be categorised into isobaric (elemental)
and polyatomic (molecular).
1. Selectivity
18. • The linearity of an analytical procedure is its ability (within a given
range) to obtain test results which are directly proportional to the
concentration (amount) of analyte in the sample.
Linearity is determined by the software after calibration and is checked by
comparing the instrument response (y as counts) with the concentration
(x as mg/L) of the calibration standard for which the linear model y = Ax +
b is used.
• Routine metals analysis achieves linearity using 0.01, 0.1 and 1 mg/L
calibration standards.
• Salts (Ca, Mg, K, Na) calibration standards are 20 and 200 mg/L, while S and P
are 10 and 100 mg/L.
• Tables 2-7 in Appendix 1 detail the instrument response and calibration
coefficient for each calibration standard, and the results
of blanks and standards analysed as samples
during the validation work.
2. Linearity
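The linear model y = Ax + b described above can be fitted by ordinary least squares. The counts below are hypothetical instrument responses for the 0.01, 0.1 and 1 mg/L standards, not values from Tables 2-7.

```python
# Least-squares fit of the calibration model y = Ax + b.
def linear_fit(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    a = sxy / sxx                  # slope A (counts per mg/L)
    b = my - a * mx                # intercept b (counts)
    syy = sum((y - my) ** 2 for y in ys)
    r2 = sxy ** 2 / (sxx * syy)    # coefficient of determination
    return a, b, r2

conc = [0.01, 0.1, 1.0]            # mg/L calibration standards
counts = [1050, 10200, 101000]     # hypothetical instrument response
a, b, r2 = linear_fit(conc, counts)
print(r2 > 0.999)                  # True for a well-behaved calibration
```

In practice the instrument software performs this fit; the sketch only shows the model being checked.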
20. • Matrix effects are often caused by the alteration of ionization
efficiency of target analytes in the presence of co-eluting compounds
in the same matrix.
Matrix effects may be identified by the use of standard additions and
internal standards. These effects can be corrected for by use of internal
standards.
• The methods routinely used by EAL utilize Scandium, Rhodium and Terbium as
internal standards, which are prepared from 1000 ppm stock solutions.
• Table 8 lists the concentration of each internal standard and the elements they
are linked to.
• An internal standard response of 75-125% is often acceptable.
• If the response is outside this range samples are to
be diluted by a factor of five and re-analysed.
3. Matrix Effects
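The 75-125% internal-standard acceptance rule above, with the five-fold dilution step for out-of-range samples, can be sketched as:

```python
# Internal-standard acceptance rule: responses within 75-125% of the
# calibration response are accepted; otherwise the sample is diluted
# by a factor of five and re-analysed.
def internal_standard_action(response_pct, low=75.0, high=125.0):
    if low <= response_pct <= high:
        return "accept"
    return "dilute 1:5 and re-analyse"

print(internal_standard_action(98.0))   # accept
print(internal_standard_action(60.0))   # dilute 1:5 and re-analyse
```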
21. Table 8: Internal standards and concentrations
routinely utilized by EAL

Internal Standard   Elements                                       Concentration
Scandium            Ca, Mg, K, Na, P, Al, Cr, Fe, Mn, Ni, Cu, Zn   500 ppb
Rhodium             As, Se, Ag, Cd                                 50 ppb
Terbium             Hg, Pb                                         20 ppb
22. • Sensitivity is often defined as the lowest analyte concentration that
can be measured with acceptable accuracy and precision.
Instrument Detection Limit (IDL) – the constituent concentration that produces a
signal greater than three standard deviations of the mean noise level. It can be
determined from seven replicate analyses of a blank sample.
Method Detection Limit (MDL) or Limit of Detection (LOD) – the constituent
concentration that can be measured and reported with 99% confidence that the
analyte concentration is greater than zero. It can be estimated by analysing seven
blank samples spiked with the analyte at a concentration of 2-5 times the IDL. The
standard deviation of the replicate results is multiplied by three to give the MDL.
Practical Quantitation Limit (PQL) or Limit of Reporting (LOR) – is determined by
multiplying the MDL by 5 and rounding to the nearest decimal unit.
Table 9 Appendix 1 contains IDL, MDL and PQL
obtained after following the above procedure.
4. Sensitivity
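The IDL/MDL/PQL procedure above can be sketched as follows; the seven blank and spiked-blank replicates are illustrative values, not data from Table 9.

```python
import statistics

# IDL = 3 x SD of seven blank replicates; MDL = 3 x SD of seven
# low-level spiked blanks; PQL = 5 x MDL (per the procedure above).
def detection_limits(blank_replicates, spiked_replicates):
    idl = 3 * statistics.stdev(blank_replicates)
    mdl = 3 * statistics.stdev(spiked_replicates)
    pql = 5 * mdl
    return idl, mdl, pql

# Illustrative replicate results, mg/L
blanks = [0.0002, 0.0001, 0.0003, 0.0002, 0.0001, 0.0002, 0.0003]
spikes = [0.0010, 0.0012, 0.0011, 0.0009, 0.0010, 0.0011, 0.0012]
idl, mdl, pql = detection_limits(blanks, spikes)
print(pql == 5 * mdl)   # True
```

The final PQL would then be rounded as described above before being adopted as the limit of reporting.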
24. • Accuracy is the closeness of agreement between the value which is
accepted either as a conventional true value or an accepted reference
value, and the value found.
• Accuracy was determined by analysis of National Measurement
Institute (Proficiency Study AQA 09-18 Samples 1 and 2), Certified
Reference Materials (CRMs) CWW-TM-A and CWW-TM-B and
WaterChek Round 109 and 110 proficiency testing samples, the results
of which were compared to the known or certified value.
5. Accuracy and Precision
25. • Precision of a method is the degree of agreement among individual
test results when the procedure is applied repeatedly to multiple
samplings.
• The metals CRMs and NMI samples were analysed over three (3)
different days (1/2/2011, 21/2/2011 and 7/3/2011) with each sample
matrix analysed on seven (7) consecutive occasions.
Tables 10 – 33 Appendix 1 contain the accuracy and
precision data for the matrices analysed.
5. Accuracy and Precision
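Precision over replicate analyses is commonly summarised as the relative standard deviation (RSD). The replicates below are illustrative values, not the CRM data from Tables 10-33.

```python
import statistics

# RSD (%) of replicate results: sample standard deviation
# divided by the mean, expressed as a percentage.
def rsd_percent(results):
    return statistics.stdev(results) / statistics.mean(results) * 100

# Seven illustrative replicate results for one matrix on one day, mg/L
day_replicates = [0.102, 0.098, 0.101, 0.099, 0.100, 0.103, 0.097]
print(round(rsd_percent(day_replicates), 1))   # 2.2
```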
27. • The spike recovery criterion in many validation guidelines is
typically at least 50%, although 100% is ideal.
• Each matrix (except Minerals 1 and 2) was spiked with a multi-element
metals standard at three different concentrations over the working
range of the calibration. The spike concentrations for routine
metals were 0.005, 0.200 and 0.800 mg/L.
Spiked recovery data is provided in Tables 34 – 56.
6. Spike Recoveries
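Spike recovery for a single spiked sample can be sketched as below; the concentrations are illustrative, not data from Tables 34-56.

```python
# Spike recovery (%): recovered amount (spiked result minus the
# unspiked background) relative to the spiked concentration.
def spike_recovery_pct(spiked_result, unspiked_result, spike_conc):
    return (spiked_result - unspiked_result) / spike_conc * 100

# Illustrative example at the 0.200 mg/L spike level
rec = spike_recovery_pct(0.212, 0.020, 0.200)
print(round(rec, 1))   # 96.0
```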
29. • Trueness refers to 'the closeness of agreement between the average
value obtained from a large series of test results and an accepted value'.
• Trueness/bias can be calculated as the difference between the
analytical result and the certified result for each analyte.
• An estimate of the average bias can be obtained by comparing test
results, generated in different runs over several days, with the known
value.
7. Trueness/Bias
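The average-bias estimate described above (comparing results from several runs with the known value) can be sketched as; the run results are illustrative:

```python
import statistics

# Average bias: the mean difference between test results generated
# in different runs over several days and the certified value.
def average_bias(results, certified_value):
    return statistics.mean(r - certified_value for r in results)

# Illustrative results from four runs on different days, mg/L
runs = [0.101, 0.099, 0.103, 0.100]
bias = average_bias(runs, 0.100)
print(bias > 0)   # True: a small positive bias in this example
```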
30. See sensitivity and refer to Table 9.
8. Limit of Detection and Limit of
Quantitation
31. • Range is the interval from the upper to the lower concentration of the
analyte in the sample over which the method has demonstrated suitable
linearity, accuracy and precision.
• See linearity and Tables 2-7.
9. Range
32. • Ruggedness is a measure of reproducibility of test results under the
variation in conditions normally expected from laboratory to
laboratory & from analyst to analyst.
• Ruggedness can be determined from assessment of CRM results over a
period of time where the analyses are performed by different
operators under different environmental conditions and where
instrument performance may be affected by the condition of the torch
and sample introduction system.
• Table 65 contains results for seven (7) groundwater samples analysed by EAL
and mgt Labmark Melbourne (NATA Accreditation number 1261). Results for
these samples compare exceptionally well.
10. Ruggedness
33. Table 65: Comparison data for samples analysed by
mgt Labmark and EAL – November 2010
34. • Measurement uncertainty is a property of a measurement result, not of
the method, equipment or laboratory, and therefore it is to be
expected that it is assessed only once the result is obtained.
• Measurement Uncertainty (MU) has been calculated using EAL’s
Quality procedure QWORK 17.1 ‘Estimating Uncertainty of
Measurement’.
• This method has been adapted from EURACHEM/CITAC Guide, 2000
and provides an estimate of the combined uncertainty of
measurement.
11. Measurement Uncertainty
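A minimal sketch of combining relative standard uncertainties by root-sum-of-squares, in the spirit of the EURACHEM/CITAC approach cited above. The component values and the coverage factor k = 2 are illustrative assumptions; the details of QWORK 17.1 are not reproduced here.

```python
import math

# Combine independent relative standard uncertainties by
# root-sum-of-squares, then expand with coverage factor k.
def combined_expanded_uncertainty(relative_components, k=2.0):
    u_c = math.sqrt(sum(u ** 2 for u in relative_components))
    return k * u_c   # expanded relative uncertainty U = k * u_c

# Illustrative components: e.g. precision, bias, calibration
components = [0.02, 0.03, 0.015]
U = combined_expanded_uncertainty(components)
print(round(U, 3))   # 0.078, i.e. about 7.8% at k = 2
```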