Hawaii Ocean Time-series (HOT)
in the School of Ocean and Earth Science and Technology at the University of Hawai'i

QUALITY CONTROL/QUALITY ASSURANCE PROGRAM




SUMMARY: The primary objective of the HOT program is to collect and interpret biological, chemical and hydrographic time-series data. In order to provide accurate and reliable data to the oceanographic community, the JGOFS component of HOT has established a quality control/quality assurance (QC/QA) program that is designed to assess and maintain data quality. These QC/QA procedures encompass all aspects of the program from sample collection through data reporting.


1. Introduction

Our QC/QA program is designed to ensure that data of the highest quality are obtained from the HOT program. A fundamental component of this QC/QA program is the documentation of the detailed analytical procedures that are presented in the following chapters. These procedures are consistently applied in our laboratory analyses. This chapter describes the HOT program QC/QA procedures that are independent of the specific analytical protocols presented in subsequent chapters. These QC/QA procedures include field sampling, analytical facilities and instrument maintenance, interlaboratory comparisons and data reporting.

2. Precision and Accuracy

The precision and accuracy of each analytical procedure is discussed in the appropriate chapter. Accuracy is a measure of how close an analyzed value is to the true value. In general, the accuracy of an analytical method is determined by the use of calibrated, traceable reference standards. However, it is important to bear in mind that an assessment of accuracy based upon primary standards can be misleading if the standards are not prepared in seawater, because many of our chemical determinations exhibit matrix (i.e., salt) effects. In addition, it must be recognized that most of the HOT program core measurements (e.g., dissolved oxygen, pH, pCO2, primary production) do not have readily available reference materials.

Precision is a measure of the variability of individual measurements (i.e., the analytical reproducibility), and in the HOT program two categories of replicates are measured: field and analytical replicates. Analytical replication is the repeated analysis of a single sample and is a measure of the greatest precision possible for a particular analysis. Field replication is the analysis of two or more samples taken from a single sampling bottle and has an added component of variance due to subsampling, storage and natural within-sample variability. The variances of field and analytical replicates should be equal when sampling and storage have no effect on the analysis (assuming the analyte is homogeneously distributed within the sampling bottle). Therefore, the difference between field and analytical replicates provides a first-order evaluation of the field sampling procedure. Higher-level variance due to sample bottle replication (multiple bottles on the same cast or multiple casts) is not well resolved in the current HOT sampling protocols.
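To make this comparison concrete, the following minimal sketch contrasts the standard deviations of analytical and field replicates and forms their variance ratio, one simple way to judge whether subsampling and storage contribute measurably to the total variance. The replicate values and their interpretation are invented for illustration and are not HOT data.

```python
import statistics

# Hypothetical nutrient-like replicate values (µmol/kg); illustrative only.
analytical_reps = [2.41, 2.43, 2.42, 2.40]   # repeated analyses of one subsample
field_reps      = [2.44, 2.38, 2.47, 2.35]   # separate draws from one bottle

s_analytical = statistics.stdev(analytical_reps)
s_field = statistics.stdev(field_reps)

# Variance ratio: values near 1 suggest subsampling and storage add little
# variability beyond the analysis itself. Here the field variance is much
# larger, which would point to a subsampling or storage effect.
f_ratio = (s_field ** 2) / (s_analytical ** 2)
print(f"analytical s = {s_analytical:.4f}, field s = {s_field:.4f}, F = {f_ratio:.1f}")
```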

It is apparent from these definitions that precision and accuracy are not necessarily coupled. An analysis may be precise yet inaccurate, whereas the mean of a set of variable results may be quite accurate. Therefore, precision and accuracy must be evaluated independently.
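A small numerical illustration of this distinction, using invented values against a hypothetical "true" value:

```python
import statistics

TRUE_VALUE = 35.000  # hypothetical true value of a reference sample

precise_but_inaccurate = [34.90, 34.91, 34.90]  # tight spread, biased low
imprecise_but_accurate = [34.80, 35.20, 35.00]  # wide spread, unbiased mean

for label, values in (("precise/inaccurate", precise_but_inaccurate),
                      ("imprecise/accurate", imprecise_but_accurate)):
    bias = statistics.mean(values) - TRUE_VALUE
    print(f"{label}: bias = {bias:+.3f}, s = {statistics.stdev(values):.3f}")
```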

3. Overview of the Quality Control/Assurance Program

The basic framework of the HOT-JGOFS QC/QA program addresses field sampling, laboratory facilities, laboratory analysis and data reporting. Quality control in the field is primarily attained by utilizing modern sampling equipment that is properly maintained. The quality of field and laboratory instruments is preserved with appropriate instrument maintenance, periodic calibration and careful documentation procedures. Laboratory analysis QC/QA is evaluated on the basis of periodic review of methodology, variance evaluations (control charts), reference materials (where available) and inter- and intralaboratory comparisons. Quality control procedures associated with data reporting include sample documentation, tracking and evaluation of analytical results relative to that documentation, and comparison of results to historical values.

4. Field Sampling and Strategy

Specific aspects of the time-series field sampling strategy have been presented (see Chapter 2) and will not be repeated here, except to emphasize key aspects of our QC/QA program. Station KAHE (see map in Chapter 1) serves as a representative coastal site for the collection and interpretation of long-term environmental data and as an equipment test station. At Station KAHE, an initial cast is performed using only a weight to test the winch and to inspect the condition of the hydrowire. This test cast is followed by a CTD-rosette cast to 1000 m with a full complement of 24 12-liter water bottles. The latter serves to test the CTD, pylon and deck box, to collect water column samples and to provide a hands-on opportunity for novice members of the scientific party to participate in the deployment and retrieval of the rosette and the collection of water samples. Ideally, samples would be collected and processed exclusively by experienced personnel; however, the HOT program encourages graduate and undergraduate participation and endeavors to combine marine research with marine science education. Consequently, we conscientiously schedule an at-sea training session to ensure that the procedures followed are identical from month to month.

During each HOT cruise, at least 20% of the samples are routinely collected in duplicate or triplicate to evaluate field precision. In addition, salinity samples are drawn from each water bottle sampled, and on-deck sample temperatures are recorded for those casts where oxygen samples are drawn. Both procedures are useful for the identification of sample mistrips (i.e., the collection of water from a depth other than intended).
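As an illustration of the mistrip-screening idea, the sketch below compares the salinity drawn from each bottle with the CTD salinity at the intended trip depth; a large offset suggests the bottle closed at an unintended depth. The bottle values and the 0.020 tolerance are hypothetical assumptions, not the program's operational criteria.

```python
# Minimal mistrip screen: flag bottles whose drawn salinity disagrees
# with the CTD salinity at the intended trip depth.
MISTRIP_TOLERANCE = 0.020  # practical salinity units; illustrative only

bottles = [
    # (bottle number, intended depth in m, CTD salinity, bottle salinity)
    (1, 1000, 34.521, 34.523),
    (2,  800, 34.480, 34.478),
    (3,  600, 34.312, 34.101),  # large offset: bottle likely closed elsewhere
]

for number, depth, ctd_sal, bottle_sal in bottles:
    offset = bottle_sal - ctd_sal
    if abs(offset) > MISTRIP_TOLERANCE:
        print(f"bottle {number} ({depth} m): offset {offset:+.3f}, possible mistrip")
```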

The collection of representative samples is paramount to a successful time-series program and is contingent upon the use of appropriate sampling equipment that is well maintained and operationally sound. A field sampling equipment maintenance program is administered by our Marine Technician. The program consists of a documented inspection of field equipment at regular intervals. A record of repairs, modifications and any other pertinent information is also maintained. In addition, diagrams outlining all sampling equipment and the assembly procedures for the sediment trap and in situ primary productivity arrays, radio direction finder tracking equipment, Argos satellite transmitter and on-deck incubation system have been drawn and are updated as necessary.

Sample collection quality control measures are based upon the concept of applying time-tested oceanographic sampling techniques in a standardized and coordinated manner, supervised or conducted by experienced personnel (details of each sampling procedure are outlined in the following chapters). Specific sampling data are recorded on log sheets at the time of collection, identifying the type of samples collected, cruise, station, time, cast number, sample number and any other pertinent information. These "metadata," along with copies of the CTD console log and property vs. depth plots, are retained in the appropriate HOT cruise notebook. Records are maintained to identify sample tracking from collection through analysis and data reporting. Any problems associated with a particular sample are noted on the appropriate log sheet or data file and are evaluated relative to routine quality control procedures (Fig. 2).

5. Analytical Facilities

All analyses are conducted at the University of Hawaii at Manoa in modern, well-equipped research laboratories. Specialized analytical equipment used in the JGOFS project includes: Packard model #4640 liquid scintillation counter, UIC model #5011 coulometer, Biospherical microprocessor-controlled ATP photometer, Perkin-Elmer model #2400 carbon/nitrogen analyzer, Technicon autoanalyzer and accessories, automatic Winkler titration system consisting of a Brinkmann Dosimat 665, Orion EA 940 ion analyzer and IBM-compatible computer, Guildline Autosal model 8400A salinometer, Antek model #720 nitrogen oxides analyzer, Zeiss epifluorescence, phase contrast and inverted microscopes, Coulter Epics dual-laser flow cytometer, Ionics model #555 carbon analyzer, Beckman DU-640 ultraviolet-visible light spectrophotometer, Spectra-Physics model SP8800 HPLC equipped with Waters model 440 absorbance and model 470 fluorescence detectors, Waters model 990 photodiode array spectrometer, Turner model AU-10 fluorometer, Perkin-Elmer model #LS-5 fluorescence spectrophotometer with data station, Cahn C-31 microbalance and LAL model #5000 optical analyzer.

In addition to the above, the JGOFS laboratories are well equipped with standard laboratory equipment, including fume hoods, analytical and top-loading balances, centrifuges, freezers, refrigerators, volumetric glassware, pipettes, muffle furnaces, pH meters, computers and other general laboratory equipment and glassware. The facilities are maintained to provide optimum conditions for a wide scope of analytical procedures. Quality control measures include service contracts (balances and selected equipment), verification of performance through the use of calibration curves, standards bracketing samples, wavelength verification and calibration, measurement of secondary standards, utilization of NIST Class S weights and NIST-traceable thermometers, and analysis of appropriate known and unknown reference samples. Instrument operating, service and calibration manuals are retained, and the calibration, repair and service history of JGOFS-utilized equipment is documented and retained for use by laboratory personnel.
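As one example of verifying performance through a calibration curve, the sketch below fits a line to a hypothetical set of standards; a slope drifting from its historical value, or a clearly non-zero intercept, would prompt investigation before samples are run. The standard concentrations and responses are invented, and the use of Python's statistics.linear_regression (available in Python 3.10+) is simply one convenient way to compute the fit.

```python
from statistics import linear_regression  # Python 3.10+

# Hypothetical standard concentrations (µM) and instrument responses.
standards = [0.0, 1.0, 2.0, 4.0, 8.0]
responses = [0.02, 0.98, 2.01, 3.97, 8.05]

fit = linear_regression(standards, responses)
print(f"slope = {fit.slope:.4f}, intercept = {fit.intercept:+.4f}")
# Compare the slope against its historical value and check that the
# intercept is indistinguishable from zero before running samples.
```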

6. Chemicals and Reagents

All chemicals and reagents used in our routine sample analyses are of ACS quality or better. Incoming chemicals are marked with the date received and recorded on the chemical inventory sheet. Distilled deionized water (DDW) is used in the preparation of solutions, and the resistivity of the DDW is continuously monitored to confirm purity. New chemicals or reagents are compared to previous reagent performance and are discarded when (1) the expiration date has elapsed or (2) the analytical performance is deemed inadequate.

7. Laboratory Analysis

The specific analytical methodologies outlined in the subsequent chapters have resulted from extensive methods evaluation (Table 1). These procedures are conducted by experienced personnel familiar with oceanographic laboratory protocols and instrumentation. Where applicable, analytical runs include a series of standards bracketing the samples, replicates (>20%), analysis of reference (control) samples, as well as procedural, reagent, refractive index, salt, dilution and time-zero blank corrections. All analytical results are documented, and original hard copies are archived in the appropriate HOT notebooks.

Sample analysis quality assurance relies heavily on replicate analysis and the use of certified reference standards as determinants of precision and accuracy, respectively. Replicates are the primary determinants of variance and, as discussed previously, can be divided into two categories: field replicates, providing a measure of sampling and natural variability, and analytical replicates, providing a measure of the analytical precision. As previously stated, at least 20% of the samples are collected and processed as field replicates. An additional number of analytical replicates are analyzed to evaluate analytical variance. Where appropriate, internal standards are analyzed on selected samples. When necessary, additional quality control measures may include matrix matching, standard additions and comparison of results with independent methodologies. When available, traceable certified reference standards are used to assess the accuracy of each set of determinations.

Analogous to the field preventative maintenance program is an instrument service and calibration program. This program identifies and documents service intervals for balances and other specialized equipment. Because the analytical equipment used in the JGOFS program experiences regular use, its performance is routinely evaluated. If a problem develops, sample analysis is terminated until normal operation is restored. In addition, the dependence of many analytical procedures on the proper and accurate operation of the analytical balance is recognized and evaluated by weighing secondary standards during periods of use and by periodic comparison to NIST-traceable Class S weights. All calibration, repair, modification and service histories are maintained in written logs.

Because relatively small temporal changes are expected in the ambient concentrations of the dissolved and particulate analytes of interest to HOT program scientists, it is of paramount concern to quantify the long-term consistency of the analytical results relative to the accuracy of the primary standards. This can be achieved by routinely analyzing the same reference standard over a relatively long period of time. These need not be certified reference standards; however, the analyte must be temporally stable (ideally for more than one year) in the sample matrix. We have found that frozen (-20 °C) unfiltered seawater samples (for inorganic nutrient analysis), dried pulverized net plankton (particulate carbon, nitrogen and phosphorus) and a pure chlorophyll a standard stored at -20 °C (fluorometric analysis) are adequate for assessing temporal variability.
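A control-chart style check of such a reference material might look like the following sketch, where each new result is compared against warning (2 s) and action (3 s) limits derived from earlier runs. The values are hypothetical; in practice the mean and standard deviation would be established from a well-characterized baseline period.

```python
import statistics

# Hypothetical repeated analyses of one in-house reference material
# (e.g., a frozen nutrient sample), in µmol/kg.
baseline = [2.41, 2.43, 2.40, 2.44, 2.42, 2.39]   # runs used to set limits
new_runs = [2.41, 2.52, 2.43]                     # subsequent periodic checks

mean = statistics.mean(baseline)
s = statistics.stdev(baseline)

for i, value in enumerate(new_runs, start=1):
    deviation = value - mean
    if abs(deviation) > 3 * s:
        status = "action limit exceeded: halt analyses"
    elif abs(deviation) > 2 * s:
        status = "warning limit exceeded: investigate"
    else:
        status = "in control"
    print(f"check {i}: {value:.2f} ({deviation:+.3f}) {status}")
```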

A great asset to any analytical quality control and assurance program is participation in interlaboratory programs, which allow an independent evaluation of analytical quality and performance relative to participating laboratories. The HOT program has had the opportunity to participate in the following intercomparison studies:

  • NSF-JGOFS intercalibration of plant pigments coordinated by R. Bidigare and M. C. Kennicutt of Texas A&M

  • ICES Marine Chemistry Working Group-sponsored intercomparison of inorganic nutrient analyses coordinated by D. Kirkwood, United Kingdom

  • monthly total CO2 intercomparison with C. D. Keeling, Scripps Institution of Oceanography

  • periodic total CO2 intercomparison with P. Quay, University of Washington

  • periodic dissolved oxygen intercomparison with S. Emerson, University of Washington

  • periodic dissolved oxygen intercomparison with Omar Calvario-Martinez, Instituto de Ciencias del Mar y Limnología, Estación Mazatlán

  • periodic salinity intercomparisons with C. Collins, Naval Postgraduate School

  • methods intercomparison with the Bermuda Atlantic Time-series Study, Bermuda Biological Station for Research

  • DON intercomparison coordinated by C.S. Hopkinson, Jr., Marine Biological Laboratory, Woods Hole, Massachusetts

  • DOC intercomparisons coordinated by J. Hedges, University of Washington, and J. Sharp, College of Marine Studies, University of Delaware

8. Data Evaluation and Reporting

Data evaluation and reporting are the final steps in the quality control process and comprise an essential part of the quality assurance program. Here the data are reviewed in the context of the entire sample collection, storage and analytical process. Discrepancies or anomalous results are noted at various stages of the analytical process (Fig. 2), and the final data are evaluated for correctness of analysis by plotting the analyte profile vs. depth and density and investigating those points outside the historic data envelope. Data outside the historic envelope are not automatically flagged as "bad," but rather investigated for the source of the problem through the sample documentation. If the problem remains unidentifiable, the data are flagged "questionable" if the values are outside the 95% confidence interval (greater than 2 standard deviations from the historical mean) and "good" if within this error envelope. If a source for the discrepancy is discovered, the data are flagged "bad." At this point all data inside the historic envelope are flagged "good" and, together with the "questionable" data, added to the historic data set. Finally, all the data are summarized and reported in our annual data report along with the appropriate quality flag.
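The flagging logic described above can be summarized in a short sketch, assuming the historic envelope at a given depth or density level is characterized by a mean and standard deviation. The data values, and the reduction of the documentation review to a single boolean, are simplifying assumptions for illustration.

```python
import statistics

def flag_value(value, hist_mean, hist_sd, problem_found=False):
    """Return 'good', 'questionable' or 'bad' for one analytical result."""
    within_envelope = abs(value - hist_mean) <= 2 * hist_sd  # ~95% interval
    if within_envelope:
        return "good"
    # Outside the envelope: 'bad' only if the documentation review found a
    # cause; otherwise the value is retained and flagged 'questionable'.
    return "bad" if problem_found else "questionable"

historic = [34.10, 34.15, 34.08, 34.12, 34.11]  # prior values at this level
m, s = statistics.mean(historic), statistics.stdev(historic)

for new_value in (34.12, 34.40):
    print(new_value, flag_value(new_value, m, s))
```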
