Refer to http://ilikai.soest.hawaii.edu/sadcp/projects.html for an inventory of and access to the Hawaii Ocean Time-series (HOT) ADCP data (ftp://ilikai.soest.hawaii.edu/caldwell_pub/adcp/INVNTORY/hots.html). Refer to http://www.soest.hawaii.edu/caldwell/index.html for a complete description of the HOT ADCP data, and see the "Availability" file for instructions on how to access the data set.

Shipboard ADCP Center - Introduction

Over the past decade, acoustic Doppler current profilers (ADCPs) have become steadily more common aboard the UNOLS, NOAA, and Navy fleets. During the late 1980s, data quality was limited by the lack of continuous Global Positioning System (GPS) coverage and by uncertainties in the ship's heading information. Over the past several years, however, quality has improved significantly thanks to 24-hour GPS coverage, differential GPS techniques, and the advent of GPS heading sensors. With reliable heading and navigation data, absolute currents in the upper ocean can be determined. The data provide fine resolution in time (~5 minutes), depth (~10 m), and horizontal distance (~2 km) throughout the duration of a cruise.

The growing database allows a fresh view of upper-ocean velocity structure on a variety of temporal and spatial scales. The National Oceanographic Data Center (NODC) has been working for several years on a management scheme for this important new data set and is now ready to share the plan with the scientific community and solicit contributions to the shipboard ADCP archive. A group of shipboard ADCP data producers and users met with data management experts at NODC in May 1992 to discuss the tasks at hand (Firing, 1992). The meeting was convened by Dr. Eric Firing, a professor of oceanography at the University of Hawaii (UH) and a long-standing expert in shipboard ADCP collection, processing, and analysis.
By that time, many of the difficulties in calibrating and obtaining absolute currents had been overcome, and the number of scientific publications using these data was rising steadily. With the sharp increase in installations and attention to shipboard ADCPs in the early 1990s, it became clear that a data management plan was needed to centralize the data set into a well-documented, quality-assured archive and to give the scientific community easy access. Shortly thereafter, the NODC liaison assigned to the TOGA Sea Level Center at UH began collaborating on a part-time basis with Dr. Firing, the NODC data managers, and other ADCP experts to develop an archive strategy.

The primary logistical problem was how to handle effectively a high-density data set consisting of currents and ancillary parameters at the sampling interval at which the data were recorded and processed. It is not merely the volume of data collected on a typical month-long cruise (about 10 Mbytes) that makes this data set complex, but rather the cruise-to-cruise (and intra-cruise) variability of the sampling rates and types of ancillary parameters. These parameters include the date-time group, transducer temperature (and salinity), a variety of diagnostic values, heading information, and navigational data. Moreover, a method was needed for flagging bad values and denoting the depth penetration of reliable data. It was obvious that the traditional flat ASCII file approach was inadequate and that a sophisticated processing and data management system was required to facilitate fast, efficient access to the data. A software package called the Common Oceanographic Data Analysis System (CODAS), designed, documented, and maintained by Dr. Eric Firing and associates at UH, became the focus of attention. The system has been used at UH since 1988 and has been distributed to over 30 agencies in 12 countries.
In addition to the processing tools, this readily available public-domain software provides easy access to the data, with a variety of options for averaging, regridding, and selecting only data that meet specified quality criteria. CODAS stores arrays of flags corresponding to the velocity arrays; thus, the original data are not altered by editing. CODAS is a hierarchical database that uses a "directory file" to keep track of binary "block files." The system is written in standard C, and the package is used primarily on workstations and IBM-compatible PCs. For the binary block files, the software provides translation between machines with different binary numbering conventions and offers a complete ASCII dump.

Because of this flexibility, the NODC decided to adopt the CODAS system for archival of the high-resolution data set. This move advances NODC's goal of being not only an archive center but one that maximizes ease of access for the scientific community using the most up-to-date technology. The NODC now archives the high-density shipboard ADCP data as CODAS files, along with a standard subset of each cruise at hourly and 10 m depth intervals as ASCII files. The CODAS files include current velocities and all ancillary data, while the subset includes only the absolute current velocities, transducer temperature, and ship velocity. For analysis purposes, the standard subset is best suited to synoptic and climatological research, and the high-density set is valuable for fine-scale studies.

The NODC has established the Shipboard ADCP Center (SAC) at UH for the acquisition, review, documentation, archival, and distribution of shipboard ADCP data sets. The activities are overseen by the NODC liaison, and the location takes advantage of close proximity to the ADCP and CODAS experts (Dr. Firing and associates). A network of Sun workstations maintains the archive online and facilitates the archiving steps.
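The flag-array editing idea described above can be sketched in a few lines. This is an illustration only, not CODAS's actual C data structures or binary block-file format: the Profile class, flag codes, and method names below are all hypothetical, chosen to show how quality flags stored alongside velocity arrays allow editing and quality-based selection without ever altering the measured values.

```python
# Hypothetical sketch of flag-based editing, in the spirit of CODAS:
# velocities are paired with a parallel flag array, "editing" sets a
# flag, and selection keeps only bins that meet the quality criterion.
from dataclasses import dataclass, field
from typing import List, Tuple

GOOD = 0            # flag conventions here are illustrative, not CODAS's
BAD_AMPLITUDE = 1
BELOW_BOTTOM = 2

@dataclass
class Profile:
    """One ADCP ensemble: u/v velocity (cm/s) per depth bin, plus flags."""
    time: float                  # e.g., decimal day of the cruise
    u: List[float]
    v: List[float]
    flags: List[int] = field(default_factory=list)

    def __post_init__(self) -> None:
        if not self.flags:
            self.flags = [GOOD] * len(self.u)

    def edit(self, bin_index: int, flag: int) -> None:
        # Editing only marks the bin; the original velocity is untouched.
        self.flags[bin_index] = flag

    def good_values(self) -> List[Tuple[int, float, float]]:
        # Select only (bin, u, v) triples that meet the quality criterion.
        return [(i, self.u[i], self.v[i])
                for i in range(len(self.u)) if self.flags[i] == GOOD]

# Usage: flag a suspect bottom bin, then extract only the good bins.
p = Profile(time=152.01, u=[12.0, 10.5, -999.0], v=[3.2, 2.8, -999.0])
p.edit(2, BAD_AMPLITUDE)               # mark bin 3 bad; do not delete it
assert p.u[2] == -999.0                # raw value survives the edit
assert [i for i, _, _ in p.good_values()] == [0, 1]
```

Because the flags live in a separate array, an edit can always be reversed by resetting the flag to GOOD, which is the property the text emphasizes: the original data are never altered by editing.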
Data producers are encouraged to contribute high-density data sets that have passed the quality-control, calibration, and navigation stages. Metadata (information about the data) are vital for the archive; the SAC can provide guidelines on the types of metadata desired. Incoming data sets are converted to the CODAS format if necessary, reviewed, reduced to a standard subset, documented, and backed up. Data producers will be contacted if suspect features are identified or if additional metadata are required. The data sets will be passed periodically to NODC headquarters, which will act as the final repository, help advertise data availability, encourage submissions from producers, and prepare CD-ROMs for easy distribution of large volumes of the high-density data sets.

Reference:
Firing, E., 1992. Notes from the Acoustic Doppler Current Profiler Workshop at the National Oceanographic Data Center, May 14-15, 1992. Unpublished manuscript; copies available from Mr. P. Caldwell.

Questions or comments: Mr. Patrick Caldwell, caldwell@iniki.soest.hawaii.edu