School of Ocean and Earth Science and Technology
University of Hawaii at Manoa, Honolulu, Hawaii
JIMAR Contribution 98-315
This report describes the principal contribution of the University of Hawaii School of Ocean and Earth Science and Technology (UH/SOEST) to the Early Detection and Forecast of Tsunamis Project (EDFTP) initiated in 1996 and funded by the Defense Advanced Research Projects Agency (DARPA) as one component of the Pacific Disaster Center (PDC). The purpose of the UH/SOEST effort is to establish and maintain a database of high quality, rapidly-sampled sea level observations from existing shoreline gauges at the coasts of the major Hawaiian Islands. The database is intended for validation and calibration of new generations of models to be used at PDC for determining tsunami wave propagation across the Pacific to Hawaii. The archive will be referred to as the "Archive of Rapidly-Sampled Hawaiian Sea Level (ARSHSL)." ARSHSL is accessed via anonymous ftp to ilikai.soest.hawaii.edu; then go to directory /arshsl/. This archiving activity has already contributed a number of records to the database of the 10 June 1996 tsunami generated near the Andreanov Islands (Eble et al., 1997).
The ARSHSL has the additional purpose of providing a dataset with which to test models of non-seismogenic infragravity wave propagation. In the absence of tsunamis, the background sea level fluctuations (infragravity waves at periods of ~1-10 minutes) appear to be due to sources at continental coastal sites (Filloux et al., 1991). Consequently, these fluctuations could also be used to validate tsunami propagation models, since tsunamis are essentially larger amplitude infragravity waves. Whether the background infragravity wave signals can be used for tsunami model validation is an active research question being pursued under separate NOAA funding.
Establishing this archive required substantial assistance from numerous individuals at the Pacific Tsunami Warning Center (PTWC), PDC and several campuses of the University of Hawaii. All of this assistance, both manpower and access to computers and telephone connections, was provided gratis. Acknowledgments are made throughout the text, and are summarized at the end of the report. We cannot overstate our gratitude for the help we have received to bring this archive to fruition. This project has been a model of how government agencies help advance environmental research while performing their critical missions, and of how University of Hawaii employees generously perform services to others beyond their stated job responsibilities.
Figure 1.(18 kB) Locations of stations from which rapidly-sampled sea level data are being archived. Ocean depths around Hawaii are displayed as well.
Figure 2.(14 kB) Communication links established for acquiring rapidly-sampled sea level data. (Microwave and satellite links were created by PTWC and NOS, respectively.)
Figure 3.(7 kB) A piece of analog data from Honokohau, displaying several problems encountered in the early analog data: multiplicity of periodic spikes; data gaps; and reference level shifts.
Table 1.(16 kB) Hawaiian Island Stations Being Archived
Table 2.(6 kB) Directory Organization of the ARSHSL on Anonymous `ftp' at ilikai.soest.hawaii.edu
Automated procedures have been established for daily acquisition of sea level data from existing gauges maintained by the Pacific Tsunami Warning Center (PTWC) and the National Ocean Service (NOS). Rapidly-sampled (5 second to 2 minute sampling interval) data are downloaded from these gauges and subjected to quality control procedures prior to archiving in an electronically accessible location. Data are acquired either by a direct telephone connection from UH/SOEST to the gauge, or by a telephone connection to PTWC for data that are transmitted to PTWC via microwave link. To avoid long-distance telephone charges to the outer islands of Kauai, Maui and Hawaii, internet links were established to computers with modems on those islands. In one case (Kauai), this required physical placement of a UH/SOEST computer at an appropriate site on the island.
Prior to the archiving activity described here, rapidly-sampled data from the existing gauges were not archived in a consistent manner, if at all. Occasionally, in the event of a significant tsunami, interested professionals might acquire the rapidly-sampled data if they acted quickly. Typically, the most rapidly sampled data are stored at the gauge for only a few days before being overwritten. For the NOS gauges at least, 6-minute samples are transmitted via satellite to several archiving centers, including the UH Sea Level Center (UHSLC). But these data are less useful for studying tsunami signals, which can vary significantly over a few minutes.
Our data rescue activity ensures that even data containing weak tsunami signals, which are valuable for model validation despite their small amplitude, are archived in a consistent manner. As a serendipitous by-product of our daily access to each gauge, we can alert the relevant agencies to gauge failures, thus ensuring that the largest possible network of stations will be on-line when a tsunami occurs.
This activity is intended to provide as complete a dataset as possible of tsunami and infragravity wave fluctuations around the Hawaiian Islands for the purpose of verification and calibration of numerical models of tsunami and infragravity wave propagation. Providing the Hawaiian Islands with accurate forecasting of tsunami magnitudes at Hawaiian shores is a long-term goal.
The archive is currently maintained online at UH and is available to interested researchers. ARSHSL is accessed via anonymous ftp to ilikai.soest.hawaii.edu; then go to directory /arshsl/.
Figure 1 and Table 1 summarize the current status of ARSHSL. Table 1 lists the following: station names and positions; an ID for the agency that owns and maintains the gauge (either NOS or PTWC); the kind of sensor; the sampling interval (dt) in seconds; the method we use to access the data; the date the communication link was established for initiation of data acquisition; and, the date the communication link became stable, permitting a clean data stream. Details of the data acquisition are in subsequent sections.
As of March, 2000, twenty separate gauges are supplying data for the archive. The NOS gauges provide both 1-minute and 6-minute data through different data streams, with the exception of Mokuoloe on Oahu and Kawaihae on Hawaii, from which we extract only 6-minute data at this time. The modems for these two gauges are attached only to cellular telephones, and downloading the 1-minute sea level data via cellular phone calls is prohibitively expensive.
At two locations, gauges maintained by both NOS and PTWC exist in close proximity. These data provide an a posteriori check on the accuracy of the NOS versus PTWC sensors.
Improving tsunami height predictions will require advances in all aspects of the problem: source waveform estimation, deep ocean propagation, nearshore propagation and amplification, and onshore run-up. Current tsunami warning systems are based on simple observational criteria with little or no real-time simulations attempted. At the PTWC in Honolulu, for instance, an interconnected global network of seismic gauges provides the PTWC with earthquake magnitude and epicenter information within 20 minutes after earthquake occurrence. If the epicenter is within the Pacific, a travel time map based on the expected tsunami wave propagation speeds is calculated. Thereafter, real-time data from gauges on islands between the epicenter and Hawaii are monitored. If the tsunami has a height over a certain threshold at these islands, a tsunami `watch' or `warning' is issued. There is no attempt to calculate numerically the propagating waveform or its eventual magnitude at Hawaii.
Why is this prediction method so rudimentary, relying so little on calculated waveform propagation, when it has been possible for over 20 years to calculate reasonably accurately the magnitude of at least the first couple of wave cycles across the Pacific (Houston, 1978)? To paraphrase an old saying, the Devil's in the details. First, the waveform at the source depends critically on the nature of the earthquake and the structure and amplitude of the seafloor deformation. This information is generally not available within any reasonable time after the earthquake. To offset this lack of information, island sea level gauges and bottom pressure recorders (BPRs) attached to real-time telemetry systems can provide estimates of the tsunami waveform reasonably close to the source if they are distributed appropriately throughout the Pacific. A major gap in the near-source observational network is to be filled by a real-time telemetry system that the Pacific Marine Environmental Laboratory (PMEL) has been funded (in part through EDFTP) to test along the Alaska-Aleutian Seismic Zone (Bernard, 1995; Milburn et al., 1996).
Once the near-source sea level waveform is known, the open ocean propagation of the tsunami, as well as the calculation of the tsunami up to the shore, is straightforward in principle. But variations in seafloor depth, and reflection from boundaries near the source, affect dispersion and refraction, so that, given our mediocre knowledge of ocean bathymetry, it is usually not possible to predict accurately more than the heights of the first two waves of the tsunami wave group, while the highest wave of the group could be the third, fourth or fifth wave. More accurate bathymetry available through the PDC should help improve numerical calculations of the waveforms into shallow water around Hawaii.
The nearshore tsunami waveforms can be described as the modification of the open ocean waveform just offshore by the nearshore transfer function. Calculations of transfer functions around the Hawaiian Islands (e.g., Bernard and Vastano, 1977; Houston, 1978) have shown that there is a strong dependence of the transfer function on frequency, location on the island perimeter, and angle of incidence. Well-sampled sea level data during tsunamis is scarce. And, with the relatively small resources available today, it is impossible to instrument the Hawaiian Islands at the 1 km resolution really needed to accurately define the transfer functions around the islands. Furthermore, observations of the tsunami waveform just offshore of the islands do not generally exist, so that calculation of the transfer functions usually proceeds with a numerically-generated offshore waveform as input. This ultimately confuses the issues of the accuracies of the open ocean propagation calculation versus the nearshore propagation calculation, although the latter is considered the more difficult problem.
Initially, a profitable approach will be simply to use the coastal data as ground truth for the entire numerical calculation from source region to island shore (e.g., Houston, 1978). Currently, more than 20 gauges at the shores of the Hawaiian Is. are available to provide data that can be used to test the accuracy of the modeled sea level heights just described. The purpose of the present archiving activity is to ensure that the rapidly-sampled sea level data from these gauges is systematically archived, preserving weak tsunami signals as well as larger events, for eventual employment in validating numerical models of tsunami propagation.
Unfortunately, it is essentially impossible to instrument the entire coastline at the ideal (better than 1 km) resolution for model validation, while waiting for the infrequent tsunami to arrive, in order to completely test the model predictions. If we had at our disposal a device to generate small tsunamis at regular intervals, then we could deploy a moderate number of gauges successively around the islands, building up a dataset to fully test the spatial variations of the model-predicted tsunami waveform at the shore, given a variety of incidence directions. In fact, such a generator may exist. The sea level data from Hawaii we're archiving will also be used to test the potential value of such `surrogate tsunamis' in validating tsunami propagation models.
Filloux, Luther and Chave (1991) found evidence from a year-long, sparse array of deep ocean BPRs in the North Pacific that the far-infragravity wave band (approximately, 3-20 minutes period, within the tsunami period band) is dominated by oscillations that appeared to be emanating from a single continental coastal location (around Vancouver Is.) from Fall through Spring, 1986-1987. That the energy in the infragravity wave band (0.5 to 3 minutes) is commonly generated by non-linear interactions among shorter period swell at the coast is now well known (e.g., Herbers et al., 1995). And an array of BPRs deployed off California for three weeks in November, 1988, indicated a strong source of 3-8 minute far-infragravity waves around Vancouver Is., with less energetic sources around Pt. Conception, Calif., and the tip of S. America (S. Webb, personal communication, 1992).
If analyses of existing data from our other BPR arrays in the North Pacific confirm the existence of a single source region (or, at least, a single strong source region for each finite time period), and if these sources have sufficient energy to be observed by rapid-sampling sea level gauges around the Hawaiian Islands, then we have potentially the surrogate tsunamis we need to adequately test open ocean to nearshore propagation of far-infragravity (hence, tsunami) waves.
Under separate funding, the coastal locations and source strengths of far-infragravity wave energy will be determined with existing BPR data from the North Pacific. The sea level data taken from the rapid-sampling Hawaiian gauges will also be analyzed to determine the relative energy levels of the remotely-generated far-infragravity waves versus any local Hawaiian generation of these waves as might occur during heavy swell periods.
An ancillary benefit of this effort to understand the `background' far-infragravity wave band may be to eventually permit prediction and subtraction of this `noise' from other Pacific gauges, thus enabling better detection of tsunami signals from other locations.
The procedures we've established for obtaining rapidly-sampled sea level data from gauges owned and operated by the PTWC and NOS are described in this section. A diagram of the data acquisition communication links we've established is provided in Fig. 2. The elements of Fig. 2 are described throughout this section. For details on our quality control procedures for the acquired data, see section 4.
The data from each dial-up tide gauge are automatically acquired daily. A SOEST computer downloads the data over the telephone lines by calling in to each tide gauge's modem. The instruments located on Oahu are called directly. On the outer islands (Kauai, Maui & Hawaii), a computer within the local telephone region is used as an intermediary. The SOEST computer logs onto the local computer over the Internet, then dials out to the tide gauges on that island. In this manner all the data are generally acquired without making any expensive long distance telephone calls. Mr. Bruce Turner and Mr. Richard Nygard at PTWC facilitated our establishment of these communication links.
Our access to PTWC tide gauges is governed by the following guidelines, as agreed upon with PTWC: we do not release the phone numbers of the tide gauges to any other persons or agencies; we minimize the number of times we download the data in a given day, ideally restricting it to once per day; and, we access only one station at a time so we do not unduly interfere with PTWC's access to its own instruments. Our access to the NOS instruments comes under the umbrella of the access granted by PTWC, and we follow the same guidelines as for the PTWC instruments.
The procedure is automated with a series of shell scripts. An overall C shell script is run as a scheduled job on the SOEST computer. That script executes a C shell script for each of the tide gauges in turn; these per-gauge scripts use additional scripts written in Expect to drive C-Kermit sessions and perform the actual downloads. Expect, an extension of the Tcl scripting language designed to automate programs that normally require user interaction, is distributed by the National Institute of Standards and Technology and is available for public download. C-Kermit is a communications software package distributed by Columbia University.
The shell scripts for downloading the data are written in such a way that the only software needed on the intermediate computer is a C-Kermit program. The intermediate computer is used for remote access only; the controlling scripts and actual data storage occur on the SOEST machine, so the impact on the intermediate computer is minimal. In addition, this keeps the phone numbers of the tide gauges under our control since they do not actually reside anywhere except on the SOEST computer.
Beginning at 1:00 AM, the tide gauges are each queried in turn. The data for the previous 48 hours are downloaded from the instrument and captured by the SOEST computer in a log file. As soon as the download for a station is finished, the raw log files are converted into data files. If the downloading procedure is unsuccessful, then no data file is created. At 10:00 AM each day, a second download is automatically attempted, if necessary. In this case, the only tide gauges queried are those for which no data file was created during the earlier download.
Each day after the two automated acquisition attempts, the sizes of the resulting data files are inspected. The file size has been found to be a reliable initial indicator of the success or failure of a download. If a data file is short, the original log file created during the download is inspected to determine the cause. If there are indications of interrupted data transfer or the cause is unclear, a manual download is attempted. By default, each download attempts to acquire a minimum of 48 hours of data. This provides a failsafe in case no successful download can be made on any one day.
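The file-size inspection described above can be sketched as a simple screen: compare each file's size against the size expected for 48 hours of data at the gauge's sampling interval. This is an illustrative sketch only; the function names, record size, and tolerance are hypothetical, not the values used in our actual scripts.

```python
import os

def expected_bytes(hours, dt_seconds, bytes_per_record):
    """Expected file size for a download window: one fixed-width record
    per sample (bytes_per_record is a hypothetical format parameter)."""
    return (hours * 3600 // dt_seconds) * bytes_per_record

def needs_retry(path, hours=48, dt_seconds=120, bytes_per_record=20,
                tolerance=0.9):
    """Flag a download for retry if its data file is missing or falls
    well short of the expected size (tolerance is illustrative)."""
    if not os.path.exists(path):
        return True
    return os.path.getsize(path) < tolerance * expected_bytes(
        hours, dt_seconds, bytes_per_record)
```

A file that passes this screen still goes through the visual quality control described in section 4; size alone only establishes that the transfer completed.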
Several factors can prevent a successful download. For the Oahu instruments the only such factors that have proven significant are flushing of a gauge's data buffer (described below) and actual problems with the instruments themselves. For instruments on the outer islands, there are the additional factors of the network connection between the islands, and the access to (or online status of) the intermediary computers and their modems. Problems associated with the use of an intermediate computer can be circumvented by downloading the data via a long distance phone call from the SOEST computer directly to the tide gauge. For the obvious reason that this incurs additional costs, such calls are kept to an absolute minimum and rarely have been necessary. Subsequent to the full implementation and testing of the current downloading procedure, instrument problems and buffer flushing events appear to have been the predominant causes of data loss.
Actual instrumental problems are beyond our control and have been responsible for several breaks in data coverage. As soon as we deduce that a download log file, or series of log files, indicates a problem with the gauge itself, we contact the agency responsible for maintenance of the instrument.
Instances of an instrument's data buffer flushing have been rare, but have produced several short gaps in data coverage. The data buffer of each instrument is large enough to hold roughly four days of data. It appears that if the instrument is asked for more data points than the buffer contains at that time, it will mistakenly clear the buffer after the download is completed. Some flushing events appear to have been caused by an agency other than ourselves querying the instrument between our daily downloads.
When a buffer is flushed, the data between our last download and the flushing event are lost. If the flushing event happens shortly before our automated download, our download request for 48 hours of data will again flush the buffer. In that case, however, if the system is left alone no additional data will be lost. The buffer will fill up until our next download request 24 hours later. This period of time appears to be sufficiently long such that a request for 48 hours of data will not cause the instrument to clear the buffer. If, however, additional download attempts for 48 hours of data are made prematurely, the buffer will simply be flushed repeatedly.
This buffer-flushing problem is what has so far prevented us from automating the additional daily download attempts that may be needed when a daily data file is undersized. The procedure for downloading data from the NOS gauges is currently being modified to check the number of available data points before downloading, in order to avoid instigating a buffer flush. A similar modification to the PTWC procedures is not possible, because the PTWC gauges do not report the number of records in the buffer, only the number of records since the last download.
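The flush-avoiding request logic for the NOS gauges amounts to clamping the request to the buffer's current contents. The sketch below illustrates the idea with a 2-minute sampling interval; the constants and function names are illustrative, not taken from the actual procedures.

```python
DT_SECONDS = 120                                  # 2-minute sampling interval
BUFFER_CAPACITY = 4 * 24 * 3600 // DT_SECONDS     # roughly four days of records

def records_for_hours(hours, dt_seconds=DT_SECONDS):
    """Number of records spanning a download window."""
    return hours * 3600 // dt_seconds

def safe_request(available, desired):
    """Never ask for more records than the buffer currently holds,
    to avoid triggering the flush behavior described above."""
    return min(available, desired)

# Illustration: if the buffer was flushed six hours ago it holds only
# ~180 two-minute records, so the usual 48-hour request (1440 records)
# is clamped to 180 rather than risking a second flush.
```

With this check in place, a flush event costs at most the data accumulated since the previous download, rather than cascading into repeated flushes.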
The download procedure was first automated for the Oahu instruments. The local phone connection, incurring no long-distance phone charges, permitted experimenting with the download procedure until all failure modes were well defined. The use of Expect for the scripts that handle the actual download interaction with the tide gauges allowed graceful handling of the various failure modes.
Access to the tide gauges on the island of Hawaii was accomplished with the help of Dr. Richard Crowe in the Department of Physics at the University of Hawaii, Hilo. Dr. Crowe has allowed SOEST free access to one of his computers for use as an intermediate host. To facilitate establishment of the communication link, we installed a modem and upgraded the operating system on Dr. Crowe's computer. Phone line access is provided gratis.
For access to the Maui instruments, we installed a modem on an existing computer at the Pacific Disaster Center (PDC). Unfortunately, the first half of 1997 was a period when the computer systems at PDC were in a state of flux as computers were moved into new quarters, security firewalls were established or altered, and communication lines were switched. Each of these events resulted in temporary disruptions of data acquisition, with subsequent loss of data. PDC personnel (especially, Mr. Richard Flagg, Mr. Joe Hoosty, Dr. Dexter Ishii and Mr. Frank Kish) were very responsive to our data collection efforts and re-established our links to Maui's gauges as quickly as circumstances permitted.
The establishment of a communication link over the network to Kauai was more complicated than for Maui and Hawaii, requiring that SOEST supply and install the intermediate computer. When we were unable to find a suitable computer already online, Kauai Community College (KCC) agreed to let us add a computer to their existing network. Working with a KCC computer specialist, Mr. Tom Kajihara, we installed a UHSLC-owned SPARC 2 workstation on Kauai, with network and phone line access supplied by KCC at no cost to this project. This means SOEST has administrative authority over that computer, as opposed to borrowing available space on computers administered by other agents as on Maui and Hawaii.
We are planning a local download, with storage of one or more months of data on the Kauai computer. This will prevent any loss of data should a breakdown of the network connection to Kauai outlast the capacity of the tide gauge's data buffer. To date, the communication with Kauai has been as stable as with the other islands, and little data has been lost due to communication difficulties.
The PTWC downloads data from 7 of its gauges directly via microwave links (Table 1 and Fig. 1). With the help of Dr. Robert Cessaro at PTWC, we were able to establish an automatic procedure, scheduled on a PTWC computer every afternoon, which transfers the data to a SOEST computer via network `ftp'. The analog files are automatically processed the following morning as part of the procedure scheduled to gather the data from the dial-up digital tide gauges previously discussed. The binary files are unpacked and the sea level values are calculated using conversion factors provided by PTWC.
The NOS 6-minute data is transmitted daily from each gauge to the UH Sea Level Center (UHSLC). The majority of the data remains in the UHSLC directories, and 2-3 times per week a script is used to strip out the Primary Water Level data for inclusion in the ARSHSL. This process will soon be added to the scheduled procedure and will automatically occur daily.
In the event that a valid data file is not created for one of the dial-up digital tide gauges, the log file for the automatic download is inspected to determine the cause of the problem. Frequently the problem is a transient phenomenon and an additional manual download is all that is required to recapture the data. Some examples of transient communication failures that have occurred are a busy telephone line to the gauge, an interruption during the phone connection, a network breakdown between the SOEST computer and the intermediate computer on another island, and an intermediate computer or its modem being offline or otherwise unable to respond.
If problems with the gauge itself are indicated, the appropriate agency in charge of the instrument is notified. Since as many as two downloads per day are scheduled in order to acquire a good increment of data, problems with the gauges are discovered almost immediately. On three separate occasions so far, we have been able to clearly identify instrumental problems within days of their occurrence. In the case that a gauge itself is down, there is no way to recover the data.
Several times, problems with communication between a gauge and an intermediate computer have resulted when the modem belonging to the local computer hung in a non-responsive state. In these cases, the problem has been resolved by calling the person in charge of the computer and having the modem power-cycled. Network blackouts are obviously beyond our ability to control or predict; however, they rarely last more than a few hours. Since we do not own most of the intermediate computers used for downloading the outer island data, and since administrative changes and the online status of those computers are beyond our control, these factors have caused problems with acquiring the data on several occasions.
In all cases where the communication breakdown is in the network connection or the intermediate computer, there is a window roughly four days in duration before the instrument's buffer begins to lose data we haven't downloaded. If the communication problem cannot be corrected within that time frame, the data are downloaded directly to the SOEST computer with a manual download via a long distance phone call. Because of the cost involved, such instances are kept to a minimum and have rarely been necessary. With the system as it stands, only a breakdown of the instrument itself, or a breakdown in telephone communication directly to the instrument that lasts longer than four days, will result in an unrecoverable loss of digital tide gauge data.
There are occasional days where little or no analog data is received from PTWC. At this point the options for recovering such missing data are limited. PTWC does not retain the data on their system after the day of acquisition, although they do back up the data to tape daily. These backup tapes are the only way to recover missing data, and have been used to fill in a number of data gaps. However, creation of the backup tapes is necessarily a relatively low priority activity at PTWC and some backups are missed. Because of concerns over the quality of the data (see section 4.3), a full accounting of the missing days, and recovery attempts from PTWC data tapes, has not yet been made.
Our philosophy has been one of minimal editing, in order not to inadvertently remove what may eventually be realized are real signals. We have not subtracted the tides or removed low-frequency signals from the archived data, since the low-frequency sea level variability, which indicates changes in total water depth, impacts the refraction of infragravity (tsunami) waves near the coast, and therefore is required information for accurate modeling. Contingent upon additional funding, we will produce a complementary archive of rapidly-sampled sea level from which the tides and low-frequency variability has been removed.
Sea level is measured by several different kinds of gauges. A brief description of each instrument type is presented here. More detailed descriptions are presented in Wyrtki et al. (1988) and Gill et al. (1992).
The 2-minute digital PTWC data accessed via the phone lines are measured with an incremental shaft encoder (ENC) and recorded with a Limited Automatic Remote Collector (LARC) Handar 550. This incremental shaft encoder measures the sea level using a float connected to a perforated stainless steel tape and balanced with a counterweight in a stilling well. Rotations of the magnetic incremental shaft are converted to electrical impulses and an instantaneous value is stored by the LARC every two minutes.
The 5-second analog PTWC data is collected with a mixture of pneumatic (bubbler) gauges and pressure transducers. A bubbler gauge measures the pressure required to keep a submerged bubble orifice full of gas, either pressurized air or nitrogen. The pressure transducers simply produce an analog voltage that is proportional to the surrounding water pressure. The real-time data streams of both of these instrument types are instantaneously subsampled and recorded at 5 second intervals.
The NOS 1-minute and 6-minute data are both acquired with Next Generation Water Level Measuring System (NGWLMS) instruments utilizing Aquatrak acoustic sensors. These NGWLMS instruments were designed by NOS (Gill et al., 1992) and use the transmission time of a short acoustic pulse to measure the sea level height. The instrument samples once per second. The 1-minute sample is the average of the preceding 57 one-second data samples (three seconds is required to calculate and store the average). The 6-minute sample is the average of three minutes of data (181 points) centered on the time reported for the average.
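The two NGWLMS averaging schemes just described can be stated compactly in code. This is a sketch of the arithmetic only, assuming the indexing conventions implied by the text (57 samples per 60-second block for the 1-minute value; 181 samples centered on the reported time for the 6-minute value); the function names are ours.

```python
def one_minute_sample(seconds_block):
    """1-minute value: mean of the first 57 one-second samples in a
    60-second block (the final 3 seconds are used by the instrument to
    calculate and store the average)."""
    assert len(seconds_block) == 60
    return sum(seconds_block[:57]) / 57.0

def six_minute_sample(seconds, center_index):
    """6-minute value: mean of 181 one-second samples (three minutes of
    data) centered on the index of the reported time."""
    window = seconds[center_index - 90: center_index + 91]
    assert len(window) == 181
    return sum(window) / 181.0
```

Note that the 6-minute value is not a decimation of the 1-minute stream: it is an independent average over its own centered window, which is one reason the two data streams are archived separately.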
All downloaded data are visually inspected. Data plots are examined for large magnitude differences between successive points, as well as time differences, general oddities, and reference level shifts. The daily acquisition of 2 days of data from the dial-up digital gauges permits cross-checking of log and data files to determine the origin of the infrequent bad points and gaps. Often, the bad points or gaps exist in only a single data file, permitting correction with the data from another data file or the log file (see section 3.1 for the distinction between data and log files).
On occasion, erroneous sea level values of 0.000 are numerous in a data file. In this case the data file for the entire daily download is ignored (although the original download log file is kept). Since 2 days of data are downloaded each day, ignoring just one day's download does not cause a gap in the record, since the missing day is automatically recovered from the subsequent download.
Irreparable gaps in data coverage and `unusual' features in the data are documented for each station. This information appears in a file in each station's data directory(ies) in the ARSHSL, along with information about what outliers have been deleted and replaced with the code `NaN' (Not a Number).
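The screening steps described above (large successive-point differences, erroneous 0.000 values, NaN replacement of outliers) can be sketched as follows. The thresholds and function names here are hypothetical illustrations; in practice the inspection is visual and the editing decisions are made case by case.

```python
NAN = float('nan')

def flag_spikes(values, max_jump=0.5):
    """Replace any point whose jump from the previous valid point exceeds
    max_jump (a hypothetical threshold, in meters) with NaN."""
    out = list(values)
    for i in range(1, len(out)):
        prev = out[i - 1]
        # prev == prev is False only when prev is NaN, so flagged points
        # are not themselves used as a reference for the next comparison
        if prev == prev and abs(out[i] - prev) > max_jump:
            out[i] = NAN
    return out

def too_many_zeros(values, max_fraction=0.05):
    """Reject a file in which erroneous 0.000 readings are numerous
    (max_fraction is an illustrative cutoff)."""
    zeros = sum(1 for v in values if v == 0.0)
    return zeros > max_fraction * len(values)
```

Because two days of data are downloaded each day, a file rejected by a screen like `too_many_zeros` costs nothing: the same day is recovered from the overlapping download, exactly as described above.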
Initially, the analog data was extremely noisy, containing spikes, gaps in coverage, sudden shifts in the reference level by up to several times the signal's amplitude range (Fig. 3), as well as occasional duplicate time stamps with differing sea level values. Early attempts at an automated quality control procedure were not successful, and determination of editing criteria to produce a final usable product was postponed while the digital tide gauge procedure was being stabilized. Analog data acquisition was continued, but no editing of the data took place.
In December of 1997, PTWC identified intermittent channel switching events occurring in the analog data streams. The crossed channels resulted in data from multiple instruments appearing in each individual instrument's data records and produced the noise, spikes and reference level shifts described above. PTWC corrected the problem and the data quality improved dramatically. Several additional minor refinements during 1998 have further increased the data quality and at this point the data we receive is reasonably clean, requiring only a modest amount of editing to produce a usable product.
The nature and derivation of the time information in the processed analog data requires description. The raw analog data is received in individual binary files nominally containing one hour's worth of data sampled every 5 seconds. The data files contain only sea level values and do not explicitly contain any time information. In practice, up to several hours' worth of data are occasionally broken into a number of files of varying length, sometimes with gaps in data coverage between the files.
The sample time of the first data point is derived from the filename, which contains an epochal time stamp. This filename timestamp does not, however, correspond directly to the actual time of the first data point, but rather to the beginning of a 5 second interval that encompasses the sampling time of the first data point. An ancillary data file accompanies each day's suite of sea level data files and correlates the name of each raw data file with the actual time of the first data point.
The processed analog data files in the ARSHSL are formatted to reflect the frequency and accuracy of the data value timestamps. The Julian day and year values are entered into the processed data files only for the first point in each raw data file. This information is taken directly from the ancillary file when it is available. The remaining date information fields in each processed data file are filled with a marker value of -1. This is done to avoid disguising any unidentified gaps in data coverage with derived (as opposed to original data) timestamps that appear accurate.
Because data suites and/or ancillary files are occasionally incomplete, it is sometimes necessary during processing to use the timestamp in a data file's name to estimate the sampling time of the first data value. If the ancillary file does not contain a given initial sample time, the starting time of the data file in question is calculated from the starting time and the number of data points contained in the preceding file. This calculated time is compared to the time described by the current filename. If the current filename timestamp is between 0 and 5 seconds (one sampling interval) earlier than the estimated time, it is assumed that data collection occurred without interruption, and the calculated time is entered into the processed data file as the time of the first data point in the raw file.
When the calculated time does not correspond appropriately with the time in the current filename, the only time information available for that data file is the bin timestamp in the filename. In that case, the filename time is written to the processed data file as the sample time of the first data point, and the timestamp in the processed file is marked with a trailing plus sign ("+") to indicate that the time has been estimated and is inaccurate by up to -5 seconds.
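The decision rule described in the preceding two paragraphs can be summarized as follows. This is a sketch of the logic only; the function name, the dictionary representation of the ancillary file, and the use of epoch seconds are assumptions for illustration.

```python
SAMPLE_INTERVAL = 5  # seconds between analog samples

def first_sample_time(filename_epoch, prev_start, prev_npoints, ancillary):
    """Decide the timestamp of a raw analog file's first data point.

    filename_epoch : epochal bin time parsed from the raw filename
    prev_start     : first-sample time of the preceding file (epoch seconds)
    prev_npoints   : number of samples in the preceding file
    ancillary      : {filename_epoch: actual first-sample time} from the
                     day's ancillary file (may be incomplete)

    Returns (time, estimated_flag).  Estimated times are written to the
    processed file with a trailing '+' and may lead truth by up to 5 s.
    """
    # Preferred source: the ancillary file's explicit first-sample time.
    if filename_epoch in ancillary:
        return ancillary[filename_epoch], False

    # Otherwise extrapolate from the previous file's start and length.
    calculated = prev_start + prev_npoints * SAMPLE_INTERVAL

    # If the filename bin begins 0-5 s (one sampling interval) before the
    # extrapolated time, sampling is assumed continuous across files.
    if 0 <= calculated - filename_epoch <= SAMPLE_INTERVAL:
        return calculated, False

    # Otherwise a gap exists: fall back on the bin time itself, flagged.
    return filename_epoch, True
```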
Use of the filename timestamp introduces a degree of time uncertainty that causes the phase of the data of that individual file to lead the true phase by some portion of the sampling interval (5 seconds). Since each data file is nominally one hour in duration, each hour's worth of data potentially has an independent degree of such phase shifting, ranging from none when the first sampling time is explicitly known (the predominant situation) up to 5 seconds if the first data value was sampled at the end of the bin described in the filename.
Editing procedures are similar to those for the dial-up digital data discussed above. The data are visually inspected. Data plots are examined for large magnitude differences between successive points, as well as general oddities and reference level shifts. Gaps in data coverage and `unusual' features in the data are documented for each station. This information appears in a file in each station's data directory(ies) in the ARSHSL, along with information about what outliers have been deleted and replaced with the code `NaN' (Not a Number).
The NOS 6-minute data are handled in a manner similar to the dial-up digital data with regard to inspection and documentation of bad points, gaps and `unusual' features. All downloaded data are visually inspected. Data plots are examined for large magnitude differences between successive points, as well as time differences, general oddities, and reference level shifts. Unfortunately, editing of outliers and gaps must proceed without the benefit of the redundant files available from the dial-up gauges. Irreparable gaps in data coverage and `unusual' features in the data are documented for each station. This information appears in a file in each station's data directory(ies) in the ARSHSL, along with information about what outliers have been deleted and replaced with the code `NaN' (Not a Number).
A unique problem was encountered and solved during editing of the NOS data. Occasionally, two differing sea level values had the same time stamp. We observed that the bad data value is the one accompanied by extra columns of calculated data (e.g., standard deviation) in the original UHSLC data files. In these instances, the data is edited by removal of the suspicious point(s). Sometimes a bad data value is present without a corresponding good value; editing it then leaves a gap in coverage. (All original files are retained intact for future reference.) The Mokuoloe data contained an excessive number of bad values; the person in charge of that station was notified that a hardware problem was suspected, and the situation was corrected quickly.
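The duplicate-stamp rule above can be sketched as a small de-duplication pass. The row layout (whitespace-split fields with the time stamp first) is an illustrative assumption about the UHSLC files, not their documented format.

```python
def dedup_rows(rows):
    """Resolve duplicate time stamps in parsed UHSLC-style rows.

    Each row is a list of fields whose first element is the time stamp.
    When two rows share a stamp, the bad value is the one carrying extra
    calculated columns (e.g., standard deviation), so the shorter row is
    kept.  The field layout here is illustrative only."""
    by_time = {}
    for row in rows:
        stamp = row[0]
        if stamp not in by_time or len(row) < len(by_time[stamp]):
            by_time[stamp] = row
    return [by_time[s] for s in sorted(by_time)]
```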
ARSHSL is organized as shown in Table 2. The first sub-division is by island, then by station entry in Table 1. Each station sub-directory has an `EditLog' file that lists the data format and units, gaps and unusual features (if any) that have been noticed (and sometimes corrected, as noted in the EditLog file). The data within the station sub-directories is organized by year. Data acquisition for a few stations began in 1996, prior to the receipt of funding, but this data is not included in the archive. Within each year, the data are organized into files containing one day of data each. All data files are ascii format and consist of three columns of data. The first column is the year, the second is yearday (i.e., January 1 is yearday 1), and the third column is sea level height in meters.
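A minimal reader for the daily three-column files described above might look like the following sketch. The handling of the -1 date markers and the trailing '+' on estimated analog timestamps follows the conventions described earlier in this report; the function name and return structure are assumptions.

```python
import math

def read_arshsl_day(path):
    """Parse one daily ARSHSL ASCII file: columns are year, yearday
    (January 1 = yearday 1), and sea level height in meters.  Edited
    outliers appear as the code 'NaN'; in processed analog files the
    date fields are -1 except for the first point of each raw file,
    and estimated timestamps carry a trailing '+'."""
    records = []
    with open(path) as f:
        for line in f:
            fields = line.split()
            if len(fields) < 3:
                continue  # skip blank or malformed lines
            year = int(fields[0])
            yearday = float(fields[1].rstrip('+'))  # strip estimate marker
            level = math.nan if fields[2] == 'NaN' else float(fields[2])
            records.append((year, yearday, level))
    return records
```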
ARSHSL is accessed via anonymous ftp to ilikai.soest.hawaii.edu; then go to directory /arshsl/. If data from this archive is used in any public presentation, report, paper, or other publication, please acknowledge the archive by referencing the report Luther et al. (1998), and please send the reference of your work to Dr. Doug Luther (firstname.lastname@example.org). Questions about the archive should also be directed to Dr. Doug Luther.
Contingent upon funding, this archiving activity will be continued into the future, but the number of stations is likely to grow and the mix of dial-up versus analog data will change. As part of its long-range plan, PTWC intends to install a gauge at Port Allen on the south shore of Kauai that will report data to PTWC via microwave link (i.e., analog data with 5-second sampling); NOS recently removed a gauge from that location. PTWC also plans to convert all its dial-up gauges to microwave transmission, to avoid the possibility that, in the event of a catastrophic tsunami, the gauges would be unreachable because the telephone lines were clogged by an excited populace.
The ARSHSL could not have been established, and cannot be maintained, without the generous support, in terms of time and facilities, of many people in government agencies and at the University of Hawaii. These people and agencies are acknowledged here. We are very grateful for their support.
The need for this archiving activity was first clarified by Dr. Eddie Bernard, Director of PMEL. He and Dr. Frank Gonzalez, also of PMEL, have provided essential advice and critical liaison with DARPA. This archiving activity was funded by DARPA as a sub-program of Dr. Bernard's Early Detection and Forecast of Tsunamis Project at PMEL. Dr. Doug Gage, NRaD program manager for the DARPA grant, provided strong encouragement and advice. Funding from DARPA was provided to the UH/SOEST through the UH/NOAA Joint Institute for Marine and Atmospheric Research, as part of the UH/NOAA Cooperative Agreement No. NA67RJ0154 (Amendment 8).
Dr. Mark Merrifield, Director of the UHSLC, loaned the use of his experienced personnel early in 1997 to get this activity underway. UHSLC provides the computer to which the sea level data is downloaded and archived, and provided gratis a computer and modem for establishing communication links to the Kauai gauges.
Dr. Mike Blackford, Director of PTWC when this project began, and Dr. Chip McCreery, the current Director of PTWC, have been especially supportive of this effort. They have permitted essential access to their many gauges, and have allowed a script on one of their computers to ftp their analog data to our computer every day. PTWC personnel, Dr. Robert Cessaro, Mr. Richard Nygard and Mr. Bruce Turner, helped establish, then debug, the data streams from the PTWC gauges to UH/SOEST. The Hawaii Civil Defense Agency (HCDA) provides funding for many of the PTWC gauges. We appreciate the interest in this project expressed by Mr. Roy Price, Vice-Director, and Mr. Brian Yanagi of HCDA.
At the Pacific Disaster Center, Mr. Richard Flagg, Mr. Joe Hoosty, Dr. Dexter Ishii and Mr. Frank Kish were all involved in establishing the communication link through one of PDC's computers to Maui's gauges. PDC has provided gratis our access to a local telephone line from PDC.
On the Big Island, Dr. Richard Crowe, Professor of Physics, University of Hawaii, Hilo, provided gratis his computer and phone line to access Hawaii's sea level gauges.
On Kauai, Mr. Tom Kajihara, Kauai Community College (KCC), helped us install a computer in the KCC computer laboratory, and provided gratis access to a local phone line.
ANAL - analog data
AQT - Aquatrak acoustic sea level height sensor
ARSHSL - Archive of Rapidly-Sampled Hawaiian Sea Level
BPR - bottom pressure recorder
BUB - bubbler sea level height gauge
DARPA - Defense Advanced Research Projects Agency
EDFTP - Early Detection and Forecast of Tsunamis Project
ENC - Handar encoder sea level height gauge
GOESW - Geostationary Operational Environmental Satellite - West
HCDA - Hawaii Civil Defense Agency
KCC - Kauai Community College
LARC - Limited Automatic Remote Collector
NGWLMS - Next Generation Water Level Measuring System
NOAA - National Oceanic and Atmospheric Administration
NOS - National Ocean Service
NRaD - Navy Research and Development Agency
PMEL - Pacific Marine Environmental Laboratory
PDC - Pacific Disaster Center
PRS - NOS pressure transducer gauge
PTWC - Pacific Tsunami Warning Center
SOEST - School of Ocean and Earth Science and Technology
UH - University of Hawaii
UHSLC - University of Hawaii Sea Level Center
Bernard, E.N., 1995: "Reducing tsunami hazards along U.S. coastlines," in Perspectives on Tsunami Hazard Reduction, 1996: Proceedings of the 1995 IUGG Tsunami Symposium, Kluwer Academic Publishers.
Bernard, E.N. and A.C. Vastano, 1977: Numerical computation of tsunami response for island systems. J. Phys. Oceanogr., 7, 389-395.
Eble, M. C., J. Newman, J. Wendland, B. Kilonsky, D. Luther, Y. Tanioka, M. Okada, and F. I. Gonzalez, 1997: The 10 June 1996 Andreanov tsunami database. NOAA Data Report ERL PMEL-64, 101 pp.
Filloux, J. H., D. S. Luther and A. D. Chave, 1991: Long-term seafloor measurements of water pressure: Normal modes and infragravity waves. XXth General Assembly, IUGG, Vienna, Austria, 11-24 August, 1991.
Gill, S. K., T. N. Mero, and B. B. Parker, 1992: "NOAA operational experience with acoustic sea level measurement," in Joint IAPSO-IOC Workshop on Sea Level Measurements and Quality Control, Paris, 12-13 October, 1992. UNESCO Intergovernmental Oceanographic Commission Workshop Report No. 81.
Herbers, T. H. C., S. Elgar and R. Guza, 1995: Generation and propagation of infra-gravity waves. J. Geophys. Res., 100, 24863-24872.
Houston, J. R., 1978: Interaction of tsunamis with the Hawaiian Islands calculated by a finite-element numerical model. J. Phys. Oceanogr., 8, 93-102.
Luther, D. S., S. R. Sweet, R. Whitmire, B. J. Kilonsky, and H. Dail, 1998: Archive of Rapidly-Sampled Hawaiian Sea Level (ARSHSL). SOEST Technical Report 98-01, 19 pp.
Milburn, H. B., A. I. Nakamura, and F. I. Gonzalez, 1996: Real-time tsunami reporting from the deep ocean. Proceedings of the Oceans 96 MTS/IEEE Conference, 23-26 September 1996, Ft. Lauderdale, FL, 390-394.
Wyrtki, K., K. Constantine, B. J. Kilonsky, G. Mitchum, B. Miyamoto, T. Murphy, S. Nakahara and P. Caldwell, 1988: The Pacific island sea level network. UH/NOAA JIMAR Contribution No. 88-0137.