Long-term Real-time Coastal Ocean Observation Networks

Scott M. Glenn1, Tommy D. Dickey2, Bruce Parker3 and William Boicourt4

1Institute of Marine and Coastal Sciences
Rutgers University
New Brunswick, NJ 08901-8521

2Ocean Physics Laboratory
University of California Santa Barbara
Goleta, CA 93117

3Coast Survey Development Laboratory
National Ocean Service, NOAA
Silver Spring, MD 20910

4Horn Point Environmental Laboratory
University of Maryland
Cambridge, MD 21613
 

October 22, 1999


  1. Introduction
Oceanographers are well acquainted with the challenges of working in an undersampled ocean. Observations are often sparse, difficult or expensive to acquire, and may not even be available to sea-going scientists until they have physically reached their study site by boat.  Much is often left to chance if the scientist's interests lie in the study of episodic events that may be short-lived in time and distributed in space. At the other end of the spectrum, scientists studying long-term trends, such as the coastal and estuarine response to global climate change or local human influences, must be able to separate natural variability from anthropogenic effects. This can only be accomplished through the analysis of long-term time series of key parameters obtained from permanent observation stations.

Our ability to both capture short-lived episodic events and resolve long-term trends in the coastal ocean is rapidly improving through technological advances in sensors and observation platforms envisioned in the early 1990s and brought to fruition later in the decade.  Observation networks consisting of remote sensing, stationary, movable and drifting platforms are being assembled throughout the country. Modern communication systems provide a means for the platforms to report their observations in real time, and the World Wide Web provides a means for widespread instantaneous distribution. The use of numerical models to assimilate diverse datasets and forecast forward in time is now well accepted.  The combined use of real-time observations and model forecasts to improve observational efficiency has spawned the new field of adaptive sampling. Emerging partnerships between scientists and engineers from academia, government and private industry are tackling the new developmental challenges that now include autonomous platforms, system integration, and automated response scenarios.
In this paper, we first discuss the current rationale for coastal observing systems, including both real-time and long-term applications.  We then focus on the enabling technologies prompting their rapid proliferation, and on the new sensors and platforms on the near horizon that will improve their capabilities.  We conclude with a common set of problems and limitations, and with recommendations for the future.
 

2. Rationale for Coastal Observation Networks

Permanent, continuously operating, real-time coastal ocean observing networks have applications that tend to fall into two categories, either to support immediate needs or to contribute to a historical database.  Some data and data products are needed in real-time, either for direct use by a variety of coastal decision makers requiring information on present conditions, or as input to nowcast/forecast modeling systems that provide these same decision makers with predictions.  Historical data needs include long observational time series to understand natural variability and separate it from anthropogenic effects, or to provide the framework for other shorter-term observations.
 

2a. Real-time observation and forecasting applications

Forecasting systems are composed of observation networks and dynamical models linked by data assimilation schemes. Observation networks can acquire numerous diverse datasets in real time, but sensors alone cannot continuously sample the full 3-D volume for all variables at the multitude of oceanic time and space scales that exist. Data assimilation schemes can constrain a dynamical model with the real-time observations, enabling the ocean model to produce a hindcast or nowcast in which the observations are interpolated to finer space and time scales.  This requires the observations to first be transformed into the same state variables used by the model, a potentially heavy constraint on the observation networks.  (For example, a dynamical model designed to forecast sub-inertial frequency flow fields cannot directly assimilate ocean current observations that may also include contributions from surface and internal waves, tides, and inertial waves.)  The dynamical models also forecast forward in time, and ensemble forecasts can provide estimates of the error fields associated with the predictions. Ensemble forecasts potentially include not only sensitivities to initial conditions, but also sensitivities to predicted boundary forcing or internal model dynamics (Robinson and Glenn, 1999).  In contrast to the deep ocean, coastal forecasts rely heavily on forecast meteorological fields from weather models, several of which may be available.
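
As an illustration of this constraint, the short sketch below low-pass filters an hourly current record so that only the sub-inertial signal remains before it is handed to a model of the kind described above. The 40-hour running-mean window, the synthetic M2 tidal signal, and the function name are illustrative assumptions only, not part of any particular forecasting system.

    # Minimal sketch: low-pass filter hourly current observations so that only
    # the sub-inertial signal is passed to a model that forecasts sub-inertial
    # flow.  The 40-hour window and helper name are illustrative assumptions.
    import numpy as np

    def subinertial(u_hourly, window_hours=40):
        """Running-mean low-pass filter of an hourly current record (m/s)."""
        kernel = np.ones(window_hours) / window_hours
        return np.convolve(u_hourly, kernel, mode="same")

    # Example: a 2 cm/s mean flow hidden under a 30 cm/s M2 tidal current.
    t = np.arange(0, 30 * 24)                      # 30 days of hourly samples
    u_obs = 0.02 + 0.30 * np.sin(2 * np.pi * t / 12.42)
    u_model_ready = subinertial(u_obs)             # suitable for assimilation
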
Real-time observation and forecasting systems have the potential to support numerous activities in the coastal environment, including the following:
 

Safe and efficient navigation and marine operations.

The increasing drafts of oil tankers, cargo and container ships, some of which are restricted to entering and leaving depth-limited U.S. ports at times of high water, illustrate the need for real-time water level data as a more accurate substitute for astronomical tide predictions in areas where wind and river discharge effects are significant.  Real-time current observations in ports, instead of tidal current predictions, are required for critical ship maneuvering situations such as docking, turning, and determining the right of way between two ships approaching each other. Real-time density information for ports with varying river discharge is important for accurate predictions of a ship's static draft.  The maritime community and its customers also need short-term water level forecasts to know how much cargo they can load, or when to leave port, instead of astronomical tide predictions from national tables, which do not include important wind, pressure, and river effects.
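
The value of a real-time gauge over a tide table can be seen in a simple under-keel clearance calculation of the kind a pilot or port operator might make. The sketch below uses entirely hypothetical depths, drafts, and water levels; it is not data from any actual port.

    # Illustrative under-keel clearance calculation; the depths, drafts, and
    # water levels below are hypothetical numbers, not data from any port.
    def under_keel_clearance(charted_depth_m, water_level_m, draft_m):
        """Clearance (m) = charted depth + water level above chart datum - draft."""
        return charted_depth_m + water_level_m - draft_m

    charted_depth = 13.5       # dredged channel depth below chart datum (m)
    draft = 14.0               # loaded tanker draft (m)

    astronomical_tide = 1.2    # predicted high water from tide tables (m)
    observed_level = 0.4       # real-time gauge: wind has held the tide down (m)

    print(under_keel_clearance(charted_depth, astronomical_tide, draft))  # ~ +0.7 m
    print(under_keel_clearance(charted_depth, observed_level, draft))     # ~ -0.1 m: unsafe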

 

Efficient oil and hazardous material spill trajectory prediction and cleanup.

When a maritime accident leads to a hazardous spill, real-time and forecast current information can provide more accurate predictions of where the spill will be transported so that the most efficient cleanup strategy can be initiated.  In this case, 2-D observed or modeled surface current fields become especially important, since they can also be used to define convergence zones where floating materials tend to accumulate. 
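
A minimal sketch of the trajectory calculation follows: spill particles are advected through a 2-D surface current field with simple forward-Euler time stepping. The uniform grid, the nearest-node current lookup, and the constant current field are illustrative assumptions; an operational system would use observed or forecast fields and a more careful integration scheme.

    # Minimal sketch of spill-trajectory prediction: forward-Euler advection of
    # surface particles through a gridded 2-D current field.  The grid and the
    # field itself are illustrative assumptions only.
    import numpy as np

    nx, ny, dx = 50, 50, 1000.0              # 50 km x 50 km grid, 1 km spacing
    u = np.full((ny, nx), 0.20)              # eastward surface current (m/s)
    v = np.full((ny, nx), 0.05)              # northward surface current (m/s)

    def advect(px, py, u, v, dt=600.0, nsteps=144):
        """Advance particle positions (meters) for nsteps of dt seconds."""
        for _ in range(nsteps):              # 144 x 600 s = 24 hours
            i = np.clip((py / dx).astype(int), 0, ny - 1)
            j = np.clip((px / dx).astype(int), 0, nx - 1)
            px = px + u[i, j] * dt
            py = py + v[i, j] * dt
        return px, py

    # Ten particles released at the spill site (20 km, 25 km)
    px = np.full(10, 20000.0)
    py = np.full(10, 25000.0)
    px24, py24 = advect(px, py, u, v)        # positions after 24 hours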

Monitoring, predicting and mitigating coastal hazards.

Real-time water level gauges have been used for many years to monitor the growth of storm surge as part of coastal warning systems. Gauges modified to recognize rapid changes in water level have also been part of tsunami warning systems. Real-time water level data are further used to initialize storm surge forecast models, which may involve assimilation over a period of time prior to the forecast period.

 

Military operations.

The strategic objectives of the naval oceanographic community are to provide the environmental information necessary for the safety of day-to-day operations and, if required, to support the warfighter. Safe naval operations anywhere, even along our own coast, depend on local value-added observations to supplement larger scale predictive models. Warfighter support depends on the development of new methodologies for using real-time remote sensing and in situ data for rapid environmental assessment in denied areas. The existing observational and predictive infrastructure available along the U.S. coasts enables the Navy to test new sensors, platforms, models, and sampling techniques in logistically simple situations before deployment in less favorable situations.

 

Search and Rescue.

Search and Rescue (SAR) is one of the Coast Guard's oldest missions.  Approximately 95% of their SAR responses occur within 20 nautical miles of the coast, with 20% lasting longer than 24 hours.  Because of the urgency of SAR, ongoing real-time observations and short-term forecasts for the coastal ocean would help reduce the search time, resulting in more lives saved, reduced costs, and fewer Coast Guard personnel placed at risk.

 

Prediction of harmful algal blooms, hypoxic conditions, and other ecosystem or water quality phenomena.

Physical models, and physical models coupled to water quality or ecosystem models, are beginning to be used for other purposes, where the need for real-time data becomes much broader, in some cases involving parameters that are still not easily measured in situ or remotely on a continuing basis. Though harmful algal blooms (HABs) are often the result of increased nutrients, physical parameters such as water temperature, salinity, currents, and waves affect stratification, mixing, and transport, and thus can play a role in triggering a bloom, transporting it, or dissipating it.  Forecast model systems similarly could predict the onset of stratification or the concentration of phytoplankton that can contribute to anoxic conditions in bottom waters. Since currents in a bay can flush out pollutants, stir up and transport sediments (and attached pollutants), and move larvae and juveniles out of and into estuaries, there will be a number of other environmental and ecosystem applications for the nowcast/forecast current fields. 
 

Scientific Research.

Real-time observation networks can now define a 3-D well-sampled research space in which the scientist can operate. If coupled to a numerical model, sampling programs can take further advantage of the additional guidance provided by model generated nowcasts and forecasts. Experiments conducted in a well-sampled ocean are more efficient, since the timing and location of the processes of interest are known. This is especially critical for interdisciplinary adaptive sampling, since many chemical and biological samples are still acquired and analyzed by hand.

 

2b. Needs for Long-term Continuous Consistently-Calibrated Data Time Series

Permanent, continuously operating, coastal observation systems contribute to, and may be part of, the Global Ocean Observing System (GOOS) for climate studies and prediction (e.g., coastal water level gauge networks), living marine resources, and health of the ocean. Shorter-term synoptic data studies (whether done randomly or on a regular basis) cannot be fully understood or correctly utilized without long-term "reference" data time series to compare against. Short-term increases in a particular water quality or ecological parameter may be thought to be solely due to anthropogenic causes, when in fact a longer data record correlated with other long time series of physical oceanographic and meteorological parameters may point to other natural factors. Equally important, long continuous time series allow one to average out higher-frequency variations that can bias the results of randomly sampled parameters. For these situations there is still a need for real-time data, but primarily for quality control purposes, so that sensor malfunctions can be repaired as quickly as possible to minimize gaps in the data time series. Another important aspect of quality control for these long-term applications is the maintenance of consistent sensor calibrations over the entire record.
The above constraints may apply to sampling programs like NOAA's Status and Trends and EPA's EMAP program. Long-term continuous monitoring of important physical, meteorological, biological, and chemical parameters could lead to more accurate regional and national assessments of trends in water quality, healthy habitats and ecosystems, as well as beach erosion, bathymetric changes, etc., and their connection to anthropogenic causes. Natural changes in flushing due to storms or changing tidal conditions could make water quality problems (due to sewage treatment or non-point source pollution) seem better or worse depending on when randomly sampled data are taken, unless there are long-term, nearly continuous data series for comparison.

3. Enabling Technologies

3a. Rapid expansion of sensors, systems, and platforms

Many new sensors and systems are now available for deployment from a variety of ocean platforms including ships, moorings, bottom tripods, drifters, floats, autonomous underwater vehicles (AUVs), and offshore platforms (e.g., Dickey, 1991; Dickey et al., 1998). The rapid growth of enabling technologies has its origins in partnerships formed between academia, private industry, and government laboratories. In particular, international programs such as the Bermuda Testbed Mooring (BTM; Dickey et al., 1997), MBARI (Chavez et al., 1997), and LEO-15 (Grassle et al., 1998) have facilitated both technological and fundamental research, including model development. The assortment of in situ platforms is complemented with satellite, aircraft and shore-based remote sensing systems. Altogether, these can be used to sample time and space scales which span about 10 orders of magnitude. The collective sensors and systems can be used to measure many key environmental variables needed to describe and model the physics, chemistry, and biology of the world oceans. Several of these are illustrated for the BTM in Figures 1 and 2.  The BTM has been used by the oceanographic community since mid-1994 (Dickey et al., 1998).  Fundamental breakthroughs, particularly in chemical, optical, and acoustical technologies, enable monitoring of critical parameters which can document both natural and anthropogenically induced changes.
Remote sensing of the physical, and to a more limited degree, biological variability of the upper ocean via satellites and aircraft has stimulated new insights concerning processes of the upper ocean. This technique is increasingly used as a quantitative tool to diagnose and predict the physical and biological states of the upper layer as well. Unfortunately, remote sensing of chemical species is far more difficult and at this point virtually intractable. Further, acquisition of subsurface chemical and biological data from space is even more difficult. Thus, in situ observations remain extremely important for biological and chemical oceanographic problems. Ships have served our community well; however, their limitations in terms of cost, availability, poor synoptic sampling, sample degradation and contamination, etc., have forced utilization of other platforms as well. Several new platforms can now utilize bio-optical, chemical, and acoustic sensors or systems as well as physical measurement devices. The ranges of temporal and spatial scales covered by the various platforms have been well documented (e.g., Dickey, 1991). For example, moorings can provide high temporal resolution, long-term measurements; drifters and floats may be used to provide spatial data by effectively following water parcels (Lagrangian); and AUVs can provide excellent spatial data and be programmed for special sampling regimens, even in response mode. Fixed offshore structures and platforms appear to hold great promise for many applications (e.g., HF radars, ADCPs, acoustic systems). Specialized studies will likely continue to need manned submersibles and remotely operated vehicles (ROVs), from which many of the interdisciplinary sensors and systems described here may be deployed. Clearly, several different in situ and remote platforms are necessary to adequately describe and quantify the myriad of ocean processes.
An increasing number of bio-optical, chemical, acoustic and laser sensors and systems are being deployed in situ from ships using towed packages, moorings, bottom tripods, drifters, floats, autonomous underwater vehicles (AUVs), and offshore platforms. Considerations for optical, chemical, and acoustic sensors and systems include response time, size, power requirements, data storage and telemetry, durability, reliability, stability/drift, and susceptibility to biofouling.
The types of in situ physical data that can be collected from these platforms include temperature, salinity, bottom pressure, currents and suspended particle size distributions. Optical measurement capabilities have been expanding very rapidly and now include spectral (multi-wavelength) radiance, irradiance, attenuation, absorption, and fluorescence. These latter measurements are important for determination of phytoplankton absorption characteristics, biomass and productivity (potentially species identification), water clarity and visibility, and sediment resuspension and transport. In addition, optical devices are being used to estimate zooplankton biomass and size distributions, in some cases with species identification capacity. Likewise, major efforts are underway to measure chemical concentrations (e.g., Tokar and Dickey, 1999) with applications such as water pollution (e.g., DDT, PCBs, etc.) and eutrophication (dissolved oxygen and nutrients), nutrients (major and trace) for primary productivity, global climate change (carbon dioxide and oxygen), and hydrothermal vents (oxygen, pH, Fe2+, Mn2+, and H2S).  Multi-frequency acoustics are being used to estimate biomass and size distributions of zooplankton.
 


3b. Real-time Communications

The shorter transmission lengths afforded by the scales of the coastal ocean offer a considerably wider range of data transfer modes than are available to observing systems in the open ocean. In some cases, the distances are sufficiently short and the bandwidth requirements sufficiently high that direct connection via fiber-optic or coaxial cable, and delivery of operating power by cable, are warranted. An example of such a system is LEO-15, where power and data are linked directly via buried cables from the offshore observatory to the shore. Such a system can deliver elevated power levels and large transmission bandwidths. Another advantage is that no large buoy is needed at the offshore sensor location to power an onboard radio. The obvious tradeoff for power and bandwidth is the cost of cabling.
UHF and VHF line-of-sight transmission can be considerably less expensive. However, line-of-sight distances may require the construction of a shore tower or rental of space on a commercial tower. In addition to the tower, a shore station with telephone access or a radio relay is necessary for transmission to the central database. Large bandwidths are possible with line-of-sight radio, but power requirements at the sensor location may limit these rates. With UHF and VHF radio, an FCC license is required. Bureaucratic slowness can delay implementation and limit flexibility. The new spread-spectrum radios have the advantages of high bandwidth with no license requirement, but at frequencies that are easily attenuated by vegetation. In this case, line-of-sight is more literal than in the UHF and VHF case.  Single-sideband HF radio does not require line-of-sight or satellites, but instead bounces its signals off the ionosphere to ground receiving stations.  Global coverage is currently available, with the most prevalent applications at this time being the delivery of email and weather forecasts to ships at sea.
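
The dependence of line-of-sight range on antenna height can be estimated with the standard 4/3-earth radio-horizon approximation, d(km) ~ 4.12 (sqrt(h1) + sqrt(h2)) with heights in meters; the buoy and tower heights in the sketch below are hypothetical.

    # Rough line-of-sight range estimate using the standard 4/3-earth radio
    # horizon approximation.  The buoy and tower heights are illustrative.
    from math import sqrt

    def radio_horizon_km(h1_m, h2_m):
        return 4.12 * (sqrt(h1_m) + sqrt(h2_m))

    buoy_antenna = 3.0      # antenna a few meters above the sea surface (m)
    shore_tower = 30.0      # shore tower height (m)
    print(radio_horizon_km(buoy_antenna, shore_tower))   # roughly 30 km
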
Cellular phone telemetry (e.g., Dickey et al., 1993) eliminates both licensing and the need to establish shore receiving facilities. Operating costs can be higher than for line-of-sight radio, even with the new CDPD data-transmission technology. Although a careful survey of users has not been conducted, there have been questions as to the reliability of cellular phone telemetry. Coverage is also an issue, which may become less of a concern with the new but unproven low earth orbit communication satellite systems. Traditional ARGOS and GOES satellite transmissions have proved reliable, although in the case of ARGOS, expensive. An additional advantage of ARGOS is the positioning that it provides, whether for fixed or drifting buoys. For fixed buoys, ARGOS Service provides an alarm for buoys separated from their moorings. A possible disadvantage for GOES satellite service is data delay. For some observational and forecasting needs, a possible 3-hr data delay may be too long.
Local communication among sensors, processors, and data-transmission devices at the site of measurement may incur difficulties, especially for cabling through the water column or along the bottom. Acoustic telemetry has been used effectively for years, although until recently, bandwidths could be severely limiting. Acoustic telemetry is expensive and must be interfaced to underwater sensors. Furthermore, if multiple sensors are deployed at one depth, a customized interface must be developed if multiple telemetry units are to be avoided. In the nearshore or estuary, stratification and topography can create sufficient acoustic multipaths that higher-end telemetry is necessary for successful transmission.

 

3c. Universal Acceptance of the World Wide Web

Every day we hear of new ways in which the Internet and the World Wide Web have transformed some aspect of our lives. We use the World Wide Web at work, in our schools, and at home. As scientists, we communicate our thoughts and results with our colleagues around the world as if they were in the lab next door. The ease of modern communications has fostered the now common formation of nationally distributed science teams for research projects. Ocean observation networks are included on the long list of transformations. Early ocean observatories were controlled in a central location, and real-time information rarely traveled beyond the control facility. Real-time data dissemination required specialized equipment or software not commonly available.
The World Wide Web provides a standardized, instantaneous, and widespread information distribution system. It requires no specialized equipment, only a simple PC and a telephone connection, making it possible to reach not only other scientists, but also the broader educational community and the general public. Availability of a distribution system has prompted the development of automated processing and visualization algorithms to construct real-time products for display on the Web. To this, we add the ability to control sensor systems remotely over the Internet, a capability now being built into modern observation systems. Widespread real-time product distribution and remote control capabilities promote the formation of distributed observation networks, where different groups or agencies are responsible for the individual systems that make up a network. In the near future, fully integrated distributed observation systems communicating over the Internet could use events detected in one part of the network to automatically trigger responses in a different part of the network.
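
One way such cross-network triggering might look is sketched below: when a record fetched from one node crosses a threshold, a command is posted over the Internet to a second node. The URLs, the threshold, and the command interface are invented for illustration and do not describe any existing observatory.

    # Hypothetical sketch of automated cross-network triggering: when one node's
    # real-time record crosses a threshold, a request is sent over the Internet
    # asking another node to switch to a higher sampling rate.  The URLs,
    # threshold, and command interface are invented for illustration only.
    import json
    import urllib.request

    UPWELLING_THRESHOLD_C = 5.0   # hypothetical drop in bottom temperature (deg C)

    def check_and_trigger(obs_url, control_url):
        with urllib.request.urlopen(obs_url) as resp:
            record = json.load(resp)             # e.g., {"temp_drop_c": 6.2}
        if record["temp_drop_c"] > UPWELLING_THRESHOLD_C:
            cmd = json.dumps({"sampling": "rapid"}).encode()
            req = urllib.request.Request(control_url, data=cmd,
                                         headers={"Content-Type": "application/json"})
            urllib.request.urlopen(req)          # second node responds automatically

    # check_and_trigger("http://node-a.example.edu/latest.json",
    #                   "http://node-b.example.edu/control")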
 

4. Future Sensors and Platforms

An increasing number of satellite-based ocean color imagers will likely be available in the next decade. Of special interest, the Navy's NEMO satellite, carrying the Coastal Ocean Imaging Spectrometer (COIS) and scheduled for launch in 2000, will acquire high spectral resolution (~10 nm) measurements with spatial resolution down to 30 m (1 m panchromatic) in selected coastal regions. NEMO, similar future satellites (e.g., EOS AM-1 (MODIS), IRS-P4 (OCM), ADEOS-II (GLI), HY-1 (COCTS)) and aircraft should revolutionize how we observe the coastal ocean's optical and biological properties. Just as direct-broadcast AVHRR and SeaWiFS data are acquired today by numerous ground stations worldwide, the next generation of high-resolution, hyperspectral satellites will require the proliferation of X-band satellite receiving stations to acquire the real-time full-resolution data. Data recorded on board these satellites and later downlinked to a central receiving station are both delayed in time and degraded in resolution.
Remote sensing aircraft, both piloted and autonomous, are under-utilized for adaptive sampling.  Existing coastal applications for aircraft include the observation of freshwater plumes using the Microwave Salinity Mapper, and the observation of ocean color with hyperspectral sensors.  An emerging aircraft application is the observation of coastal sea levels with altimeters.  The role of altimeters switches from the observation of sea surface height differences associated with geostrophic currents in deep water to monitoring tidal elevations in shallow water. Satellite orbits with their long repeat intervals and wide groundtrack spacing are not well suited for the rapid observation of spatially varying tides in shallow water, but an aircraft-based altimeter could adaptively sample a critical transect several times over a semi-diurnal tidal period. Two new types of satellite altimeters that are also suitable for aircraft applications are the Delay Doppler Altimeter (DDA) and the bistatic GPS altimeter.  The DDA (Figure 3) uses a phased array to measure sea surface height, wave height and wind speed to within 1 km of the coast with a 250 m along-track and 1000 m cross-track resolution. The GPS altimeter is based on the observation of GPS signals reflected from the ocean surface.
Currents vary considerably in space, both vertically and horizontally, so that measurements at one point can be inadequate. Currents are greatly affected by bathymetry, which itself can change due to shoaling, dredging, and other causes.  HF-Radars are becoming an accepted technology for synoptic observations of surface current fields. HF-Radar systems are constructed either as phased arrays (two lines of antennas (send and receive) set up along a beach) or direction finding (a broadcast monopole and a cross-loop receiver). Phased arrays originally were single frequency (e.g., OSCR), but new four-frequency systems are being used with the ultimate goal of estimating current shear. Dual-frequency microwave radar systems are also being tested for high resolution applications within bays and harbors. The direction finding CODAR HF-Radar systems are configured for several range/resolution combinations. A long-range version is capable of reaching up to 300 km offshore, and a high-resolution version has been tested in San Francisco Bay where every 20 minutes it could generate a 100 m resolution surface current map across a 4 km wide heavy shipping area. Two factors contributing to the ever-widening use of HF-Radar systems are the growing number of successful validation studies and the reductions in cost.
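
Behind any such surface current map is the standard step of combining radial components: each site measures only the component of the current along the bearing from the site to a grid point, and radials from two or more sites are combined, commonly by least squares, into a total vector. A minimal two-site sketch, with invented bearings and radial speeds, is given below.

    # Minimal sketch of forming a total surface current vector from HF-radar
    # radial components at one grid point.  Bearings and radial speeds are
    # invented; positive radials are taken as flow away from each site.
    import numpy as np

    def total_vector(bearings_deg, radial_speeds):
        """Least-squares (u, v) from radial components measured along bearings."""
        theta = np.radians(bearings_deg)
        # unit vectors pointing from each site toward the grid point (east, north)
        A = np.column_stack([np.sin(theta), np.cos(theta)])
        uv, *_ = np.linalg.lstsq(A, np.asarray(radial_speeds), rcond=None)
        return uv                                           # (u, v), same units

    # Grid point seen at bearings of 45 and 135 degrees from two shore sites
    u, v = total_vector([45.0, 135.0], [0.30, 0.10])        # m/s
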
Any present HF-Radar system, however, is limited in its ability to observe the near-shore, due to interference by land, and offshore, by the power required to increase the signal-to-noise ratio. Increasing the number of HF-Radar systems alongshore simply broadens the alongshore coverage by the system spacing, without improving coverage inshore or offshore. CODAR HF-Radar observations can be extended in all directions using a proposed bistatic array in which a second omni-directional transmitter is deployed offshore on a buoy, and the motion sensitive receiver remains on land. The resulting elliptical coordinate system retrieves current speeds along hyperbolas that extend farther offshore, alongshore, and all the way into the coast (Figure 4). A bistatic HF-Radar system is especially well suited to monitor the flow in and out of inlets, where shore-based systems may only provide estimates of flow across the inlet, and in heavily used ports, where simple transmitters can be placed out of the way on building roofs or bridge tops.
Horizontal acoustic Doppler current profilers (H-ADCPs), when they are sufficiently developed, will be an excellent way to observe high-resolution current fields in real-time. A sufficiently narrow acoustic beam that can reach far enough out from a shore site without bottom or surface effects can be swept over an area, and can thus measure current shears and eddies. Such measurements are not limited to the near surface as with radar. Maintenance of an H-ADCP mounted on a pier or other shore site will be much less expensive than that for an upward looking ADCP installed on the bottom in the middle of a harbor and connected to shore by cable or some other method. That is reason enough to push for faster development of H-ADCPs (which means primarily finding reasonably inexpensive ways to produce narrow beams).
Offshore water level measurement has in the past been accomplished using bottom pressure sensors, but this also required measurements of water density over the water column. Such measurements could not be referenced to any vertical datum. Since the sensors sit on the bottom far from shore, real-time continuous operation and maintenance are extremely difficult and expensive. A better alternative is to use real-time kinematic (RTK) analysis of differential GPS data from buoys to measure water level. One immediate advantage is that the measurements are made relative to a reference datum (the ellipsoid). Real-time communication and maintenance should be simpler and less expensive, and one should be able to take advantage of "buoys of opportunity". Problems presently being addressed include: large power requirements, handling buoy tilt (due to waves) and buoy drawdown (due to currents), and accuracies related to the distance from the nearest continuously operating GPS reference station. Another important application of RTK-GPS, although not for permanent real-time applications, is the measurement of water levels over an area using GPS on a ship moving in transects across a bay. This has applications for verifying and calibrating numerical hydrodynamic models (which typically have had only data from shore tide gauges) and in support of the hydrographic surveys that obtain depth soundings for nautical charts.
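
A minimal sketch of the height reduction is given below: the ellipsoidal height of the water surface is obtained by subtracting the antenna's height above the waterline, shortened by buoy tilt, from the RTK solution for the antenna. The offsets, tilt, and heights are invented, and drawdown and datum transformations are ignored here.

    # Illustrative reduction of RTK-GPS buoy data to water level: subtract the
    # vertical projection of the antenna mast (shortened by tilt) from the
    # ellipsoidal height of the antenna.  All values are invented; drawdown and
    # datum transformations are ignored in this sketch.
    from math import cos, radians

    def water_level(antenna_ellipsoid_height_m, antenna_offset_m, tilt_deg):
        """Water level above the ellipsoid (m), corrected for mast tilt."""
        return antenna_ellipsoid_height_m - antenna_offset_m * cos(radians(tilt_deg))

    print(water_level(-32.40, 3.00, 8.0))   # ellipsoidal heights can be negative
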
A variety of new in situ platforms have been or are being developed. Most mooring activities utilize instrument packages at fixed depths. However, some measurement programs have utilized moored autonomous profilers. Winching mechanisms, motor driven "wire-crawlers", and programmed buoyancy modification devices are being used for this mode of sampling. An advantage of profilers is that they can provide excellent vertical resolution; however, coarser temporal resolution is a drawback.
Several different autonomous underwater vehicle (AUV) designs are being actively pursued. These range from relatively inexpensive (virtually expendable) AUVs, which could carry moderate payloads of sensors, to more elaborate AUVs, which would be capable of carrying larger, more expensive instrumentation. A new class of AUVs are the gliders, which change their buoyancy and use wings to convert the vertical motion to horizontal (Figure 5). Typical glider horizontal speeds are on the order of 1 knot. In the deep ocean, buoyancy changes are created by phase changes of a material caused by the large temperature differences. In shallow water, the required buoyancy changes are quicker and larger, requiring an electric pump to increase or decrease the size of a buoyancy bladder. Gliders are designed for long duration, low power missions, where a precise path is not required (due to their low speed relative to potential currents). Glider AUVs nicely complement the short-duration missions of propeller driven AUVs that feature precise navigation and higher power payloads.
New sensors must be developed for particular parameters (especially biochemical) that so far have been difficult to measure in situ or remotely, thus preventing their inclusion in real-time continuously operating systems. These include bacteria, viruses, phytoplankton and zooplankton by amount and species, nutrients, spectral optical properties, contaminating chemicals, etc. There are several emerging optical, chemical, and acoustic sensors and systems that are beginning to be used for these purposes. Many of these are being designed for autonomous deployment. In particular, optical systems are being developed to increase the number of variables (e.g., volume scattering function, spectral excitation and emission parameters, etc.) and the spectral resolution is being improved (e.g., down to 1-2 nm in some cases). Special devices for more directly determining primary productivity (pump and probe type fluorometers) are also becoming available. More capable chemical sensors and systems are likewise increasing the suite of variables which can be measured autonomously (e.g., Tokar and Dickey, 1999). Examples include reagent-based and optically-based systems using colorimetry principles. Fiber optic chemical sensors have been used largely for shipboard measurements, but are beginning to be used for autonomous systems as well. Microelectromechanical Systems (MEMS) are a relatively new technology used for making and combining miniaturized mechanical and electronic components out of silicon wafers using micro-machining. MEMS have shown encouraging results for sensing physical parameters, but work is needed to realize their full potential for chemical sensing. Most work with MEMS has been done in laboratories; however, transitioning to in situ applications seems feasible. Potential advantages of MEMS include: auto-calibration, self-testing, digital compensation, small size, and economical production. Water samplers are also being developed to capture water for measurements of trace metals as well as radiocarbon-based primary productivity. Multi-frequency acoustical systems are becoming more accessible and will likely be improved in terms of spectral resolution and portability. Interpretation of acoustical as well as optical (optical plankton counter) zooplankton records is improving with new image identification capabilities.

5.  Limitations of Present Observation Networks

5a. Support

The technology for coastal observing systems has moved far beyond the feasibility and demonstration phase. A series of national workshops has developed a set of compelling justifications, presenting the promise of coastal ocean nowcasts and forecasts and the value of observing systems to research, monitoring, and education. The delivery of visualized, real-time information from the coastal ocean on the Internet has built support among a broad range of users of this information. Yet despite this apparently rosy state of affairs, the establishment of stable base support for these systems has, with a few exceptions, lagged. This lag is perhaps an expected aspect of the development phase of these programs. Many of the existing programs have been initiated within the research community, with sufficient funding to reach the demonstration stage. At this point, neither a parent agency nor the long-term funding has been identified to make the transition to full operational monitoring and forecasting. Long-term monitoring has always been difficult to sustain because, without long-term records already in hand, its value is not often clear to funding agencies, even when an active, continual analysis capability is built into the system. Furthermore, the initiators of these programs naturally stress the vision to establish support, but seldom have these systems evolved to the state where the envisioned coverage or products are at full operational delivery. If this gap between promise and reality is sufficiently large, there is risk of backlash in community support.
It is a classic chicken-or-egg dilemma: without the transition to operational mode, users cannot count on the continuity of either monitoring or forecasts, and therefore do not come to depend on either. Even if the transition is made, there is a natural lag time between delivery of a product and the acceptance and expansion of applications within the user community. Continuous, high-frequency information may be delivered to monitoring agencies, but analytical techniques are seldom in place at the outset to incorporate this information into traditional sampling programs.
Part of the difficulty in establishing support for these systems is that the initiators have typically built an observing system funding base as a house of many cards. The system is developed to demonstration stage via an assemblage of many smaller projects supported by a broad range of sources. Even if this assemblage can be made somewhat stable, the transition to operational mode is made difficult because no one agency feels ownership, especially after the fact, when the system's shape and identity have been developed without the participation of the agency. Most systems are multipurpose by design, to develop the widest user base for such a demanding undertaking. And often, these purposes have very different time scales of interest, ranging from nowcasts and short-term forecasts, to monitoring of long-term ecosystem change. Rationales and justifications are seldom so compelling that they speak uniformly to the entire range of users. Operational funding appears most stable in systems such as the Texas Automated Buoy System (TABS) or PORTS where the primary purposes and funding support are singular. TABS is funded by a single agency to enhance the ability to respond effectively to oil spills. PORTS was supported initially by NOAA, and then by maritime interests in the local ports.
Regardless of the reasons for the lag between the initiation of observing systems and the availability of operational support, it often leaves the systems in somewhat fragile states. Salaries must be maintained for personnel with a range of expertise, from mooring technicians to electronics technicians, to programmers skilled with visualization and web communication techniques. Ship time for maintenance cruises and instrumentation replacements are other substantial budget items. All too often, these costs are borne by scientific or development grants, which sometimes contribute a disproportionate amount of support to sustain the system. Invariably, the scientists leading the effort are diverted too far from science in search of sustaining funds. An aspect of this mode of funding, depending on the fortuitous overlap of numerous small projects contributing to the whole, is the large fluctuation in support level. This fluctuation leads to inefficiencies and sometimes to difficulty in retaining skilled and trained personnel.
 

5b. Instrument Calibration

For time series used to study long-term but potentially small environmental trends due to climate change or anthropogenic effects, it is very important that the instrumentation has maintained a consistent calibration over the entire record. Without this consistent calibration there is no way to determine that changes seen in the data are due solely to real-world changes and do not include changes in the sensor calibration. Likewise, differences seen in data from sensors at two different locations must not include differences in the calibration of the two sensors. It will be important to establish consistent calibration methods (and means of accessible documentation) for all sensors in a coastal GOOS. It will also be important to fund groups to assess the calibration of past data time series that will be used in conjunction with newer data in the determination of long-term trends and similar analyses.
An example where this issue has already been examined is sea level change based on data obtained from tide gauges over the past century. A critical consideration here was maintaining a consistent reference (of the water level data) to the land. For decades this was accomplished in the U.S. by the careful leveling of each tide gauge to bench marks (usually ten) installed in solid rock and other immovable objects. For the float-in-a-well type gauges used for decades, this was actually done by leveling from the bench marks to a tide staff next to the gauge, and having an observer make manual simultaneous observations from the staff to be compared with the tide gauge measurements. The modern acoustic water level gauges now being used by the National Ocean Service (NOS) and other groups allow for direct leveling from the bench marks to the end of the transducer. This brings up the second critical consideration, i.e., the comparison between the old and new method of water level measurement, and any possible difference that might have an effect on long-term trends obtained from analyzing a data time series whose first part came from the old system and whose second part came from the newer system. To minimize this problem, NOS ran the two systems simultaneously at all locations for up to several years. It also studied the possible long-term effects on the new acoustic system, such as temperature effects in the sounding tube.
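
One simple use of such a simultaneous-operation period is sketched below: the mean difference between the overlapping records is estimated and carried as a correction so that the joined series remains consistent. The short synthetic series and the simple mean offset are illustrative assumptions; an actual transition analysis would examine much longer records and possible tide-dependent differences.

    # Sketch of an overlap analysis between old and new water level gauges:
    # estimate the systematic offset from simultaneous records and apply it so
    # the joined series stays consistent.  The series below are synthetic.
    import numpy as np

    old_gauge = np.array([1.02, 1.31, 0.95, 1.18, 1.27])   # simultaneous hourly
    new_gauge = np.array([1.04, 1.34, 0.96, 1.21, 1.30])   # water levels (m)

    offset = np.mean(new_gauge - old_gauge)        # systematic difference (m)
    print(round(offset, 3))                        # ~0.024 m in this example

    # Later data from the new gauge, adjusted back to the old gauge's reference:
    adjusted = new_gauge - offset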

   

5c. Bio-fouling

Physical and acoustic systems typically have minimal problems in this area, as materials like copper-based paints can be utilized.  Unfortunately, these cannot be used with optical and chemical systems.  Conductivity sensors experience biofouling problems and even mechanical current meter rotors have been affected in extreme situations (e.g., barnacles). A large amount of work has been and is being done to find effective means and methods for reducing biofouling effects on optical and chemical sensors and systems. Smooth optical surfaces tend to foul more slowly than rougher surfaces. Liquid biocides have been found to be relatively effective, notably when allowed to reside inside optical tubes between samplings. Toxic tablets can also be released into these tubes. Darkness is also a good condition for biofouling reduction, so closure of optical (or chemical) sampling volumes is recommended. In the case of profiling devices, keeping sensors at depth between profiles is a good strategy. If chemicals (e.g., bromides) are used with optical systems, degradation of windows through discoloration can be problematic. Copper is a good material for reducing biofouling due to its toxicity for phytoplankton and is presently being used in a variety of ways. For example, copper screens can be used at inlets for flow-through type devices and copper-based shutters can be used for some optical (e.g., radiometers) and chemical (e.g., dissolved oxygen) devices. The body of experience of oceanographers doing autonomous sampling suggests that solutions to biofouling may be quite site-specific and even dependent upon time of year and specific oceanic conditions (e.g., El Niño, passages of eddies, etc.).
 

5d. Power

Offshore platforms are generally not limited by power, are very stable, and can be manned. However, they do present major measurement perturbation problems for observations of optical properties dependent on the ambient light field (apparent optical properties) and for many chemical measurements because of local contamination. Shipboard sampling is still critical for many measurements which cannot be done autonomously, and continues to provide excellent vertical (profile mode) and 3-D spatial (tow-yo) data. However, ship time is very expensive and ships cannot be used during intense weather and sea-state conditions, when very important processes are often occurring. Floats and drifters are often adequately powered for their payloads, but large numbers are generally needed to quantify processes and many optical, acoustic, and chemical sensors remain too expensive to be deployed from such expendable platforms. Moorings and bottom tripods minimize aliasing and undersampling, but are limited to local sampling, their expense restricts use to key selected locations, and their battery life can limit their sampling time.
Power remains a serious limitation for long-term autonomous systems, both stationary and moving, requiring expensive cables, limited solar power, or short-lived batteries. Larger buoys are often powered by solar cells, but most coastal applications require small buoys. Rechargeable batteries can save significant costs, but their power output is limited and recharging batteries inside a closed system can present risks. Higher-capacity lithium batteries can provide much more power, but are expensive, are single use, and can be hazardous, making shipping difficult. Fuel cells are now being considered as an alternative power source for long-term autonomous systems, including AUVs.
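
A back-of-the-envelope duty-cycle power budget illustrates how battery capacity and sampling schedule set deployment duration. All capacities, currents, and duty cycles below are invented round numbers, not specifications of any particular instrument or battery chemistry.

    # Back-of-the-envelope power budget for a small autonomous package: the
    # average current draw under a duty cycle sets how long a battery lasts.
    # All numbers are illustrative assumptions.
    def endurance_days(capacity_ah, sample_ma, sleep_ma, minutes_on_per_hour):
        duty = minutes_on_per_hour / 60.0
        avg_ma = duty * sample_ma + (1.0 - duty) * sleep_ma
        return capacity_ah * 1000.0 / avg_ma / 24.0

    alkaline_pack = endurance_days(30.0, sample_ma=250.0, sleep_ma=2.0,
                                   minutes_on_per_hour=6)
    lithium_pack = endurance_days(90.0, sample_ma=250.0, sleep_ma=2.0,
                                  minutes_on_per_hour=6)
    print(round(alkaline_pack), round(lithium_pack))   # roughly 47 vs 140 days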

 

5e. Data Management

As coastal observation systems continue to grow in complexity, management of the rapidly increasing number of diverse datasets and the associated metadata is a recognized concern. Numerous site-specific or even project-specific systems with varying degrees of sophistication are being constructed or expanded, but no single system has emerged as the preferred choice for coastal applications. Data management issues should be no more difficult for a coastal GOOS than for other major systems or global projects of the past years, but they will obviously take considerable effort. It is primarily a matter of setting up the automated procedures to bring the data at some interval to the relevant national data centers. These data should have been quality controlled as much as possible prior to being sent to the archives to minimize the efforts of the data centers. The accumulated historical archives should reside in those same national data centers. If this is not the case, resources and partners will be needed to find and quality control such historical data. This could conceivably be done on a regional basis with local and national partners helping the national data centers.
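
A minimal sketch of the kind of automated quality control that might be applied before records are forwarded to an archive is given below: a gross range check and a simple spike check, with flags carried alongside the data. The thresholds and flag values are illustrative assumptions only.

    # Minimal sketch of automated quality control applied before records are
    # forwarded to an archive: a gross range check and a simple spike check,
    # with flags carried alongside the data.  Thresholds are illustrative only.
    import numpy as np

    def qc_flags(values, valid_min, valid_max, spike_limit):
        values = np.asarray(values, dtype=float)
        flags = np.ones(values.size, dtype=int)                  # 1 = good
        flags[(values < valid_min) | (values > valid_max)] = 4   # 4 = bad: out of range
        jumps = np.abs(np.diff(values, prepend=values[0]))
        flags[(jumps > spike_limit) & (flags == 1)] = 3          # 3 = suspect: spike
        return flags

    temps_c = [12.1, 12.3, 25.8, 12.2, -9.9, 12.4]   # hourly water temperature (deg C)
    print(qc_flags(temps_c, valid_min=-2.0, valid_max=35.0, spike_limit=5.0))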

 

6. Recommendations

6a. Long-term Support for Long-term Measurements

The funding to ensure the permanent operation of coastal ocean sensors is a difficult problem. Even national systems run by the federal government, such as the National Water Level Observation Network operated by NOS/NOAA and the C-MAN and data buoy network operated by NWS/NOAA, have had funding problems. The operation of new real-time systems, such as NOAA's PORTS, depends on partnerships with state and local agencies. Partnerships will be the cornerstone of a coastal GOOS, but Congressional funding should be sought to ensure the maintenance and operation of a network of core stations considered most critical for the uses of a coastal GOOS. Such funding should include support for coordinated national standards, calibration, maintenance, quality control, and data archiving.

   

6b. Training a New Generation of Science Support Staff

Modern observation networks are being constructed through partnerships between scientists and engineers. Scientists themselves can no longer afford the time to be intimately familiar with the detailed workings of each and every instrument in these networks. There are too many systems to learn and to maintain. A new generation of Master's level science support staff, cross-trained in oceanography and computer science, electronics or engineering, is emerging to fill the gap.  Freed from the distractions of raising their own funding and pursuing academic tenure, these skilled technicians can concentrate on installing, operating and maintaining the numerous new systems presently available or soon to arrive. Their work is facilitated by instrument developers that provide easy-to-use interfaces enabling their sensors to be reprogrammed, adjusted, recalibrated, error checked, etc., either by a knowledgeable support person or via a central computer controlling a network. Public outreach to the K-12 community will promote interest in oceanography and responsible stewardship of the oceans, and the new technology will attract more students to the field. We should not make it our sole purpose to turn every graduate student we attract into a new Ph.D. Often it is the Master's level oceanography graduates with strong technical backgrounds that appear to be having the most fun.
 

6c. National Coordination Committee for Linking and Standardizing Observation Systems

The Internet and World Wide Web have made possible the linking of individual real-time observation systems maintained and operated by a variety of partners from federal and state agencies, academia, and the private sector. What is needed (besides the funding mentioned above) is a national committee that not only links the various web sites, but also coordinates national standards, calibration techniques, quality control procedures, data formats, website formats (for the data), and other issues that affect the integrated use of the data from all these different sites. Even now there are websites from which a user can be linked to a variety of sites providing real-time data for a particular region, but seeing these data displayed nicely on different websites is different from having easy access to all the data being displayed, so that they can be used in a model or for some other application.

 

 7. Summary

Coastal ocean observation networks are currently operating or are being constructed at numerous locations around the United States and in other nations. The rationale for their construction and maintenance includes both long-term and real-time applications. Enabling technologies that now make this possible include rapid advances in sensor and platform technologies, multiple real-time communication systems for transmitting the data, and the emergence of a universal method for the distribution of results via the World Wide Web.  Future sensors and platforms that will expand observation capabilities include new ocean color satellites, altimeters, HF-Radars, numerous moored instruments and autonomous vehicles.  A common set of limitations each network must address includes operational support, instrument calibration, bio-fouling, power requirements, and data management. Future recommendations include the training of a new generation of computer and field support personnel, and the development of partnerships and long-term support mechanisms to foster the development of national and international distributed observation networks.



Acknowledgements

Scott Glenn is supported by ONR, NOPP, NOAA/NURP and NSF, Bruce Parker by NOAA and NOPP, William Boicourt by NOPP, and Tommy Dickey by ONR, NSF, NASA and NOPP. The authors also thank Michael Crowley for his help in the preparation of this manuscript.


References

Chavez, F.P., J.T. Pennington, R. Herlein, H. Jannasch, G. Thurmond, and G.E. Friedrich, 1997: Moorings and drifters for real-time interdisciplinary oceanography, J. Atmos. and Ocean. Tech., 14, 1199-1211.

Dickey, T., 1991: The emergence of concurrent high-resolution physical and bio-optical measurements in the upper ocean, Reviews of Geophysics, 29, 383-413.

Dickey, T.D., R.H. Douglass, D. Manov, D. Bogucki, P.C. Walker, and P. Petrelis, 1993: An experiment in duplex communication with a multivariable moored system in coastal waters, Journal of Atmospheric and Oceanic Technology, 10, 637-644.

Dickey, T., D. Frye, H. Jannasch, E. Boyle, D. Manov, D. Sigurdson, J. McNeil, M. Stramska, A. Michaels, N. Nelson, D. Siegel, G. Chang, J. Wu, and A. Knap, 1998: Initial results from the Bermuda Testbed Mooring Program, Deep-Sea Res., 771-794.

Grassle, J.F., S.M. Glenn and C. von Alt, 1998: Ocean observing systems for marine habitats. OCC '98 Proceedings, Marine Technology Society, November,  567-570.

Kohut, J.T., S.M. Glenn and D.E. Barrick, 1999:  SeaSonde is integral to coastal flow model development, Hydro International, 3(3), 32-35.

Raney, R.K., 1998: The Delay/Doppler Radar Altimeter, IEEE Transactions Geoscience and Remote Sensing,  36(5), 1578-1588.

Robinson, A.R. and S. M. Glenn, 1999: Adaptive sampling for ocean forecasting, Naval Research Reviews,  51(2),  26-38.

Tokar, J. and T. Dickey, 1999: Chemical sensor technology: current and future applications, Chapter 13 in Chemical Sensors in Oceanography, ed. M. Varney, in press.

 

Figure Captions

Figure 1. Schematic diagram of the Bermuda Testbed Mooring (BTM) indicating several of the BTM instruments and the data telemetry system (after Dickey et al., 1998).

Figure 2. Illustrations of several bio-optical sampling systems which have been deployed from the BTM.

Figure 3. The Delay-Doppler Altimeter (DDA) is capable of measuring the sea surface elevation to within 1 km of the coast and is proposed for installation on both a constellation of satellites (as shown) and on aircraft platforms (after Raney, 1998).

Figure 4. Normal two-site coverage (blue) for a pair of coastal SeaSonde HF-Radars compared to the additional coverage (orange) available with an offshore pair of bistatic transmitters optimized to resolve the nearshore flows (after Kohut et al., 1999).

Figure 5. The Coastal Electric Glider (shown on the dock so that its wings are visible) being prepared for its first sea-trials at LEO-15.

 

Author Contact Information

Scott M. Glenn
Institute of Marine and Coastal Sciences
Rutgers University
71 Dudley Road
New Brunswick, NJ 08901-8521
732-932-6555 x544
732-932-1821 fax
glenn@caribbean.rutgers.edu
http://marine.rutgers.edu/cool

William Boicourt
Horn Point Environmental Laboratory
University of Maryland
P.O. Box 775
Cambridge, MD 21613
410-221-8426
boicourt@chessie2.hpl.umces.edu

Tommy D. Dickey
Ocean Physics Laboratory
University of California Santa Barbara
6487 Calle Real
Suite A
Goleta, CA 93117
(805) 893-7354
(805) 967-5704 fax
tommy@opl.ucsb.edu
http://www.opl.ucsb.edu

Bruce Parker
Coast Survey Development Laboratory
National Ocean Service, NOAA
N/CS1, SSMC 3, Room 7806
1315 East West Highway
Silver Spring, MD 20910
(301) 713-2801 x 121
(301) 713-4501 fax
Bruce.Parker@noaa.gov