Long-term Real-time Observation Networks
for Ports, Estuaries and the Open Shelf
Scott M. Glenn1, William Boicourt2
Tommy D. Dickey3 and Bruce Parker4
1Institute of Marine and Coastal Sciences
Rutgers University
2Horn Point Environmental Laboratory
University of Maryland
3Ocean Physics Laboratory
University of California Santa Barbara
4Coast Survey Development Laboratory
National Ocean Service, NOAA
July 3, 1999
Abstract
Ocean observation networks for ports, estuaries
and the open shelf are currently operating or are
being constructed at numerous locations around the country.
The rationale for their construction and maintenance
includes both long-term and real-time applications.
Enabling technologies that make this possible now
are the rapid advancements in sensor and platform
technologies, multiple real-time communication
systems for transmitting the data, and the emergence
of a universal method for the distribution of
results via the World Wide Web.
Representative observation networks highlighted here
include one for harbors (PORTS),
a second for estuaries (CBOS),
and a third for the open coast (LEO-15).
Each network is described in terms of its system-specific
goals, its current capabilities, and its recent
accomplishments.
Future sensors and platforms that will expand the
observation capabilities in all three regions
are described.
A common set of limitations each network must address
includes operational support, instrument
calibration, bio-fouling, power requirements, and data management.
Future recommendations include
the training of a new generation of computer and field
support personnel, and the development
of partnerships and long-term support mechanisms
to foster the formation of
a national distributed observation
network.
1. Introduction
Oceanographers are well acquainted with the challenges of
working in an undersampled ocean.
Observations are often sparse, difficult or expensive to acquire,
and may not even be available to the sea-going scientist until
they have physically reached their study site by boat.
This scenario leaves much to chance if the scientist's
interests lie in the study of episodic events which may be
short lived in time and distributed in space.
At the other end of the spectrum, scientists studying long-term trends
such as the coastal and estuarine response to climate and global change,
or the effects of mankind on the coast and estuaries,
must be able to separate natural variability from anthropogenic effects.
This can only be accomplished through the analysis of long-term time
series of key parameters obtained from permanent observation
stations, but such sites are few, and historical sites
may not have maintained the full suite of high-resolution
data now required.
While traditional ocean observing techniques may not be well matched
to the sampling plans necessary to capture
short-lived episodic events or resolve long-term trends,
technological advances in sensors and observation platforms
envisioned in the early 1990's
and brought to fruition later in the decade
have changed our outlook.
Observation networks consisting of remote sensing, stationary,
movable and drifting platforms are being assembled throughout
the country.
Modern communication systems provide a means for the platforms to
report their observations in real time,
and the World Wide Web provides a means for wide-spread instantaneous
distribution.
The use of numerical models to assimilate diverse
datasets and forecast forward in time is now a well-accepted
procedure employed by the scientific community.
The combined use of real-time observations and model forecasts
to improve observational efficiency has spawned
the new field of adaptive sampling.
Emerging partnerships between scientists and engineers
from academia, government and private industry are
tackling the new developmental challenges that now include autonomous
platforms (airborne, surface and subsurface), system
integration, and automated response scenarios.
This paper focuses on a description of the current state of modern
observation networks for ports, estuaries and open coasts.
The systems share some common qualities.
They all have long-term goals, but real-time applications are
paying most of the bills. Most focus on the local scale.
They are operated by university researchers and by government
agencies. They all strive to make their
data accessible over the World Wide Web to the general public as well
as the scientific community.
Initially we mention some applications for these observing systems,
both real-time and long-term, and discuss what enabling technologies
prompted their rapid proliferation.
A list of long-term real-time observation
networks known to the authors is provided in Table 1.
A discussion of the goals, capabilities and accomplishments of
so many established and emerging sites
is beyond the scope of
this paper. Because many sites are local, the goals are often
local. Because the sites are not static, but are constantly
being improved and upgraded, the capabilities and accomplishments
are constantly changing and are often several years ahead of
descriptions and results available in the published literature.
It would be a disservice for the authors to attempt to
accurately portray the current state of an observation
system in which we were not directly involved.
But rather than focus on generalities,
the authors instead choose to highlight the goals,
capabilities and accomplishments of three specific
observation systems
that span the scales from ports, to a large estuary,
and on to the open coast.
Each site is considered representative of the state of the art
for its application region. Each has received support from
the National Ocean Partnership Program (NOPP).
We then conclude our description of modern observation networks
with a discussion of the
new sensors and platforms on the near horizon
that will improve capabilities in
all three regions.
While goals, capabilities and accomplishments of the three
observation systems highlighted here are different, the authors were
surprised (and possibly relieved)
to find that we all reported
a common set of problems.
We conclude with a summary of these
difficulties and limitations,
and our recommendations for the future.
2. Rationale for Coastal Observation Networks
Many important activities necessary
to protect our coastal regions can only be
successfully accomplished through the operation
of permanent real-time observation
stations, providing continuous oceanographic data time series,
both for immediate
use in support of coastal decision makers and for analysis
and modeling to understand
the phenomena that affect the coastal region.
There are a few examples of such systems that have existed for a long time,
such as networks of water level (tide) gauges,
which have been operated by most nations for many
decades for a variety of uses that required long time series
(e.g., for analysis
to provide accurate tide predictions including the
18.6 year nodal cycle; for legally
defined marine boundaries based on tidal datums; for
the study of sea level rise
and/or land subsidence;
etc.), and which eventually became national real-time
observation networks
(to provide more accurate water level information for the
navigation community, for improved storm
surge monitoring, and as part of tsunami
warning systems).
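The harmonic analysis underlying such tide predictions can be sketched as a sum of cosine constituents, each with an amplitude and phase derived from a long observed time series at the station. The constituent values below are purely illustrative, not those of any real gauge:

```python
import math

# Illustrative harmonic constituents (amplitude in meters, phase in degrees,
# speed in degrees per hour). Real stations use dozens of constituents whose
# amplitudes and phases come from analysis of long water-level records.
CONSTITUENTS = {
    # name: (amplitude_m, phase_deg, speed_deg_per_hr)
    "M2": (0.50, 110.0, 28.9841042),   # principal lunar semidiurnal
    "S2": (0.12, 135.0, 30.0000000),   # principal solar semidiurnal
    "K1": (0.08, 200.0, 15.0410686),   # lunisolar diurnal
    "O1": (0.06, 190.0, 13.9430356),   # lunar diurnal
}

def predicted_tide(hours_since_epoch, mean_sea_level=0.0):
    """Astronomical tide height as a sum of harmonic constituents."""
    height = mean_sea_level
    for amp, phase, speed in CONSTITUENTS.values():
        height += amp * math.cos(math.radians(speed * hours_since_epoch - phase))
    return height

# Predict hourly heights for one day
series = [predicted_tide(t) for t in range(24)]
print(f"range over 24 h: {max(series) - min(series):.2f} m")
```

The long-period modulations mentioned in the text (such as the 18.6-year nodal cycle) enter real predictions as slowly varying corrections to these amplitudes and phases.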
However, there are many physical, chemical, biological,
and geological parameters,
for which both long time
series for analysis and real-time access for immediate use
are also needed.
With today's telecommunications and
advancements in remote and in situ sensor technology this is now possible.
The uses for the data from such
permanent continuously operating real-time coastal
ocean observing systems tend to fall into two
categories: (1) data and data products
needed in real-time either for direct
use by a variety of coastal decision makers or
as input into nowcast/forecast
model systems to provide various predictions for those
same decision makers; and (2) long data time series needed
for a variety of analyses
where a long-term framework is
needed (to understand natural variability and to
separate it from anthropogenic effects
and/or to provide the framework for other
shorter-term data sets obtained for a variety of purposes).
2a. Real-time nowcasting and forecasting applications
Ocean prediction systems comprise three basic elements:
observation networks, dynamical models, and data assimilation schemes.
Observation networks acquire numerous diverse datasets in real-time,
but sensors alone cannot sample the full 3-d volume for all
variables at the multitude of oceanic time and space scales that exist.
Data assimilation schemes provide the methods for constraining
a dynamical model with the real-time observations, enabling
the ocean model to produce a hindcast or nowcast
in which the observations are interpolated to finer space and
time scales.
For the assimilation scheme to work, the observed data
must first be transformed into the same state variables used by the model.
(As an example, the dynamical model may be forecasting the sub-inertial
flow fields, but the assimilation datasets may include current observations
contaminated by surface and internal waves, tides, and inertial waves,
which first must be removed.)
Once the assimilation step is complete,
the dynamical model can forecast forward
in time, generating future estimates of the 3-dimensional environmental
fields (temperature, salinity, velocity, sea surface height,
sediment concentrations, etc.).
Ensemble forecasts can provide estimates of the error fields associated
with the predictions.
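The simplest way to convey how an assimilation scheme constrains a model with observations is Newtonian relaxation ("nudging"), in which the model state is pulled a fraction of the way toward each observation before the next forecast step. The toy below is only a sketch of that idea; the "model" is a one-variable relaxation and all numbers are invented, nothing like the operational systems described here:

```python
# Toy nudging (Newtonian relaxation) assimilation. The model state is pulled
# toward each observation by a fraction of the misfit, then stepped forward.
# The "model" is just linear relaxation toward a background value.

def model_step(state, dt=1.0, background=10.0, rate=0.1):
    """One forecast step: relax the state toward a background value."""
    return state + rate * (background - state) * dt

def nudge(state, observation, gain=0.5):
    """Assimilation: move the state a fraction `gain` toward the observation."""
    return state + gain * (observation - state)

state = 0.0
observations = {0: 12.0, 3: 11.0, 6: 10.5}   # obs available at some steps only

for step in range(10):
    if step in observations:                  # assimilate when data exist
        state = nudge(state, observations[step])
    state = model_step(state)                 # forecast forward in time

print(f"final analysis state: {state:.2f}")
```

An ensemble version would run many such integrations from perturbed initial states and use their spread as the error estimate accompanying the forecast.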
In contrast to the deep ocean,
coastal forecasts also rely heavily on forecast meteorological fields
from weather models.
Real-time nowcasting and forecasting systems have the potential
to support numerous
activities in the coastal environment, including
safe and efficient navigation and marine operations,
efficient oil and hazardous material spill trajectory prediction and clean up,
predicting and mitigating coastal hazards,
military operations,
search and rescue missions,
prediction of harmful algal blooms and hypoxic conditions,
and, not to be forgotten,
scientific research.
Safe and efficient navigation and marine operations.
As mentioned above, the first need
for real-time oceanographic data was for water
levels as a more accurate
substitute for astronomical tide predictions for areas
where wind and river discharge
effects were significant. This became very important
for the commercial maritime community as the drafts of oil tankers, cargo and
container ships became greater and greater, and their entering and leaving
depth-limited U.S. ports became restricted to near the
times of high water. For the
same reasons, the need grew for real-time information on currents
in ports (instead of
tidal current predictions) for critical ship maneuvering
situations such as docking,
turning, and determining the right of
way between two ships approaching each other (given
to the ship moving with the current due to its
having less control). Similarly,
real-time density information for ports with varying
river discharge is important for
accurate predictions of a ship's static draft.
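The density effect on static draft follows directly from Archimedes' principle: for a roughly wall-sided hull the displaced mass is fixed, so draft scales inversely with water density. The draft value below is made up for illustration:

```python
# Illustrative effect of water density on a ship's static draft, assuming a
# wall-sided hull near the waterline so that draft scales inversely with
# density. The 12 m draft is an invented example value.

def draft_in_water(draft_ref_m, rho_ref, rho_new):
    """Draft after moving from water of density rho_ref to rho_new (kg/m^3)."""
    # Displaced mass is constant: rho_ref * draft_ref = rho_new * draft_new
    return draft_ref_m * rho_ref / rho_new

salt_draft = 12.00                      # draft in seawater (m)
fresh = draft_in_water(salt_draft, 1025.0, 1000.0)
print(f"draft in fresh water: {fresh:.2f} m "
      f"(increase of {100 * (fresh - salt_draft):.0f} cm)")
```

A 30 cm change of this kind is comparable to the under-keel clearance margins at issue in depth-limited ports, which is why real-time density information matters.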
The maritime community and its customers also need short-term water
level forecasts (e.g., to know how much cargo they can load,
or when to leave port, etc.), instead of
astronomical tide predictions from national tables,
which do not include important
wind, pressure, and river effects. Forecasts
of water levels (as well as of currents
and other parameters)
will be provided by model systems that need real-time data to
drive the models, to be assimilated into the models,
and for model verification, as
well as forecast fields from weather forecast models.
Efficient oil and hazardous material
spill trajectory prediction and clean up.
When there is a maritime accident leading to a
hazardous spill, real-time and
forecast currents are important for
accurate prediction of where the spill will be
transported so that the most efficient strategy for clean up
can be accomplished.
Here nowcast and forecast current fields from model systems become especially
important because they
will also be able to show where convergence zones will lead to
the accumulation of oil.
Monitoring, predicting and mitigating coastal hazards.
As mentioned above, real-time water level gauges
have been used for many years to
monitor the growth of storm surge as part of coastal warning systems. Gauges
modified to recognize rapid changes in water level have
also been part of tsunami
warning systems. Real-time water level data are used to initialize
storm surge
forecast models, which may involve assimilation over
a period of time prior to the
forecast period.
Military operations.
The strategic objectives of the Naval oceanographic community
are to provide the environmental information necessary
for the safety of day-to-day operations and, if required,
to support the warfighter.
Safe Naval operations depend on local value-added
observations to supplement larger scale predictive models.
Warfighter support now depends on the development
of new methodologies for
using real-time remote sensing and in situ data
for rapid environmental assessment
as input to tactical decision aids.
Particular emphasis is placed
on the use of Unmanned Autonomous Vehicles for
littoral nowcasting in denied areas.
The existing observational and predictive infrastructure
available along the U.S. coasts
enables the Navy to test
new sensors, platforms, models, and sampling techniques
in logistically simple situations before deployment
in less favorable situations.
Search and Rescue.
Search and Rescue (SAR) is one of the Coast Guard's oldest
missions, with 95% of their SAR responses occurring
within 20 nautical miles of the coast.
While most Coast Guard responses only involve Rescue,
the annual cost for the 10% that do involve a Search
is greater than $50 million.
Approximately 1/5 of the searches last longer than 24 hours.
Because of the urgency of SAR,
ongoing real-time nowcasts
and forecasts for the coastal ocean would help reduce the
search time, resulting in more lives saved, reduced costs,
and fewer Coast Guard personnel placed at risk.
Prediction of harmful algal blooms, hypoxic conditions,
and other ecosystem or water quality phenomena.
Physical models, and physical
models coupled to water quality or ecosystem models,
are beginning to be used for other purposes, where the need for real-time data
becomes much broader, in some cases involving
other parameters that are still not
easily measured in situ or remotely on a continuing basis (see Section 6b).
Although harmful algal blooms (HABs) are often the result of increased
nutrients, physical parameters such as
water temperature, salinity, currents, and
waves affect stratification, mixing,
and transport, and thus can play a role in
triggering a bloom, transporting it, or
dissipating it. 3D baroclinic modeling systems
can be used to nowcast
and forecast such physical conditions. Adding conservation
equations for particular nutrients, or coupling the
physical model to a more complete
water quality or ecosystem model, will ultimately lead to an
HAB forecasting
capability.
Similarly, such forecast model systems could predict
the onset of the stratification or the concentration of phytoplankton
that helps produce anoxic conditions in bottom waters.
And since currents in a bay flush out pollutants,
stir up and transport sediments (and attached pollutants), and
move larvae and juveniles out of and into estuaries,
there will be a number of other
environmental and ecosystem applications for the nowcast/forecast
current fields from
such model systems.
Scientific Research.
Real-time observation networks can now define a 3-d well-sampled
research space in which the scientist can operate.
If coupled to a numerical model, sampling programs can take
further advantage of
the additional guidance provided by model generated nowcasts and forecasts.
Experiments conducted in a well-sampled ocean are more efficient,
since the timing and location of the processes of interest are known.
This is especially critical for interdisciplinary adaptive sampling,
since many chemical and biological samples are still acquired and
analyzed by hand.
An efficient means of locating and timing a sampling program
increases the scientist's effectiveness, since new observations
can be focused on the process of interest at the time it is occurring.
2b. Needs for Long-term Continuous
Consistently-Calibrated Data Time Series
Permanent, continuously operating,
coastal observation systems contribute to, and
may be part of, the Global Ocean Observing System (GOOS)
for climate studies and prediction (e.g.,
coastal water level gauge networks), living marine resources,
and health of the ocean.
Shorter-term synoptic data studies
(whether done randomly or on a regular basis)
cannot be fully understood or correctly
utilized without long-term "reference"
data time series to compare against.
Short-term increases in a particular water quality
or ecological parameter may be thought
to be solely due to anthropogenic causes,
when in fact a longer data series correlated
with various long data series of
physical oceanographic and meteorological parameters
may point to other natural factors.
Equally important, long continuous time series allow one to average out
higher-frequency variations (e.g., the tide, or seasonal effects)
that can bias the results of randomly sampled parameters.
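The averaging idea can be illustrated with synthetic hourly water levels: a semidiurnal "tide" superposed on a weak linear trend. Averaging over roughly a lunar day (about two M2 cycles) suppresses the oscillation and exposes the trend. All values here are invented for illustration:

```python
import math

# Synthetic hourly water levels: a slow linear trend plus a semidiurnal
# oscillation. Averaging over ~25 h (about two M2 cycles) removes most of
# the tidal signal, leaving the trend.

M2_PERIOD_HR = 12.42
TREND_M_PER_HR = 1e-4                 # slow rise of 0.1 mm per hour

hourly = [TREND_M_PER_HR * t + 0.5 * math.cos(2 * math.pi * t / M2_PERIOD_HR)
          for t in range(24 * 30)]    # 30 days of hourly values

def lunar_day_means(series, window=25):   # ~25 hourly samples per lunar day
    return [sum(series[i:i + window]) / window
            for i in range(0, len(series) - window + 1, window)]

means = lunar_day_means(hourly)
# The 0.5 m tidal oscillation averages nearly to zero, exposing the trend.
print(f"daily-mean range: {max(means) - min(means):.4f} m "
      f"(raw range: {max(hourly) - min(hourly):.2f} m)")
```

A randomly timed sample drawn from the raw series could differ from the trend line by half a meter; the averaged series varies only by centimeters over the month.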
For these situations there is still a need for real-time data, but
primarily for quality control purposes, so that sensor malfunctions can be
repaired as quickly as possible to minimize gaps in the data time series.
Another important aspect of quality control for these long-term applications
is the maintenance of
consistent calibration of the sensors over the entire time series.
The above constraints
may apply to sampling programs like NOAA's Status
and Trends and EPA's
EMAP program.
Long-term continuous
monitoring of important physical, meteorological, biological,
and chemical
parameters could lead to more accurate regional and national
assessments of trends
in water quality, healthy habitats and ecosystems, as well as beach erosion,
bathymetric changes, etc., and their connection
to anthropogenic causes. Natural
changes in flushing due to storms or changing
tidal conditions could make water
quality problems (due to sewage treatment or
non-point source pollution) seem better
or worse depending on when randomly
sampled data are taken, unless there are long-term, nearly continuous
data series for comparison.
3. Enabling Technologies
3a. Rapid expansion of sensors, systems, and platforms
Many new sensors and systems are now available for
deployment from a variety of ocean platforms including ships, moorings,
bottom tripods, drifters, floats, autonomous underwater vehicles (AUVs),
and offshore platforms
(e.g., Dickey et al. (1993a,b;1998a,b)).
The rapid growth of enabling technologies has its origins in partnerships
formed between academia, private industry, and government laboratories.
In particular, international programs such as the Bermuda Testbed Mooring
(BTM) (Figures 1 & 2), MBARI and LEO-15 projects have facilitated
both technological and fundamental research including model development
(Dickey et al., 1998; Chang et al., 1998; Grassle et al., 1998).
The assortment of in situ platforms is
complemented with satellite and aircraft remote sensing systems.
Altogether, these can be used to sample time and space scales which span
about 10 orders of magnitude. The collective sensors and systems can be
used to measure many key environmental variables needed to describe and
model the physics, chemistry, and biology of the world oceans. Fundamental
breakthroughs, particularly in chemical, optical, and acoustical
technologies enable monitoring of critical parameters which can document
both natural and anthropogenically induced changes.
Remote sensing of the physical, and to a more limited degree, biological
variability of the upper ocean via satellites and aircraft has stimulated
new insights concerning processes of the upper ocean. This technique is
increasingly used as a quantitative tool to diagnose and predict the
physical and biological states of the upper layer as well. Unfortunately,
remote sensing of chemical species is far more difficult and at this point
virtually intractable. Further, acquisition of subsurface chemical and
biological data from space is even more difficult. Thus, in situ
observations remain extremely important for biological and chemical
oceanographic problems. Ships have served our community well, however,
their limitations in terms of cost, availability, poor synoptic sampling,
sample degradation and contamination, etc., have forced utilization of
other platforms as well. Several new platforms can now utilize bio-optical,
chemical, and acoustic sensors or systems as well as physical measurement
devices. The ranges of temporal and spatial scales covered by the various
platforms have been well documented (e.g., Dickey 1991). For example,
moorings can provide high temporal resolution, long-term measurements,
drifters and floats may be used to provide spatial data by effectively
following water parcels (Lagrangian), AUVs can provide excellent spatial
data and can be programmed to do special sampling regimens, even in
response mode. It is worth noting that enabling technologies are
accelerating the utilization of AUVs. Fixed offshore structures and
platforms appear to hold great promise for many applications (e.g., CODAR,
ADCPs, acoustic systems). Specialized studies will likely continue to need
manned submersibles and remotely operated vehicles (ROVs), from which many
of the interdisciplinary sensors and systems described here may be
deployed. Clearly, several different in situ and remote platforms are
necessary to adequately describe and quantify the myriad of ocean
processes.
An increasing number of bio-optical, chemical, acoustic and laser
sensors and
systems are being deployed in situ from ships using towed packages,
moorings, bottom tripods, drifters, floats, autonomous underwater vehicles
(AUVs), and offshore platforms. Considerations for optical, chemical, and
acoustic sensors and systems include response time,
size, power requirements, data storage and telemetry, durability,
reliability, stability/drift, and susceptibility to biofouling.
The types of in situ physical data that can be collected from these
platforms include temperature, salinity, bottom pressure, currents and
suspended particle size distributions.
Optical measurement capabilities have been expanding very rapidly and now
include spectral (multi-wavelength) radiance, irradiance, attenuation,
absorption, and fluorescence. These latter measurements are important for
determination of phytoplankton absorption characteristics, biomass and
productivity (potentially species identification), water clarity and
visibility, and sediment resuspension and transport. In addition, optical
devices are being used to estimate zooplankton biomass and size
distributions, in some cases with species identification capacity.
Likewise, major efforts are underway to measure chemical concentrations
with applications such as water pollution (e.g., DDT, PCBs, etc.) and
eutrophication (dissolved oxygen and nutrients), nutrients (major and trace)
for primary productivity, global climate change (carbon dioxide and
oxygen), and hydrothermal vents (oxygen, pH, Fe2+, Mn2+, and H2S). Some
other new chemical technologies are described briefly in the
Future Sensors and Platforms section.
Multi-frequency acoustics are being used to
estimate biomass and size distributions of zooplankton. Telemetry of
interdisciplinary data from the various platforms is becoming more common
(see next section).
3b. Real-time Communications
The shorter transmission lengths afforded by the scales of
the coastal ocean offer a considerably wider range of data transfer
modes than are available to observing systems in the open ocean.
In some cases, the distances are sufficiently short and the
bandwidth requirements sufficiently high that direct connection
via fiber-optic or coaxial cable and delivery of operating power
by cable are warranted. An example of such a system is LEO-15,
where power and data are linked directly via buried cables from
the offshore observatory to the shore. Such a system can deliver
elevated power levels and large transmission bandwidths.
Another advantage is the lack of need for a large buoy with power
for onboard radio at the offshore sensor location. The obvious
tradeoff for power and bandwidth is the cost of cabling.
UHF and VHF line-of-sight transmission can be considerably
less expensive. However, line-of-sight distances may require the
construction of a shore tower or rental of space on a commercial
tower. In addition to the tower, a shore station with telephone
access or a radio relay is necessary for transmission to the
central database. Large bandwidths are possible with line-of-sight
radio, but power requirements at the sensor location may
limit these rates. With UHF and VHF radio, an FCC license is
required. Bureaucratic slowness can delay implementation and
limit flexibility. The new spread-spectrum radios have the
advantages of high bandwidth with no license requirement, but at
frequencies that are easily attenuated by vegetation. In this
case, line-of-sight is more literal than in the UHF and VHF case.
Cellular phone telemetry eliminates both licensing and the
need to establish shore receiving facilities. Operating costs
can be more expensive than line-of-sight radio, even with the new
CDPD data-transmission technology. Although a careful survey of
users has not been conducted, there have been questions as to the
reliability of cellular phone telemetry. Coverage is also an
issue, although the new (as yet unproven) satellite-based Iridium system
may make coverage less of a concern. Traditional ARGOS and GOES satellite
transmissions have proved reliable, although in the case of
ARGOS, expensive. An additional advantage of ARGOS is the
positioning that it provides, whether for fixed or drifting
buoys. For fixed buoys, ARGOS Service provides an alarm for
buoys separated from their moorings. A possible disadvantage for
GOES satellite service is data delay. For some forecasting
systems, a possible 3-hr data delay may be too long.
Local communication among sensors, processors, and data-
transmission devices at the site of measurement may incur
difficulties, especially for cabling through the water column or
along the bottom. Acoustic telemetry has been used effectively
for years, although until recently, bandwidths could be severely
limiting. Acoustic telemetry is expensive and must be
interfaced to underwater sensors. Furthermore, if multiple
sensors are deployed at one depth, a customized interface must be
developed if multiple telemetry units are to be avoided. In the
nearshore or estuary, stratification and topography can create
sufficient acoustic multipaths that higher-end telemetry is
necessary for successful transmission.
3c. Universal Acceptance of the World Wide Web
Everyday we hear of new ways in which
the Internet and the World Wide Web have transformed
some aspect of our life.
We use the World Wide Web at work, in our schools,
and at home.
As scientists, we use the Internet and the World Wide
Web to communicate our thoughts and results
with our colleagues around the world as if they were in
the lab next door.
The ease of modern communications
has fostered the now common
formation of nationally distributed science
teams for research projects.
Ocean observation networks are included on the long
list of transformations.
Early ocean observatories were controlled
in a central location, and
real-time information rarely traveled beyond
the control facility. Real-time data dissemination
required specialized equipment or software not commonly available.
The World Wide Web provides a standardized, instantaneous,
and widespread distribution system.
It requires no specialized equipment, only a simple PC and a telephone,
making it possible to reach not only other scientists,
but also the broader educational community and the general public.
The availability of a distribution system has prompted
the development of automated processing and
visualization algorithms to construct real-time products
for display on the Web.
To this, we add the ability to control sensor systems
remotely over the Internet, a capability
now being built into modern observation systems.
Widespread real-time product distribution and remote
control capabilities promote the formation
of distributed observation networks,
where different
groups or agencies are responsible for the individual systems
that make up a network.
In the near future, fully integrated distributed
observation systems communicating over the Internet could use
events detected in one set of sensors to trigger responses
in another set of sensors located in a different part of the network.
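The event-trigger pattern envisioned here can be sketched as a simple publish/subscribe relationship between nodes. In the sketch below the "nodes" are in-process objects and the names and threshold are hypothetical; a real network would carry the notifications over the Internet:

```python
# Sketch of event-triggered distributed sensing: when one node's observation
# crosses a threshold, registered callbacks representing other (hypothetical)
# network nodes are invoked. Purely illustrative of the pattern, not of any
# deployed system.

class SensorNode:
    def __init__(self, name, threshold):
        self.name = name
        self.threshold = threshold
        self.subscribers = []          # callables standing in for remote nodes

    def subscribe(self, callback):
        self.subscribers.append(callback)

    def report(self, value):
        if value > self.threshold:     # event detected: notify the network
            for notify in self.subscribers:
                notify(self.name, value)

log = []
offshore = SensorNode("offshore-mooring", threshold=2.0)   # e.g., wave height (m)
offshore.subscribe(lambda src, v: log.append(f"{src}: start rapid sampling ({v} m)"))

offshore.report(1.5)   # below threshold: nothing happens
offshore.report(2.7)   # above threshold: triggers a response elsewhere
print(log)
```

The same structure accommodates any trigger (an optical bloom signature, a surge in water level) and any response (retasking an AUV, increasing a sampling rate).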
4. Three Coastal Observation Networks
4a. An Observation Network for Harbors - PORTS
Setting
The Physical Oceanographic Real-Time System
(PORTS) is a centralized data acquisition
and dissemination system that provides real-time
observations (updated every 6
minutes) of water levels, currents, water temperature and salinity,
wind speed and
direction, and atmospheric pressure from numerous
locations around a bay or harbor.
Nowcasts and 24-hour forecasts of these parameters
from numerical oceanographic
models driven by real-time data and forecast
meteorological fields from weather
models are also being implemented.
PORTS systems were designed and installed by NOAA's
National Ocean Service (NOS) and
are operated in partnership with the local
marine community for each bay or harbor.
Full PORTS systems are presently operating in Tampa Bay (5 locations, 15
instruments), the Port of New York and New Jersey
(4 locations, 14 instruments), San
Francisco Bay (9 locations, 38 instruments), Galveston Bay (5 locations, 26
instruments), and Chesapeake Bay (5 locations,
22 instruments), with plans to install
similar systems in several other ports around the U.S.
Systems with only a single
water level gauge and several
meteorological instruments ("PORTS Lite") are installed
at Anchorage and Nikiski, Alaska, and Seattle and Tacoma, Washington.
In addition,
individual stations of the National Water Level
Observation Network (NWLON) are also
accessible in real-time (but are updated every 3 hours).
Quasi-operational nowcast/forecast model systems
developed by the Coast Survey Development
Laboratory in NOS are
running daily (with output on restricted Websites) for three PORTS
locations: Chesapeake Bay; the Port of New York and New Jersey; and
Galveston Bay. A USGS-developed model is being run in a similar
capacity for San Francisco Bay.
The University of
South Florida is developing a nowcast/forecast model system for Tampa
Bay.
PORTS was originally implemented to serve the commercial
navigation community, whose
large deep-draft ships required more accurate water
level information (than could be
provided by astronomical Tide Tables) in order to safely enter and leave
depth-limited U.S. ports. However, these data have many
non-navigational uses and
are finding a growing user community.
Goals
The primary goals of the PORTS program are
to: promote navigation safety, improve
the efficiency of U.S. ports and harbors, and ensure the
protection of coastal marine
resources (Figures 3 and 4).
PORTS (used in combination with nautical charts and
GPS) provides ship
masters and pilots with accurate real-time information required to avoid
groundings
and collisions. PORTS installations in U.S. harbors have the potential to
save the
maritime insurance industry from
multi-million dollar claims resulting from shipping
accidents. Access to accurate real-time water level information
and 24-hour
forecasts allows U.S. port
authorities and maritime shippers to make sound decisions
regarding loading of
tonnage (based on available bottom clearance),
maximizing loads and limiting passage
times without compromising safety.
PORTS is important to environmental protection,
since marine accidents can lead to hazardous
material spills that can destroy a bay's
ecosystem and the tourism, fishing, and other industries that depend on it.
Real-time and forecast
circulation information from
PORTS is also used to better predict the movement of
hazardous spills from marine accidents, thus making cleanup more efficient.
Capabilities
PORTS uses some of the latest
developments in telecommunication and oceanographic
sensor technology.
Data and information from a PORTS can
be accessed by several methods:
(1) an Internet Web site [www.co-ops.nos.noaa.gov],
which provides graphical
displays of not only the latest values but also of
data from the previous three days;
(2) touch-tone phone
(including cellular phone) dial-up to a voice data response
system that translates the most recent data
into words; and (3) phone dial-up
with computer and modem to obtain a text screen of
the latest values. PORTS data
will also be pulled into vessel traffic services
(VTS) systems, "smart bridge"
systems on commercial ships, and Electronic
Chart and Display Information Systems
(ECDIS).
Data transmission
from the remote data collection sites to the PORTS Data Acquisition
System (DAS)
(Figure 5)
is accomplished in several ways:
(1) high-speed dedicated data
lines (T-1), including those used by the
U.S. Coast Guard (asynchronous data
interfaces connect PORTS equipment to the T-1 line
multiplexors); (2)
line-of-sight radio modems, as well as wire-line communication, which allow
point-to-point applications with simple
3-wire RS-232 interface connections; and (3)
standard telephone lines installed at each water level station to allow
administrative communication and backup transmission
to the DAS. The DAS receives
remote data, determines data type, initiates
the appropriate program for that data
type, performs quality control tasks, archives the
data, and formats the data for
output. Data transmission from each PORTS DAS to NOS
headquarters in Silver Spring,
Maryland (where the Web pages are maintained) is over an
intranet using the MCI
Network. A Continuously Operational Real-time Monitoring
System (CORMS)
(Figure 6)
has been
developed to provide a
national centralized quality control system, which determines
data quality, evaluates system
performance, identifies invalid or suspect data to
users, and provides information needed by
maintenance crews to repair PORTS systems.
Water levels at each location are
measured using a downward-looking air acoustic
sensor, which is referenced to ten tidal bench
marks, which in turn are referenced
via GPS to the National Spatial Reference System.
An acoustic pulse is sent through
a sounding tube from the transducer down to the water
surface and back. The two-way
travel time is measured for the reflected signal, from both the water
surface and a
calibration point in the tube. The calibration
signal provides the sensor with a
means of correcting each water level
measurement for variations in the speed of sound in the air
column due to changes in temperature or humidity.
Six-minute values are obtained
from one-second sampled data. Backup
water level measurements are made at each
location using an electronic pressure transducer integrated into
a dry-purge
"bubbler" system, where nitrogen gas is
regulated to slowly purge through an open
orifice mounted below the
water surface; the transducer senses pressure change in the
system as the water level changes. In addition to these data being
disseminated via
the PORTS system, they are also
transmitted via GOES satellite every 3 hours to NOS
headquarters for archiving and further quality control.
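The calibration correction described above amounts to solving for the effective sound speed in the tube and applying it to the surface return. A minimal sketch of this arithmetic follows (illustrative only; the function names, calibration distance, and simple averaging are assumptions, not the NOS implementation):

```python
def sound_speed(t_cal, d_cal):
    """Effective sound speed in the tube, from the two-way travel
    time t_cal (s) to a calibration point a known distance d_cal (m)
    below the transducer."""
    return 2.0 * d_cal / t_cal

def water_level(t_surface, t_cal, d_cal, sensor_height):
    """Water level relative to the reference datum: sensor height
    above datum minus the acoustic range to the water surface."""
    range_to_water = sound_speed(t_cal, d_cal) * t_surface / 2.0
    return sensor_height - range_to_water

def six_minute_value(one_second_samples):
    """Six-minute value from one-second samples (a simple mean here;
    the operational processing also applies quality checks)."""
    return sum(one_second_samples) / len(one_second_samples)
```

Because the calibration and surface returns share the same air column, changes in temperature or humidity largely cancel out of the computed range.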
Vertical profiles of currents are measured using acoustic
Doppler current profiler
(ADCP) systems. Each ADCP is generally placed
on the bottom in an upward-looking
configuration using a low-profile platform.
Water temperature, conductivity, wind speed
and direction, wind gusts, atmospheric
pressure, and air temperature are regularly measured with a variety of
off-the-shelf sensors, with other sensors such as
visibility and rainfall added when
required.
The nowcast/forecast model system being incorporated into each PORTS
relies on real-time and forecast information from other sources besides
the real-time information from each PORTS installation (described
briefly above), including meteorological data and forecasts from NWS and
other sources, and hydrological data from USGS
(Figure 7).
(A map-based prototype
Website has been implemented for the Chesapeake Bay area to provide
other users with a central source for all these real-time data.)
Communication mechanisms for obtaining these data for the models
presently rely heavily on the Internet, but will eventually include
NWS's NOAAPORT and other mechanisms. The Chesapeake Bay nowcast/forecast
model system is presently the most elaborate of the four model systems.
Forecasts rely on forecast entrance boundary conditions provided by a
coastal forecast model for the U.S. East Coast, which is driven by
forecast fields of winds, pressure, and other meteorological parameters
from an NWS weather model, as well as wind fields over the Bay from a
high-resolution mesoscale weather model, whose boundary conditions come
from the same large scale weather model and which is initialized by
meteorological fields from LAPS (Local Analysis and Prediction System).
Nowcasts, updated hourly, are driven with real-time oceanographic data
and meteorological fields from LAPS, with data assimilation techniques
being developed to improve the model prediction skill. These nowcasts
provide the initial conditions for 24-hour forecast runs.
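The structure of this hourly-nowcast/24-hour-forecast cycle can be sketched with toy stand-ins for the model and the assimilation step. Everything here, the nudging gain, the relaxation dynamics, the function names, is an invented illustration of the cycle's shape, not the actual Chesapeake Bay system:

```python
def assimilate(state, obs, gain=0.3):
    # Toy nudging toward the observation, standing in for the data
    # assimilation techniques under development.
    return state + gain * (obs - state)

def step_model(state, forcing):
    # Toy one-hour dynamics: relax the state toward the forcing.
    return 0.9 * state + 0.1 * forcing

def nowcast_forecast_cycle(state, hourly_obs, hourly_forcing, forecast_forcing):
    # Hourly nowcasts: assimilate real-time data, advance one hour.
    for obs, forcing in zip(hourly_obs, hourly_forcing):
        state = step_model(assimilate(state, obs), forcing)
    # The latest nowcast initializes a 24-hour forecast run driven
    # by forecast meteorological forcing.
    forecast = [state]
    for forcing in forecast_forcing[:24]:
        forecast.append(step_model(forecast[-1], forcing))
    return forecast
```

The key point the sketch makes is that the forecast never starts from scratch: it always inherits the most recent data-constrained nowcast state.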
Accomplishments
Real-time information from 5 full PORTS systems,
4 PORTS Lite systems, and dozens of
NWLON stations is presently used
every day to ensure safe navigation of U.S.
waterways, especially by oil tankers,
cargo ships, and container ships. On several
occasions real-time currents from PORTS have been used to help predict the
trajectories of oil spills.
The first stages of the CORMS quality control system are
operational. An Ocean Systems Test & Evaluation Facility (OSTEF) is being
established for evaluating new
instrument technology and for developing and applying
oceanographic measurement quality assurance (QA) processes.
Nowcast/forecast model
systems have been developed for four PORTS
locations and are running daily in a
quasi-operational mode. These model systems are driven by real-time data and
forecasts from weather models and a coastal ocean forecast system.
A NOPP-funded
project is underway to improve the prediction
skill of the Chesapeake Bay and East
Coast model systems and to integrate a variety of real-time data and forecast
products for evaluation by a dozen user groups.
This project includes seven NOAA partners (in NOS, NWS, OAR), the
University of Maryland, Princeton University, the University of Rhode
Island, TASC, Inc., WSI, Inc., and the Navy.
4b. An Observation Network for Estuaries and
Coastal Embayments - CBOS
Setting
Estuaries, with their strong inputs from both land and sea
into confined basins, resemble continuous reactors, where fresh
water and ocean waters mix, and where nutrients derived from the
land are efficiently converted to harvestable resources. These
productive reactors are vulnerable to the accelerating additional
uses man has made of these water bodies, including maritime commerce,
recreation, and the disposal of wastes. With the increase in the
population living near the coast has come the urban estuary: the
Hudson-Raritan, Baltimore Harbor, Tampa Bay, Mobile Bay,
Galveston Bay, San Francisco Bay, and Puget Sound. Of primary
concern for many estuaries is the introduction of excess
nutrients from point sources (municipal sewers) and from diffuse
sources such as runoff from agricultural lands, with the
resulting overenrichment and environmental degradation.
Unfortunately, in the face of accelerating stresses, the ability
to assess trends in the health of estuaries has been woefully
inadequate, and almost always, retrospective. Furthermore, as
scientists reveal more complexity in the myriad interacting
components of these systems, the task of detecting trends, much
less predicting future conditions, has seemed increasingly
daunting. We are learning that man's impacts do not always appear
as sudden, obvious jumps in easily detected signals, but are
typically subtle, creeping changes in sometimes unexpected
indicators, slowly manifest over many decades. An example is the
summertime depletion of oxygen in the lower layers of Chesapeake
Bay or in the offing of the Mississippi River, which is likely
the result of increasing nutrient inputs from land. Detection of
these trends has been made difficult, not only by the sparseness
of historical records, but also by the masking of these low,
slowly varying signals by the noise of large shorter-term natural
variations in the ecosystem. Although the effect of regular,
short-term fluctuations can be isolated from the continuum, more
worrisome are sporadic events such as phytoplankton blooms,
river-flow surges, floods, and major storms that can profoundly
shift the estuarine ecosystem for decades.
As the recognition of environmental degradation of estuaries
has emerged, shipboard-based monitoring programs have been
instituted to guide and assess proposed actions to restore and
protect these valuable resources. But these efforts labor under
the demand that proposed actions promise success with sufficient
certainty to warrant society's expenditure of sometimes painfully
large resources toward these ends. Unfortunately, shipboard
surveys seldom resolve either the higher-frequency fluctuations
or the rapid shifts in the ecosystem. Furthermore, they seldom
measure the circulation of water, despite the strong influence
that it exerts on the biology of the estuary.
Goals
The first estuarine real-time monitoring system was designed
to address these deficiencies in the largest U.S. estuary, the
Chesapeake Bay, which extends 200 miles seaward from the
Susquehanna River mouth at the northern end, to the Virginia
Capes which form the Bay's entrance. The Chesapeake Bay
Observing System (CBOS) was inaugurated in 1989 by the University
of Maryland Center for Environmental Science. From the outset,
it was intended as a cooperative program among academic,
governmental, military, industrial, and environmental partners.
The initial goals were aimed at both ends of the variability
spectrum, but were also related: (a) to examine long-term ecosystem change,
and (b) to aid short-term process research.
Long-term records provide information on not only the slowly
varying component of coastal ocean processes, but also provide
many realizations of episodic and high-frequency fluctuations.
These realizations help researchers develop a more accurate
description of these processes, and ultimately improve the
detectability and understanding of long-term ecosystem change. A
major technique enabling this improvement is the increase in
signal-to-noise ratio achieved by removing the higher-frequency process
signals from the lower-level, slowly varying signal.
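The filtering idea can be illustrated with a simple running-mean low-pass filter applied to a synthetic record of a slow trend buried under a tidal-band oscillation (an illustrative sketch only; the actual CBOS analyses use more sophisticated methods):

```python
import math

def running_mean(series, window):
    """Low-pass filter: average over `window` points, dropping the
    half-window at each end where the average is undefined."""
    half = window // 2
    return [sum(series[i - half:i + half + 1]) / window
            for i in range(half, len(series) - half)]

# Synthetic hourly record for one year: a slow linear trend plus a
# large semidiurnal (12.42-hour) oscillation.
n = 24 * 365
trend = [1e-4 * i for i in range(n)]
record = [trend[i] + 0.5 * math.sin(2 * math.pi * i / 12.42)
          for i in range(n)]

# A ~25-hour running mean removes most of the tidal signal, leaving
# the slowly varying component that trend detection needs.
smoothed = running_mean(record, 25)
```

Even though the oscillation is five thousand times larger per cycle than the hourly trend increment, the smoothed series tracks the underlying trend closely.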
Even with these scientific and environmental goals, there
was a recognition at the outset of CBOS that an expensive,
long-term observing system could benefit from a wider purpose.
The off-the-shelf feasibility of real-time communication opened
the door for a variety of uses that are more dependent on rapid
return of information than scientific analysis. One of these
uses is maritime commerce. The dominant subtidal variability in
water level and currents in Chesapeake Bay is caused by a
quarter-wave seiche, with period of approximately 2 days, and
with an amplitude of up to 1 m. The astronomical tide in
Baltimore Harbor is of the order 0.5 m, so that these variations
can create significant uncertainty in navigation of deep-draft
vessels entering the port, where below-keel clearances are often
minimal in the dredged channels. In the early days of CBOS the
expectation was that, with an accurate numerical model of the
circulation of the Bay, real-time information on winds, tides,
and currents in the Bay could be assimilated along with wind
predictions to provide forecasts of water level in the Port of
Baltimore. Although tides and currents are important for this
determination, the primary need from CBOS is real-time
information on winds over the Bay, which often differ
significantly in both magnitude and direction from winds over
land.
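The roughly two-day period quoted above is consistent with the standard quarter-wave resonance estimate, T = 4L/sqrt(gH), for a basin open at one end. The effective length and mean depth below are rough assumed values for the Bay, not surveyed figures:

```python
import math

g = 9.81        # gravitational acceleration, m/s^2
L = 300e3       # assumed effective basin length, m (~300 km)
H = 6.5         # assumed mean depth, m

# Quarter-wave seiche period for a basin open at one end.
T = 4.0 * L / math.sqrt(g * H)      # seconds
T_days = T / 86400.0                # roughly 1.7 days at these values
```

This lands within the same range as the observed ~2-day period; the exact value depends on the effective length and depth chosen.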
Another use for a model fed with real-time data from CBOS is
the forecasting of oil-spill trajectories. Chesapeake Bay's
enclosed geometry not only enhances biological productivity, but
it also renders its productivity especially vulnerable to
hazardous material spills. With the improved technology for oil-spill
containment and remediation, real-time information from
CBOS could be crucial in directing resources during a spill.
Even prior to the delivery of improved marine forecasts, real-time
information on over-water winds and sea conditions is also
of significant value for boating and fishing, which are of
substantial economic value to the region. Commercial charter
boats routinely check CBOS information prior to leaving port for
the day's fishing activity.
One aspect of real-time systems that is harder to document
but has real consequences is the excitement of fresh information
in both science and education. Scientific ideas are often
generated when new data are at hand, and before speculation is
fettered by more sober analysis. The excitement of real-time
information is a substantial asset in education, whether in the
K-12 or university classroom, or in the education of the public
at large, who need to be sufficiently engaged to support the
large costs of restoring the Bay. The education process is
helped by having teaching materials available online for
manipulation, visualization, and interpretation of the
information.
Capabilities
As originally envisioned, the Chesapeake Bay Observing
System was designed as a series of 6-8 moored platforms arrayed
down the axis of the Bay. As is typical of estuaries, the
strong inputs of fresh water and salt combined with the Bay's
topography create regional structures in circulation, property
distributions, and biological resources that require a minimum
number of platforms to properly represent conditions along its
200 mile axis. The intent was to maintain these platforms as
Permanent Monitoring Stations, providing continuous information
throughout the year and far into the future. To complement this
permanent array, a series of rapidly deployable Rover Buoys was
planned to provide higher-resolution information in regions of
topical interest for shorter time scales. Rover Buoys could be
deployed in response to events such as fish spawning, oil spills,
harmful algal blooms, or process research to augment the larger
Monitoring Station array. The first Permanent Monitoring
Stations were launched in 1989 in the northern and middle reaches
of Chesapeake Bay (Figure 8). The first Rover Buoys were
launched in the Patuxent River, a western shore tributary, in
1993. An additional Permanent Monitoring Station was added in
1998.
The large nutrient inputs to the Bay create both high
productivity and a severe depletion of oxygen in the Bay's lower
depths during summer. Biofouling and anoxia degrade all
underwater sensors, but especially chemical and optical sensors,
requiring short service intervals during the summer. However,
even during the winter, underwater sensors must be turned around
within 6 weeks. Such a service schedule would be prohibitively
expensive if the large Monitoring Station buoys were replaced at
this frequency because they require a larger, more expensive
vessel that has sufficiently heavy deck gear to handle the buoy
and mooring tackle. Instead, buoys are deployed for a year and
underwater sensors are mounted on a separate taut-wire mooring
with subsurface flotation, and the data are relayed to the
surface buoy by acoustic telemetry. These adjacent moorings can
then be serviced with smaller, less-expensive vessels.
The size and design of the Monitoring Station buoys have
evolved, with the early buoys having discus, or surface-following,
hulls designed for wave measurements. More recent buoys (Figure 9)
have a more hemispherical hull shape, with longer
instrument wells and a 1-ton counterweight. These hulls are more of
a heave-buoy design, with the intent of stiffening the rolling
moment to improve meteorological and optical measurements.
Hulls are Surlyn ionomer foam, which has proved durable and
protective of the onboard processors. The two original hulls
have been used for 10 years, and are still deployed annually.
At the outset of CBOS, line-of-sight UHF and VHF radio links
to shore stations were chosen for telemetry. Satellite
links were significantly more expensive, and sometimes
encountered substantial delays in data transmission. When
cellular phone coverage became broad enough to cover the Bay, and
the new and less costly data-packet cellular technology was
introduced, this method of communication was considered.
Uncertainty in the reliability of cellular-phone telemetry in the
Bay region has led to the postponement of this conversion. In
the meantime, the need for higher bandwidth for additional
sensors led to the incorporation of Spread-Spectrum radios for
two new CBOS buoys.
Once the data are received at shore stations, they are
transmitted via the Internet to a central server at UMCES Horn
Point Laboratory in Cambridge, MD for processing and
visualization, and then delivered to the public by the Web (Figure 10).
A real-time database engine called AutoMate was built to handle
the entire procedure, from acquisition through visualization and
downloadable archiving on the Web.
In 1999, an additional Permanent Monitoring Station and
three additional Rover Buoys will be added to CBOS. As these
buoys have come online, an effort has been made to expand the
sensor suite and obtain full vertical profiles of currents,
temperature, and salinity. Oxygen, chlorophyll, nutrient, and
turbidity sensors have been deployed for research and testing
over seasonal time scales. Various biofouling reduction
techniques have been explored with the aim of extending
deployments to sufficient length to eventually move these
sensors to operational status. Optical sensors for incoming
irradiance and water-leaving radiance have been outfitted on a
stationary tower and CBOS buoy to develop techniques for
obtaining continuous measurements of ocean color and chlorophyll
in support of aircraft and satellite overflights.
Accomplishments
In the early days of CBOS, system development required
sufficient focus that occasionally the primary goals of both
long-term ecosystem change and short-term process research were
neglected in the fray. Over the last few years, as more science
programs have participated in CBOS and come to depend on the
system to provide a temporal and physical context for shorter-
term research, the initial design has begun to come to fruition.
As scientific papers and graduate theses using CBOS data have
been produced, a wider audience has considered the system as a
resource for research on the Bay. The National Science
Foundation Land Margin Ecosystem Research Program on Chesapeake
Bay has relied heavily on CBOS data. In addition, new scientific
programs have taken advantage of CBOS platforms and data
telemetry to install new sensors for biology and chemistry.
Furthermore, additional sensors have been added to provide input
to operational models of sound propagation for military
installations, for which artillery concussions and sonic booms
are the primary environmental problems. The NOAA Air Resources
Laboratory has installed sensors for monitoring the atmospheric
deposition of nutrients, which are the primary pollutant entering
Chesapeake Bay. The NOAA Center for Coastal Ecosystem Health is
considering establishing a Sensor Testbed Facility on the
Chesapeake Bay, and CBOS would provide platforms and telemetry
infrastructure to aid in this effort.
A National Ocean Partnership Program award in 1998 involves
a partnership among the University of Maryland, Princeton
University, NOAA National Ocean Service, the U.S. Navy, and TASC,
Inc. to produce forecasts of winds and water levels over the
Chesapeake Bay region. Real-time CBOS data will be assimilated
in a numerical model to improve these forecasts. As wave sensors
are added to the CBOS suite, a real-time wave forecasting model
will be put in place. Eventually, an oil spill model
assimilating CBOS data will be constructed to guide containment
and cleanup efforts.
To realize the promise of CBOS for education, K-12 teachers
have been incorporated into the program as summer fellows,
developing teaching materials and activity modules. With these
aids, science teachers will be able to have their students access
CBOS online, and then download and analyze the data for a variety
of scientific lessons.
4c. An Observation Network for the Open Coast - LEO-15
Setting
The Rutgers University Long-term Ecosystem Observatory (LEO-15)
is an instrumented natural littoral laboratory located offshore
Tuckerton, New Jersey.
According to Brink (1997) at the NSF sponsored APROPOS Workshop,
"shelf waters deeper than about 3 m and shallower than about 30 m have often
been ignored in the past because of the very difficult operating conditions
and the complex dynamics, where the water is effectively filled with
turbulent boundary layers".
LEO is designed to span the 3 m to 30 m water depths
with an approximately 30 km x 30 km well-sampled
research space (Figure 11).
The LEO observation
network includes multiple remote sensing, shipboard, autonomous
and moored sensor systems that surround a pair of instrument
platforms or nodes secured to the ocean floor.
Goals
Specific goals for the LEO-15 nodes are (Grassle et al., 1998):
1) continuous observations at frequencies from seconds to decades,
2) spatial scales of measurement from millimeters to kilometers,
3) practically unlimited power and broad bandwidth, two-way transmission
of data and commands,
4) an ability to operate during storms,
5) an ability to plug in any type of new sensor, including cameras,
acoustic imaging systems, and chemical sensors and to operate them
over the Internet,
6) bottom-mounted winches cycling instruments up and down in the water,
either automatically or on command,
7) docking stations for a new generation of autonomous
(robotic) underwater vehicles (AUVs)
to download data and repower batteries,
8) an ability to assimilate node data into models and make three-dimensional
forecasts for the oceanic environment,
9) means for making the data available in real-time to schools
and the public over the Internet, and
10) low cost relative to the cost of building and maintaining manned
above- and below-water systems.
General goals for the LEO observation network include:
1) the construction of a distributed observation network
using modern remote sensing, in situ
and meteorological instrumentation,
2) an ability to process, visualize and combine diverse datasets in real-time
to generate data-based nowcasts of the 3-dimensional ocean structure
at selected times,
3) the development of a new coastal ocean circulation model with
new turbulence closure schemes and improved boundary conditions
obtained through coupling to atmospheric models, large scale ocean models,
and surface wave models,
4) the ability to
assimilate multi-variate datasets into the ocean model
in real-time to generate
forecasts of the 3-dimensional ocean structure
at selected times,
5) the development of new adaptive sampling techniques
that use the nowcasts and forecasts to guide
ship-towed and autonomous underwater vehicle
sampling for interdisciplinary applications,
6) the development of an open access database management system for
wide-spread distribution of LEO data, and
7) to provide scientists a user-friendly data-rich environment in which to
conduct focused research experiments.
Capabilities
The two LEO nodes were installed on the ocean floor in 1996
about 10 km offshore in about 15 m of water.
A buried electro-fiber optic cable links the nodes
to the Rutgers University
Marine Field Station (RUMFS), which provides access to the Internet.
The cable provides continuous power for instrumentation,
and bi-directional communication and video links over three optical fibers.
To allow for periodic servicing,
the complete electronics/mechanical package from each node is recoverable
by boat.
Except during the busy summer season when demand is high,
one node is often out of the water being serviced or upgraded
while the other node maintains the long-term dataset.
Each node is equipped with an internal winch that moves a
profiler vertically through the water column.
The winch can be controlled by an onshore computer to
automatically profile at specified intervals,
or it can be manually controlled, either directly from the RUMFS shore base,
or remotely over the Internet. The profiling
package is typically equipped with pressure, temperature, conductivity,
optical back scatter, light, chlorophyll, and oxygen sensors.
The nodes are further equipped with several bottom-mounted systems,
including a pressure sensor (for waves, tides and storm surge),
an ADCP (for current profiles), a hydrophone, a fixed video camera,
and a pan-and-tilt video camera.
In addition, 8 guest ports provide power and
Internet communications
to additional sensors deployed by other investigators.
Guest sensors have typically included tripods equipped with current meters,
sediment size distribution sensors and fluorometers
for resuspension and transport studies.
An autonomous underwater vehicle (AUV) docking port was installed on one
of the LEO nodes in July 1998.
On numerous occasions during this initial test phase,
a Remote Environmental Measuring UnitS (REMUS) AUV (von Alt et al., 1997)
successfully docked, recharged its batteries, and was redeployed with a
new mission profile downloaded from the shorebase over the fiber-optic cable.
System upgrades for the summer of 1999
include the installation of a third optical node
on the sea bottom attached to the same fiber-optic cable.
The optical node also will contain a winch operated profiler,
but the profiling package will provide data on inherent optical
properties, particle size distributions, and fluorescence.
The network of observation systems surrounding the LEO nodes
includes satellite, aircraft and shore-based remote sensing
systems to provide broad spatial coverage of surface properties,
meteorological systems to provide forcing information,
autonomous nodes to spatially extend the permanent LEO nodes
during selected periods, and
multiple shipboard and AUV systems for subsurface adaptive sampling.
Satellite datasets include real-time sea-surface temperature and
ocean color derived from locally-acquired
direct broadcast transmissions from the AVHRR and SeaWiFS sensors,
delay mode surface roughness data acquired from RADARSAT
through NOAA, and delay mode hyperspectral
data from the NEMO satellite scheduled for launch in 2000.
Surface current data are updated hourly
by a pair of CODAR HF-Radar stations located on the barrier
islands to the north and south of LEO.
Local meteorological data currently are collected on a 64-meter tower
located at the RUMFS. Upgrades for 1999 include deployment of
a weather/optics buoy offshore, and installation of an
atmospheric profiler onshore.
A single cross-shelf line of 6 autonomous nodes was deployed
during the summer of 1998 to act as a navigation network for
the REMUS AUVs. RF-modem communications via a repeater located
at the top of the meteorological tower allowed real-time tracking
of the AUV survey missions.
Twelve autonomous nodes will be redeployed along 2 cross-shelf
lines in 1999, with each node further equipped with
8 thermistors. Adding real-time communication capabilities with ADCPs
is a planned upgrade for 2000.
Two coastal research vessels are equipped for physical and bio-optical
subsurface adaptive sampling.
The physical survey vessel tows a Small Water Area Twin Hull
(SWATH) vehicle with an ADCP off the starboard side,
and an undulating vehicle with a CTD/OBS/Fluorometer system
off the stern. A shipboard local area network with a RF Ethernet
bridge to shore is used to display and transmit
the high resolution physical data to shore as it is collected.
The bio-optical survey vessel is equipped with multiple profilers
for apparent optical property and
inherent optical property systems. An RF Ethernet bridge is
used on this vessel to access and display the numerous real-time datasets
to guide scientists deciding where and when to stop the boat
for profiling.
Three types of REMUS AUVs were used operationally at LEO in 1998.
The REMUS Docking Vehicle equipped with a docking nose
successfully completed numerous docking tests.
A REMUS Survey Vehicle equipped with upward/downward looking ADCPs
and a CTD completed 15 cross-shelf survey sections, including
a 60 km, 12 hour duration mission.
A REMUS Turbulence Vehicle further equipped with fast response
CTDs, shear probes and thermistors completed 4 missions
to observe turbulent fluctuations at the millimeter scale.
The Webb Glider AUVs equipped with CTD/Fluorometers will be added in 1999.
By cycling their buoyancy between positive and negative, the Gliders
can fly in a sawtooth pattern, collecting upwards of 200 CTD casts per day
for several weeks.
The Glider is designed to patrol the offshore boundary, and at regular
intervals, fly to within RF-Modem range, upload its data
via the RF-repeater on the meteorological tower, then download
a new mission profile for the next interval.
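The quoted yield of upwards of 200 casts per day follows from simple sawtooth arithmetic; the depth and vertical speed below are assumed values for illustration, not Webb specifications:

```python
depth = 20.0      # m, assumed typical shelf depth on the glider's track
w = 0.10          # m/s, assumed vertical speed along the sawtooth

cast_time = 2.0 * depth / w          # one down-and-up cycle, seconds
casts_per_day = 86400.0 / cast_time  # 216 casts/day at these values
```

In deeper water or at lower vertical speeds the daily yield drops proportionally, which is why multi-week endurance matters more than speed for this platform.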
Accomplishments
Extensive infrastructure and an open data policy
have fostered broad participation
in LEO-15 research projects by the scientific community.
Over 60 researchers from over 25 institutions
are currently funded for LEO-15 related research projects.
NOPP partners include Woods Hole Oceanographic Institution,
Naval Undersea Warfare Center, CODAR Ocean Sensors,
RD Instruments, Webb Research Corporation and the US Geological Survey.
The largest research programs are associated with
studies of coastal upwelling and its interdisciplinary
implications in the summer, and sediment transport in the fall.
Figures 12 and 13
illustrate typical monitoring and adaptive sampling
data acquired during the summer
1998 coastal upwelling experiments.
Figure 12 (top) shows the yearly cycle of warming and cooling
observed in the 1998 bottom temperatures collected
by the LEO-15 nodes.
The largest variations in the seasonal cycle are caused by the
summertime upwelling events, such as the one entering a relaxation
phase on July 23.
The surface current and temperature nowcast for July 23
(Figure 12, bottom)
indicated that the upwelling jet was meandering
around a cyclonic eddy embedded within the cold upwelling center.
This data-based nowcast, the model forecast for continued
upwelling, and sensitivity runs showing a dependence on the
turbulence closure scheme on the cold side of the front
were used to define three cross-shelf sampling transects.
A ship-towed SWATH ADCP and an undulating CTD/Fluorometer (Creed et al., 1998)
were sent to patrol
the transect just north of the eddy center,
and a REMUS
survey vehicle was sent to patrol the transect just south.
The REMUS turbulence vehicle was sent directly into the eddy center
to observe the changing turbulence characteristics
as the vehicle drove out of the eddy and crossed the upwelling front.
The alongshore current component
(Figure 13a, color contours)
acquired by the REMUS survey vehicles not only
indicates that the northward-flowing upwelling jet on the offshore
side is confined to the upper water column,
it also reveals a southward-flowing, subsurface jet on the nearshore side.
The systems towed along the northern transect uncovered a similar
velocity structure (Figure 13b).
The offshore jet was confined to the warm water
above the thermocline (Figure 13c),
and the nearshore jet was found within the cold water
of the upwelling center.
The corresponding fluorometer section (Figure 13d) indicates that
the highest phytoplankton concentrations of the season were located
within the subsurface jet, leading to the hypothesis that phytoplankton
concentration increases within the upwelling center may
be dominated by advection from the north.
The above example illustrates how adaptive sampling strategies
are transformed in a well-sampled ocean.
When spatially-extensive, rapidly-updated, real-time data are available,
forecasters can compare the developing trends
in their model-generated forecast with the developing trends in the
observations to see where the model is staying on track,
and where it is drifting off track.
Adaptive sampling is no longer guided solely by the model results,
but instead by data-based nowcasts
and model-generated forecasts.
The goal of adaptive sampling also changes.
In under-sampled regions, errors in model
generated forecasts are usually dominated by errors in
an under-resolved initial condition.
In a well-sampled region, errors in the ocean forecast
may instead be dominated by
imperfect model physics, such
as unparameterized turbulent mixing mechanisms.
Instead of focusing on improving model initializations,
adaptive systems can shift their focus to sampling regions
where the physics is poorly understood and the results
are sensitive to changes in their numerical parameterizations.
The adaptive sampling data sets can be used for model verification,
to help improve model physics, or for assimilation,
to help keep the model on track despite the imperfect physics.
Observations like those illustrated above are displayed on the LEO
Website in real-time. The datasets also can be downloaded from the Website
using the Rutgers Ocean Data Access Network (RODAN).
Access to the LEO Website has been continuously tracked since 1995.
One measure that can be unambiguously defined over this long
time period is the number of discrete files
(html page, gif image, etc.) the web server
sends out to a user's browser.
The number of hits, by this definition, has a yearly cycle that peaks in
the summer and doubles each year.
The 1998 maximum reached 33,000 hits per day.
Over 70% of the Web hits are from commercial Internet service providers,
as opposed to government and educational institutions.
One of the most important users outside of the research community is
the Project Tomorrow K-12 educational outreach program.
Through Project Tomorrow,
thousands of teachers have been introduced to LEO-15,
with over 600 participating in training sessions lasting up to a week.
Over 45 teachers have participated in the design of Web based
lesson plans that use the LEO-15 data.
This year, over 12,000 students will be using the LEO Website
through the Marine Activities Resources & Education (MARE) program.
LEO-15 has emerged as a valuable validation site for new instrumentation,
in particular AUVs and remote sensing systems.
The first operational missions for the REMUS Docking, Survey
and Turbulence Vehicles were conducted at LEO during the summer of 1998.
The Webb Coastal Electric Glider will undergo its first
field trials at LEO during the summer of 1999.
LEO was chosen as a NOAA site for the validation of RADARSAT
surface roughness imagery, and is one of three Navy
validation sites for the hyperspectral
NEMO satellite scheduled for launch in 2000.
Several aircraft
are scheduled for overflights,
including two hyperspectral sensors (AVIRIS and PHYLS)
and the microwave salinity mapper (SLFMR).
Developers of the two proposed aircraft altimeters (D2P, Bistatic GPS)
and the proposed floating bistatic CODAR HF-Radar systems
(Kohut et al., 1999)
have requested to use LEO as their first test site.
5. Future Sensors and Platforms
An increasing number of ocean color imagers will likely be available in the
next decade. Of special interest, the Coastal Ocean Imaging Spectrometer (COIS)
aboard the Navy NEMO satellite (scheduled for launch in 2000) will acquire
high spectral resolution (~1 nm) measurements with spatial resolution down to
30 m (1 m panchromatic) in selected coastal regions. NEMO,
similar future satellites
(EOS AM-1 (MODIS),
IRS-P4 (OCM), ADEOS-II (GLI), HY-1 (COCTS))
and
aircraft should revolutionize how we observe the coastal ocean's optical
and biological properties.
Just as the direct-broadcast AVHRR and SeaWiFS data are acquired today
by hundreds of ground stations worldwide,
the next generation of high-resolution, hyperspectral satellites
will require the proliferation of X-band satellite dishes to
acquire the real-time full-resolution data. Data recorded on-board these
satellites and later downlinked to a central receiving station
is both delayed in time and degraded in resolution.
Offshore water level measurement has in the
past been accomplished using bottom
pressure sensors, which also required
measurements of water density over the water
column. Such measurements could not be
referenced to any vertical datum. Since they
sit on the bottom far from shore,
real-time continuous operation and maintenance are
extremely difficult and expensive. A better alternative is to use real-time
kinematic (RTK) GPS on a buoy to
measure water level. One immediate advantage is
that the measurements are made relative
to a reference datum (the ellipsoid).
Real-time communication and maintenance should
be simpler and less expensive, and one
should be able to take advantage of
"buoys of opportunity". Problems presently being
worked on include: large power requirements,
handling buoy tilt (due to waves) and buoy drawdown (due to currents),
and accuracies related to the distance from the
nearest continuously operating GPS reference station.
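The tilt problem above can be illustrated with a simple geometric sketch. In this illustrative calculation (the function name, values, and sign conventions are hypothetical, not those of any operational system), a GPS antenna mounted a fixed distance above the buoy waterline contributes less vertical offset when the buoy tilts, so neglecting tilt biases the derived water level:

```python
from math import cos, radians

def water_level(antenna_ellipsoid_height_m, antenna_offset_m, tilt_deg=0.0):
    """Water level relative to the ellipsoid from an RTK GPS buoy fix.

    antenna_ellipsoid_height_m -- RTK solution for the antenna position
    antenna_offset_m           -- antenna height above the waterline, buoy upright
    tilt_deg                   -- buoy tilt from vertical (e.g., wave-induced)

    The vertical projection of the antenna offset shrinks to
    offset * cos(tilt), so ignoring tilt biases the level by
    offset * (1 - cos(tilt)).
    """
    return antenna_ellipsoid_height_m - antenna_offset_m * cos(radians(tilt_deg))

# A 3 m antenna offset and a 10 degree tilt bias the level by ~4.6 cm,
# already significant for tidal-datum work.
bias = water_level(-20.0, 3.0, 10.0) - water_level(-20.0, 3.0, 0.0)
```

In practice the correction would be averaged over wave cycles using measured tilt, and buoy drawdown would add a further current-dependent term; this sketch only shows why even modest tilts matter at the centimeter level.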
Another important application
of RTK-GPS, although not for permanent real-time applications,
is for the measurement
of water levels over an area using
GPS on a ship moving in transects across the bay.
This has applications for verifying and
calibrating numerical hydrodynamic models
(which typically have had only data from shore tide gauges)
and in support of the
hydrographic surveys that obtain depth soundings for nautical charts.
Remote sensing aircraft, both piloted and autonomous, are under-utilized
for adaptive sampling.
At present, there are no known aircraft providing real-time remote sensing
data to coastal observation networks.
An especially noteworthy aircraft application is the observation of
coastal sea levels.
The role of altimeters switches from the observation of sea surface
height differences
associated with geostrophic currents in deep water to monitoring tidal
elevations in shallow water. Satellite orbits with their long repeat
intervals and wide groundtrack spacing are not well suited for the
rapid observation of spatially
varying tides in shallow water, but an aircraft based altimeter
could adaptively sample a critical transect several times over
a semi-diurnal tidal period.
Two types of aircraft altimeters are being designed and built.
The Delay Doppler Phase Monopulse (D2P) Altimeter uses a phased array
to measure sea surface height, wave height and wind speed to within 1 km
of the coast with a 250 m along-track and 1000 m cross-track resolution.
The bistatic GPS altimeter is based on the observation of GPS signals
reflected from the ocean surface.
New sensors that can provide
synoptic coverage with reasonable resolution may
also be important for various applications
(oil spill movement, pollutant transport,
larval transport, etc.). Currents vary considerably in space,
both vertically and
horizontally, so that measurements at one point in space can be inadequate.
Currents
are greatly affected by bathymetry, which itself can change due to shoaling,
dredging, and other causes.
HF-Radars are now an accepted technology for synoptic observations
of surface current fields.
HF-Radar systems are constructed either as phased arrays
(two lines of antennas, one transmitting and one receiving, set up along a beach)
or as direction-finding systems (a broadcast monopole and a cross-loop receiver).
Phased arrays were originally single frequency (e.g., OSCR), but new
four-frequency systems are being used with the ultimate goal of estimating
current shear.
Dual-frequency microwave
radar systems are also being tested for high resolution
applications within bays and harbors.
The direction finding
CODAR HF-Radar systems are configured for several range/resolution
combinations.
A long-range version demonstrated at Scripps in 1999
is capable of reaching up to 300 km offshore.
A high-resolution version is currently deployed
in San Francisco Bay where every 20 minutes it can generate a 100 m
resolution surface current map across a 4 km wide heavy shipping area.
Two factors contributing to the ever-widening use of
HF-Radar systems are the growing number of successful validation studies
and the reductions in cost through mass production.
For example, a single CODAR site now only costs about 1/3 more
than a single ADCP cost 5 years ago.
Any present HF-Radar system, however, is limited in its ability to
observe the near-shore, due to interference from land, and the offshore,
due to the power required to achieve an adequate signal-to-noise ratio.
Increasing the number of HF-Radar systems alongshore simply broadens
the alongshore coverage by the system spacing,
without improving coverage inshore or offshore.
CODAR HF-Radar observations can be extended in all directions
using a proposed bistatic array
in which a second omni-directional
transmitter is deployed offshore on a buoy, and the motion sensitive
receiver remains on land. The resulting elliptical coordinate system
retrieves current speeds along hyperbolas that extend farther
offshore and alongshore, and all the way into the coast.
A bistatic HF-Radar system is especially well suited to monitor the flow
in and out of inlets, where shore based systems may only provide
estimates of flow across the inlet,
and in heavily used ports, where simple
transmitters can be placed out of the way on building roofs or bridge tops.
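The bistatic geometry can be sketched with a short calculation (coordinates and names here are purely illustrative, not from the Kohut et al. design). The measured Doppler tracks the rate of change of the transmitter-to-patch-to-receiver path, so the system senses the current component along the local normal to the range ellipse, i.e., the bisector of the two look directions:

```python
import math

def bistatic_component(tx, rx, pt, current):
    """Current component (m/s) a bistatic HF-Radar pair senses at patch `pt`.

    tx, rx  -- (x, y) positions of the transmitter and receiver
    pt      -- (x, y) position of the scattering patch
    current -- (u, v) surface current at the patch

    The Doppler shift tracks the rate of change of the path
    tx -> pt -> rx: the projection of the current onto the average of
    the unit vectors pointing from the patch back toward tx and rx
    (the local normal to the range ellipse).
    """
    def unit(a, b):
        dx, dy = a[0] - b[0], a[1] - b[1]
        r = math.hypot(dx, dy)
        return dx / r, dy / r

    ux_t, uy_t = unit(tx, pt)   # toward transmitter
    ux_r, uy_r = unit(rx, pt)   # toward receiver
    nx, ny = (ux_t + ux_r) / 2.0, (uy_t + uy_r) / 2.0
    return current[0] * nx + current[1] * ny

# Monostatic limit: co-locating tx and rx recovers the usual
# radial component measured by a shore-based system.
v = bistatic_component((0, 0), (0, 0), (1000, 0), (-0.5, 0.2))
```

Separating the transmitter (e.g., on an offshore buoy) rotates the measured component away from the shore radial, which is why a bistatic pair can resolve flow through an inlet that a shore-only system sees nearly broadside.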
Horizontal acoustic Doppler current profilers (H-ADCPs),
when they are sufficiently developed, will be an excellent
way to observe high-resolution current fields in
real-time. A sufficiently narrow acoustic beam
that can reach out far enough from a
shore site without bottom or surface effects,
can be swept over an area and can thus
measure current shears and eddies.
Such measurements are not limited to the near
surface as with radar. Maintenance of an
H-ADCP mounted on a pier or other shore
site will be much less expensive than that
for an upward looking ADCP installed on
the bottom in the middle of a harbor and
connected to shore by cable or some other
method. That is reason enough to push for
faster development of H-ADCPs (which means
primarily finding reasonably inexpensive ways to produce narrow beams).
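The beam-narrowing difficulty above can be quantified with the standard diffraction rule of thumb (this is a back-of-envelope sketch with illustrative numbers, not a transducer design): beam width in radians is roughly wavelength divided by aperture, so narrow beams at useful frequencies demand physically large, and therefore costly, transducers.

```python
from math import radians

SOUND_SPEED = 1500.0  # m/s, nominal value for seawater

def aperture_for_beamwidth(freq_hz, beamwidth_deg):
    """Approximate transducer aperture (m) for a given acoustic beam width.

    Uses the diffraction rule of thumb
    beamwidth (rad) ~ wavelength / aperture.
    Real transducers include array-shading factors, so treat this
    as an order-of-magnitude estimate only.
    """
    wavelength_m = SOUND_SPEED / freq_hz
    return wavelength_m / radians(beamwidth_deg)

# A 1-degree beam at 300 kHz calls for an aperture near 0.3 m;
# halving the frequency (for longer range) doubles the required size.
d = aperture_for_beamwidth(300e3, 1.0)
```

The estimate makes plain why inexpensive narrow beams are the pacing item for H-ADCPs: range and beam width both push the aperture, and hence the cost, upward.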
A variety of new in situ platforms have been or are being developed.
Most mooring activities utilize
instrument packages at fixed depth. However, some measurement programs have
utilized moored autonomous profilers. Winching mechanisms, motor driven
"wire-crawlers", and programmed buoyancy modification devices are being
used for this mode of sampling. An advantage of profilers is that they can
provide excellent vertical resolution; however, coarser temporal resolution
is a drawback.
Autonomous underwater vehicles (AUVs) were mentioned earlier. It should be
noted that several different AUV designs are being actively pursued.
These range from relatively inexpensive (virtually expendable) AUVs
that could carry moderate payloads of sensors, to more elaborate AUVs
capable of carrying larger, more expensive instrumentation.
A new class of AUVs are the gliders, which change their buoyancy,
and use wings to convert the vertical motion to horizontal.
Typical glider horizontal speeds are on the order of 1 knot.
In the deep ocean, buoyancy changes are created by phase changes of
a material caused by the large temperature differences.
In shallow water, the required buoyancy changes are quicker and larger,
requiring an electric pump to increase or decrease the size of a buoyancy
bladder. The first prototype Coastal Electric Glider will carry CTDs
and Fluorometers. Gliders are designed for long duration, low power
missions, where a precise path is not required (due to their low speed
relative to potential currents).
Glider AUVs nicely complement the short-duration missions of
propeller-driven AUVs, which feature precise navigation and
higher-power payloads.
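The glider's conversion of vertical motion to horizontal speed can be sketched with simple steady-glide kinematics (the values and function name are illustrative; drag and angle of attack are ignored):

```python
from math import tan, radians

def horizontal_speed(vertical_speed_ms, glide_angle_deg):
    """Steady-glide horizontal speed (m/s) from the buoyancy-driven
    vertical speed and the glide-path angle below the horizontal.

    For a straight glide path, tan(angle) = vertical / horizontal,
    so shallower glide angles trade descent rate for forward progress.
    This kinematic sketch ignores drag and angle of attack.
    """
    return vertical_speed_ms / tan(radians(glide_angle_deg))

# A 0.15 m/s buoyancy-driven vertical speed on a 25-degree glide path
# yields roughly 0.3 m/s horizontally -- the same order as the ~1 knot
# (~0.5 m/s) speeds quoted for gliders.
u = horizontal_speed(0.15, 25.0)
```

The low speed also explains the mission profile described above: because typical shelf currents can match or exceed the glider's speed through the water, precise track-keeping is impractical and missions are planned as long-duration patrols instead.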
New sensors must be developed for particular parameters
(especially biochemical) that
so far have been difficult to measure
in situ or remotely, thus preventing their
inclusion in real-time continuously operating systems.
These include bacteria, viruses,
phytoplankton and zooplankton by amount and species,
nutrients, spectral optical properties, contaminating chemicals, etc.
There are several emerging optical, chemical, and acoustic sensors and
systems that are beginning to be used for these purposes.
Many of these are being designed for autonomous deployment. In
particular, optical systems are being developed to increase the number of
variables (e.g., volume scattering function, spectral excitation and
emission parameters, etc.) and the spectral resolution is being improved
(e.g., down to 1-2 nm in some cases). Special devices for more directly
determining primary productivity (pump and probe type fluorometers) are
also becoming available. More capable chemical sensors and systems are
likewise increasing the suite of variables that can be measured
autonomously. Examples include reagent-based and optically-based systems
using colorimetry principles. Fiber optic chemical sensors have been used
largely for shipboard measurements, but are also beginning to be used for
autonomous systems as well. Microelectromechanical Systems (MEMS) are a
relatively new technology used to make and combine
miniaturized mechanical and electronic components
out of silicon wafers using micro-machining. MEMS have shown encouraging
results for sensing physical parameters, but work is needed to
realize their full potential for chemical sensing. Most work with MEMS has
been done in laboratories; however, transitioning to in situ applications
seems feasible. Potential advantages of MEMS include: auto-calibration,
self-testing, digital compensation, small size, and economical production.
Water samplers are also being developed to capture water for measurements
of trace metals as well as radiocarbon-based primary productivity.
Multi-frequency acoustical systems are becoming more accessible and will
likely be improved in terms of spectral resolution and portability.
Interpretation of acoustical as well as optical (optical plankton counter)
zooplankton records is improving with new image identification
capabilities.
6. Difficulties and Limitations of Present
Observation Networks
6a. Support
The technology for coastal observing systems has moved far
beyond the feasibility and demonstration phase. A series of
national workshops has developed a set of compelling
justifications, presenting the promise of coastal ocean nowcasts
and forecasts and the value of observing systems to research,
monitoring, and education. The delivery of visualized, real-time
information from the coastal ocean on the Internet has built
support among a broad range of users of this information. Yet
despite this apparently rosy state of affairs, the establishment
of stable base support for these systems has, with a few
exceptions, lagged. This lag is perhaps an expected aspect of
the development phase of these programs. Many of the existing
programs have been initiated within the research community, with
sufficient funding to reach the demonstration stage. At this
point, neither a parent agency nor the long-term funding has been
identified to make the transition to full operational monitoring
and forecasting. Long-term monitoring has always been
difficult to sustain because, without long-term records already
in hand, their value is not often clear to funding agencies, even
when an active, continual analysis system is built into the
system. Furthermore, the initiators of these programs naturally
stress the vision to establish support, but seldom have these
systems evolved to the state where the envisioned coverage or
products are at full operational delivery. If this gap between
promise and reality is sufficiently large, there is risk of
backlash in community support.
It is a classic chicken-or-egg dilemma: without the
transition to operational mode, users cannot rely on the
continuity of either monitoring or forecasts, and therefore do
not come to depend on either. Even if the transition is made,
there is a natural lag time between delivery of a product and the
acceptance and expansion of applications within the user
community. Continuous, high-frequency information may be
delivered to monitoring agencies, but analytical techniques are
seldom in place at the outset to incorporate this information
into traditional sampling programs.
Part of the difficulty in establishing support for these
systems is that the initiators have typically built an observing
system funding base as a house of many cards. The system is
developed to demonstration stage via an assemblage of many
smaller projects supported by a broad range of sources. Even if
this assemblage can be made somewhat stable, the transition to
operational mode is made difficult because no one agency feels
ownership, especially after the fact, when the system shape and
identity has been developed without the participation of the
agency. Most systems are multipurpose by design, to develop the
widest user base for such a substantial undertaking. And often,
these purposes have very different time scales of interest,
ranging from nowcasts and short-term forecasts, to monitoring of
long-term ecosystem change. Rationales and justifications are
seldom so compelling that they speak uniformly to the entire
range of users. Operational funding appears most stable in
systems such as the Texas Area Buoy System (TABS) or PORTS where
the primary purposes and funding support are singular. TABS is
funded by a unified agency for enhancing the ability to respond
effectively to oil spills. PORTS is supported initially by NOAA,
then by maritime interests in the local ports.
Regardless of the reasons for the lag between the initiation
of observing systems and their transition to operational support,
the delay often leaves the systems in a somewhat
fragile state. Salaries must be maintained for personnel with a
range of expertise, from mooring technicians to electronics
technicians, to programmers skilled with visualization and web
communication techniques. Ship time for maintenance cruises and
instrumentation replacements are other substantial budget items.
All too often, these costs are borne by scientific or development
grants, which sometimes contribute a disproportionate amount of
support to sustain the system. Also too often, the scientists
leading the effort are diverted too far from science in search of
sustaining funds. An aspect of this mode of funding, depending
on the fortuitous overlap of numerous small projects contributing
to the whole, is the large fluctuation in support level. This
fluctuation leads to inefficiencies and sometimes to difficulty
in retaining skilled and trained personnel.
6b. Instrument Calibration
For time series used to study long-term (often small) environmental
changes due to climate change or anthropogenic effects,
it is very important that the
instrumentation has maintained a consistent calibration over the entire time
series. Without this consistent calibration
there is no way to know that changes
seen in the data are due solely to real-world
changes and do not include changes in
the sensor calibration. Likewise, differences seen in
data from sensors at two
different locations must not include
differences in the calibration of the two
sensors. It will be important to establish
consistent calibration methods (and means
of accessible documentation)
for all sensors in a coastal GOOS. It will also be
important to fund groups to assess the calibration
of past data time series that
will be used in conjunction with newer data
in the determination of long-term trends
and similar analyses.
An example where this issue has already been examined
is sea level change based on
data obtained from tide gauges over the past century.
A critical consideration here
was maintaining a consistent reference
(of the water level data) to the land. For
decades this was accomplished in the U.S.
by the careful leveling of each tide gauge
to bench marks (usually ten) installed in solid rock
and other immovable objects.
For the float-in-a-well type gauges used for decades,
this was actually done by
leveling from the bench marks to a tide staff next to the gauge,
and having an
observer make manual simultaneous observations from the
staff to be compared with the
tide gauge measurements.
The modern acoustic water level gauges now being used by
NOS and other groups allow direct leveling from the bench marks
to the end of
the transducer. This brings up the second critical consideration, i.e., the
comparison between the old and new methods of water level measurement, and any
possible difference that might have an effect on
long-term trends obtained from
analyzing the data series whose first part
came from the old system and whose second
part came from the newer system.
To minimize this problem, NOS ran the two systems
simultaneously at all locations for up to several years. It also studied the
possible long-term effects on the new acoustic system,
such as temperature effects in
the sounding tube.
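The simultaneous-operation strategy above amounts to estimating a datum offset from the overlap period. A minimal sketch (the hourly readings below are hypothetical, not NOS data) computes the mean difference between paired readings and its scatter:

```python
from statistics import mean, stdev

def overlap_offset(old_series, new_series):
    """Mean offset and scatter between two water-level gauges run
    simultaneously (lists of paired readings in metres).

    A stable mean difference can be applied as a correction when
    splicing the two series; drift in the differences over time
    would instead signal a calibration problem in one system.
    """
    diffs = [n - o for o, n in zip(old_series, new_series)]
    return mean(diffs), stdev(diffs)

# Hypothetical paired hourly readings: the new gauge reads ~2 cm higher.
old = [1.02, 1.31, 1.58, 1.44, 1.10]
new = [1.04, 1.33, 1.61, 1.45, 1.12]
offset, scatter = overlap_offset(old, new)
```

With years of overlap, the same comparison can also be binned by season or water temperature, which is how systematic effects such as sounding-tube temperature dependence would be exposed.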
6c. Bio-fouling
Physical and acoustic systems typically have minimal problems in this area
as compared with optical and chemical systems, since materials like
copper-based paints can be utilized. However, conductivity sensors
experience this problem, and even mechanical current meter rotors have been
affected in extreme situations (e.g., barnacles). A large amount of work
has been and is being done to find effective means and methods for reducing
biofouling effects on optical and chemical sensors and systems.
Smooth optical surfaces tend to foul more slowly than rougher surfaces. Liquid
biocides have been found to be relatively effective, notably when allowed
to reside inside optical tubes between sampling. Toxic tablets can also be
released into these tubes. Darkness is also a good condition for
biofouling reduction, so closure of optical (or chemical) sampling volumes
is recommended. In the case of profiling devices, keeping sensors at depth
between profiles is a good strategy. If chemicals (e.g., bromides) are used
with optical systems, degradation of windows through discoloration can be
problematic. Copper is a good material for reducing biofouling due to its
toxicity for phytoplankton and is presently being used in a variety of
ways. For example, copper screens can be used at inlets for flow-through
type devices and copper-based shutters can be used for some optical (e.g.,
radiometers) and chemical (dissolved oxygen) devices. The accumulated
experience of oceanographers doing autonomous sampling suggests that
solutions to biofouling may be quite site-specific and even
dependent upon time of year and specific oceanic conditions (e.g., El Niño,
passages of eddies, etc.).
6d. Power
Offshore platforms are generally not limited by power, are
very stable, and can be manned. However, they do present major measurement
perturbation problems for observations of optical properties dependent on
the ambient light field (apparent optical properties) and many chemical
measurements because of local contamination.
Shipboard sampling
is still critical for many measurements which cannot be done autonomously
and continue to provide excellent vertical (profile mode) and 3-D spatial
(tow-yo) data. However, ship time is very expensive and ships cannot be
used during intense weather and sea-state conditions when often very
important processes are occurring.
Floats and drifters are often adequately powered for their payloads,
but large numbers are generally needed to quantify processes and
many optical, acoustic, and chemical sensors remain too expensive to be
deployed from such expendable platforms.
Moorings and bottom tripods
minimize aliasing and undersampling, but are limited to local sampling,
their expense restricts use to key selected locations, and their
battery life limits their sampling time.
Power remains a serious limitation for long-term autonomous systems,
both stationary and moving,
which require expensive cables, limited solar power, or
short-lived batteries.
Larger buoys are often powered by solar cells, but most coastal
applications require small buoys.
Rechargeable batteries can save significant costs, but their power output
is limited and recharging batteries inside a closed system
runs the risk of explosion.
Higher capacity lithium batteries can provide much more power,
but are expensive and can explode on their own, making shipping hazardous.
Fuel cells are now being considered as an alternative power source for
long-term autonomous systems, including AUVs.
6e. Data Management
As coastal observation systems continue to grow in complexity,
management of the rapidly increasing number
of diverse datasets and the associated metadata is a recognized concern.
Numerous site or even project
specific systems with varying degrees of sophistication are being
constructed or expanded, but no single system has emerged as
the preferred choice for coastal applications.
Data management issues should be no trickier for a coastal
GOOS than for other major
systems or global projects over the past years, but it will obviously take
considerable effort. It is primarily a matter of
setting up the automated procedures
to bring the data at some interval to the relevant national data centers.
These data
should have been quality controlled as much as
possible prior to being sent to the
archives to minimize the efforts of the data centers.
The accumulated historical
archives should reside in that same national data center.
If this is not the
case, resources and partners will be needed to find and quality control such
historical data. This could conceivably
be done on a regional basis with local and
national partners helping the national data centers.
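The automated quality control step described above can be sketched with a minimal range-and-spike check (the flag values and limits below are a hypothetical convention for illustration, not a national data center standard):

```python
def quality_flag(value, lo, hi, spike_limit, previous=None):
    """Assign a simple quality flag before a record is forwarded
    to an archive center.

    Flags (hypothetical convention): 1 = good, 3 = suspect (spike
    relative to the previous sample), 4 = bad (outside the physically
    plausible range).
    """
    if not (lo <= value <= hi):
        return 4
    if previous is not None and abs(value - previous) > spike_limit:
        return 3
    return 1

# Water temperature in deg C: plausible coastal range 0-35,
# flag jumps of more than 5 degrees between successive samples.
flags = []
prev = None
for t in [12.1, 12.3, 29.9, 12.4, -7.0]:
    flags.append(quality_flag(t, 0.0, 35.0, 5.0, prev))
    prev = t
```

Operational systems layer many more tests (stuck-value, climatology, cross-sensor comparisons) and preserve the original values alongside the flags, but the principle is the same: apply the checks automatically at the source so the data centers receive pre-screened records.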
7. Recommendations
7a. Long-term Support for Long-term Measurements
The funding to ensure the permanent operation of coastal ocean
sensors is a difficult
problem. Even national systems run by the federal government,
such as the National Water Level Observation Network
operated by NOS/NOAA and the C-MAN and data buoy
network operated by NWS/NOAA, have had funding problems. The operation of new
real-time systems, such as NOAA's PORTS, depends on partnerships with state
and local
agencies. Partnerships will be the cornerstone of a coastal GOOS,
but Congressional funding should be sought to ensure
the maintenance and operation of a network of core stations considered
most critical for the uses of coastal GOOS. Such
funding should include support for coordinated national standards,
calibration,
maintenance, quality control, and data archiving.
7b. Training a New Generation of Science Support Staff
The observation networks discussed here were built through partnerships between
scientists and engineers.
Scientists themselves can no longer afford the time to be intimately
familiar with the detailed workings of each and every instrument
in the observation networks. There are too many systems
to learn and to maintain.
A new generation of Master's level science
support staff, cross-trained in oceanography and computer science, electronics
or engineering, is emerging to fill the gap.
The super-techs, as they have been called,
are freed from the distractions of raising their support,
and can then concentrate on installing, operating and
maintaining the numerous new systems presently available
or soon to arrive.
Their work is facilitated by instrument developers who provide
easy-to-use interfaces to their instruments so they
can be reprogrammed, adjusted, recalibrated,
error checked, etc., either by a
knowledgeable support person or via a central computer controlling a network.
Public outreach to the K-12 community will promote interest
in oceanography, and the new technology will attract more students to the field.
However, we should not make it our sole purpose to turn
every graduate student we attract into a new Ph.D.
Often it is the Master's level oceanography graduates
with strong technical backgrounds that appear to be
having the most fun.
7c. National Coordination Committee for
Linking and Standardizing Observation Systems
The internet and World Wide Web have made possible the linking of individual
real-time observation systems maintained and operated by a variety of
partners from federal and state agencies, academia, and the private sector.
What is needed (besides
the funding mentioned above) is some type of national committee
that not only links the various web sites,
but also coordinates national standards, calibration
techniques, quality control procedures, data formats, website formats
(for the data),
and other issues that affect the integrated use of the data from all
these different
sites. Even now there are websites from which a user
can be linked to a variety of sites providing real-time
data for a particular region, but seeing these data
displayed nicely on different websites is different
from having easy access to all the
data being displayed, so that they can be used in a model or for some other
application.
Acknowledgements
Scott Glenn is supported by ONR, NOPP and NOAA/NURP,
Bruce Parker by NOAA and NOPP,
William Boicourt by NOPP, and
Tommy Dickey by ONR, NSF, NASA and NOPP.
The authors also thank Michael Crowley for his
help in the preparation of this manuscript.
References
Appell, G.F., T.N. Mero, T.D. Bethem, and G.W. French, 1994. "The
Development of a Real-time Port Information System." IEEE Journal of
Oceanic Engineering, 19(2): 149-157.
Brink, K.H., 1997.
Observational coastal oceanography,
Advances and Primary Research Opportunities in Physical Oceanography
Studies (APROPOS) Workshop, NSF-sponsored workshop on the
Future of Physical Oceanography, 15-17 December, 1997.
http://www.joss.ucar.edu/joss_psg/project/oce_workshop/apropos/presentations/brink.html
Chavez, F.P., J.T. Pennington, R. Herlein, H. Jannasch, G. Thurmond,
and G.E. Friedrich, 1997. Moorings and drifters for real-time
interdisciplinary oceanography, J. Atmos. and Ocean. Tech., 14, 1199-1211.
Creed, E.L., S.M. Glenn and R. Chant, 1998.
Adaptive Sampling Experiment at LEO-15.
OCC '98 Proceedings, Marine Technology Society, November, pp. 576-579.
Dickey, T., 1991, The emergence of concurrent high-resolution physical and
bio-optical measurements in the upper ocean, Reviews of Geophysics, 29,
383-413.
Dickey, T.D., R.H. Douglass, D. Manov, D. Bogucki, P.C. Walker, and P.
Petrelis, 1993, An experiment in duplex communication with a multivariable
moored system in coastal waters, Journal of Atmospheric and Oceanic
Technology, 10, 637-644.
Dickey, T., D. Frye, H. Jannasch, E. Boyle, D. Manov, D. Sigurdson, J. McNeil,
M. Stramska, A. Michaels, N. Nelson, D. Siegel, G. Chang, J. Wa, and A. Knap,
1998a. Initial results from the Bermuda Testbed Mooring Program,
Deep-Sea Res., 771-794.
Dickey, T., A. Plueddemann, and R. Weller, 1998b, Current and water
property measurements in the coastal ocean, The Sea, eds. A. Robinson and
K. Brink, in press.
Glenn, S.M., D.B. Haidvogel, O.M.E. Schofield,
J.F. Grassle, C.J. von Alt, E.R. Levine and D.C. Webb, 1998.
Coastal Predictive Skill Experiments at the LEO-15
National Littoral Laboratory.
Sea Technology, April, pp. 63-69.
Grassle, J.F., S.M. Glenn and C. von Alt, 1998.
Ocean Observing Systems for Marine Habitats.
OCC '98 Proceedings, Marine Technology Society, November, pp. 567-570.
Kohut, J.T., S.M. Glenn and D.E. Barrick, 1999.
SeaSonde is Integral to Coastal Flow Model Development,
Hydro International, April, pp. 32-35.
Parker, B.B., 1996. Monitoring and Modeling of Coastal Waters in
Support of Environmental Preservation, Journal of Marine Science and
Technology, 1(2): 75-84.
Parker, B.B., 1998. Nowcast/Forecast Model Systems for Bays and
Harbors: The Need and the Technical Challenges, Proceedings, Ocean
Community Conference 1998, The Marine Technology Society, pp. 224-229.
Parker, B.B. and L.C. Huff, 1998. Modern Under-Keel Clearance
Management, International Hydrographic Review, LXXXV(2), Monaco,
September 1998.
SCOR, 1993, GLOBEC Report No. 3, 1993, Sampling and Observing Systems,
GLOBEC International, Chesapeake Biological Laboratory, Solomons, MD, ed.
T. Dickey, 99pp.
U.S. JGOFS Planning Report No. 18, 1993, Bio-optics in U.S. JGOFS, eds. T.
Dickey and D. Siegel, U.S. JGOFS Planning and Coordination Office, Woods
Hole Oceanographic Institution, Woods Hole, MA 02543, 180pp.
von Alt, C.J., M.P. De Luca, S.M. Glenn, J.F. Grassle and D.B. Haidvogel, 1997.
LEO-15: Monitoring & Managing Coastal Resources.
Sea Technology, 38, (8), pp. 10-16.
Author Contact Information
Scott M. Glenn
Institute of Marine and Coastal Sciences
Rutgers University
71 Dudley Road
New Brunswick, NJ 08901-8521
732-932-6555 x544
732-932-1821 fax
glenn@caribbean.rutgers.edu
http://marine.rutgers.edu/cool
William Boicourt
Horn Point Environmental Laboratory
University of Maryland
P.O. Box 775
Cambridge, MD 21613
410-221-8426
boicourt@chessie2.hpl.umces.edu
Tommy D. Dickey
Ocean Physics Laboratory
University of California Santa Barbara
6487 Calle Real
Suite A
Goleta, CA 93117
(805) 893-7354
(805) 967-5704 fax
tommy@icess.ucsb.edu
http://www.icess.ucsb.edu/~tom
Bruce Parker
Coast Survey Development Laboratory
National Ocean Service, NOAA
N/CS1, SSMC 3, Room 7806
1315 East West Highway
Silver Spring, MD 20910
(301) 713-2801 x 121
(301) 713-4501 fax
Bruce.Parker@noaa.gov