
PHYSICS & OCEANOGRAPHY

Marine physical data characterize the currents, waves, tides, and other dynamic processes that govern the motion of water in the ocean.

Data Collection

For over a century, the development of instrumentation for measuring oceanic conditions has been motivated primarily by maritime commerce and conflict; in 2022, the global maritime industry transported about 80-90% of all international trade. On the West Florida Shelf, Port Tampa Bay is the largest maritime seaport in Florida; its activities represent several billion dollars of commerce annually and support an estimated 130,000 local jobs. Safe and efficient maritime commerce, search and rescue operations for missing boaters, and modeling of inundation by hurricane storm surges and waves, among other applications, all depend on the ability to specify currents, sea level, and sea state. The multi-faceted goals of protecting the vital economic engine of maritime shipping and managing marine resources have been important motivators for the development of reliable, regular monitoring of oceanographic conditions on the West Florida Shelf, including its seafloor environment.

Pressure (P), temperature (T), and salinity (S) influence many essential environmental conditions of the West Florida Shelf, as well as the operation of the scientific instruments that are critical for understanding seafloor conditions in the region. These quantities affect chemical and biological processes in the ocean; near-bottom currents can resuspend sediments and other settled materials and transport small and planktonic biota. Currents can also transport nutrients and pollutants across the shelf or import them from the deeper Gulf of Mexico onto the West Florida Shelf. Additionally, historical records of these physical quantities can be used to anticipate future conditions. For example, tides can be accurately predicted using harmonic analysis of sufficiently long records, and more skillful forecasts can be made using high-resolution nowcast/forecast numerical circulation models, such as those for the West Florida Shelf and for Tampa Bay, that require accurate measurements of these physical quantities for boundary conditions, veracity testing, and data assimilation.
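As an illustration of the harmonic-analysis approach mentioned above (a minimal sketch, not any operational tide-prediction system), the dominant semidiurnal constituents can be fit to a sea-level record by least squares: the constituent periods are known in advance, and only the amplitudes and phases are estimated from the data. The synthetic record below is an assumption for demonstration.

```python
# Illustrative least-squares tidal fit (assumption: numpy only; the M2 and
# S2 constituent periods are the standard values in hours).
import numpy as np

def fit_tides(t_hours, eta, periods_hours=(12.4206, 12.0)):
    """Fit a mean level plus a cosine/sine pair per constituent period."""
    cols = [np.ones_like(t_hours)]
    for T in periods_hours:
        omega = 2.0 * np.pi / T
        cols.append(np.cos(omega * t_hours))
        cols.append(np.sin(omega * t_hours))
    A = np.column_stack(cols)
    coeffs, *_ = np.linalg.lstsq(A, eta, rcond=None)
    return coeffs, A  # the predicted tide is A @ coeffs

# Synthetic demo: 30 days of hourly sea level with a 0.5 m M2 signal plus noise
rng = np.random.default_rng(0)
t = np.arange(0.0, 24.0 * 30.0, 1.0)
eta = (0.5 * np.cos(2.0 * np.pi / 12.4206 * t - 1.0)
       + 0.05 * rng.standard_normal(t.size))
coeffs, A = fit_tides(t, eta)
m2_amplitude = np.hypot(coeffs[1], coeffs[2])  # recovers roughly 0.5 m
```

A longer record sharpens the fit because it better separates constituents whose periods are close together, such as M2 and S2.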

The most common types of instrument platforms are fixed locations (moorings), mobile platforms (gliders), and coastal oceanographic High-Frequency radars (HFR). Moorings are used to maintain the horizontal position of instruments at selected depths using a system of floats, chains, and cables anchored to the bottom or, in shallow water, rigid structures embedded into the seafloor. Autonomous Underwater Gliders (AUG) are an enhanced version of profiling floats that employ wings, a tail fin, and, optionally, a variable-speed propeller to move through the water column. Driven by a buoyancy pump, they profile the water column in a sawtooth pattern. Gliders are a relatively new, rapidly evolving instrument platform, and their data present new challenges due to their dynamic nature. Shore-based HFRs remotely sense and map surface currents with extensive spatial coverage and at high sample resolution, which enable use in many oceanographic applications as well as for assimilation in numerical ocean circulation models. Surface currents are relevant to seafloor studies because ocean circulation is fully three-dimensional, and surface and near-bottom current variations are dynamically related.

Pressure is most often measured using transducers, electrical devices that convert pressure-induced distortion (such as that experienced by a diaphragm) into electrical signals. Pressure varies predominantly with water depth (D) through the hydrostatic equation. Temperature is typically measured by running a current through a thermistor, a resistor whose temperature response is known. Salinity is computed from measurements of electrical conductivity (C) and temperature following established equation-of-state algorithms. It is often advantageous to combine these sensors into a single CTD instrument. Currents are most frequently measured using an Acoustic Doppler Current Profiler (ADCP), which monitors changes in the frequency of sound wave packets emitted by the instrument and reflected back to it by scatterers in the water column.

The primary sensor on a glider is a streamlined, externally integrated CTD mounted to the hull that samples at one-second (adjustable) intervals as the glider navigates the water column. Additional navigation instruments are a GPS antenna to obtain position during surfacing, a compass for underwater navigation via glider attitude and heading, and an altimeter (sonar) for bottom detection, allowing gliders to swim to within a few meters of the seafloor. The altimeter has seafloor applications: paired with glider depth, it can determine seafloor depth and, through interpolation of a collection of mission data points, yield coarse bathymetry maps. Areas with frequent mission flyovers, especially areas of high interest where a glider might do loops or several passes within a single mission, yield data density sufficient to resolve large-scale features.

HFRs map fields of surface velocity vectors by using the Doppler shift in frequency of radio waves scattered off surface gravity waves. From these returns, a set of radial velocity components is calculated. Given two independent systems deployed at different locations along the shoreline, these radial velocity components are combined to give a field of total surface velocity vectors.
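The combination of radials into totals can be sketched as a small least-squares problem. The geometry convention below (bearings in degrees clockwise from north, pointing from the radar site toward the grid point, with u eastward and v northward) is an illustrative assumption; operational HFR processing applies more elaborate geometric weighting and error screening.

```python
# Illustrative combination of HFR radial components into a total vector.
# Assumption: each radar measures the current component along the line of
# sight; two or more lines of sight determine the full horizontal vector.
import numpy as np

def combine_radials(bearings_deg, radials):
    """Least-squares (u, v) from radial current components."""
    theta = np.radians(np.asarray(bearings_deg, dtype=float))
    G = np.column_stack([np.sin(theta), np.cos(theta)])  # geometry matrix
    uv, *_ = np.linalg.lstsq(G, np.asarray(radials, dtype=float), rcond=None)
    return uv  # (u_east, v_north) in the units of the radials

# Demo: a true current of (0.3, 0.1) m/s seen from two radar sites
u_true, v_true = 0.3, 0.1
bearings = [45.0, 135.0]
radials = [u_true * np.sin(np.radians(b)) + v_true * np.cos(np.radians(b))
           for b in bearings]
u_est, v_est = combine_radials(bearings, radials)  # recovers (0.3, 0.1)
```

When the two bearings are nearly parallel, the geometry matrix becomes ill-conditioned and the total vector is poorly determined, which is why station placement along the shoreline matters.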

Data Processing

Ocean physics data are most commonly sampled at discrete points in time. Time series data are useful for quantifying variations over a time span, either short-term variations, for instance due to winds and tides, or longer-term variations due to seasonal cycles, El Niño/La Niña phases, or climate change. Temperature, salinity, and current data are extremely useful for validating computer models, either after a simulation has completed or during a simulation by assimilating data directly and correcting the model as it runs. Once data have been collected and verified, the challenges may be to explain anomalies, to reconcile differences between regions and/or time periods, or to quantify previously unknown physical processes. Errors in received data occur and typically require semi-automated or manual detection and correction customized for each data type. Strictly automated data processing is not advised, as conditional testing (e.g., for values outside a user-defined acceptable range) may not detect all forms of error or, conversely, may exclude rare oceanographic phenomena that produce values outside the specified bounds.

The first step for temperature and salinity data is often to construct a time vector and a data vector from the measurements. Then, the continuity of the time base is checked for duplicate values, jumps, and outliers. Duplicate values should be flagged or removed, as should outliers beyond the instrumental range or the understood limits of local physical processes; the latter requires expert input for each variable type. Outlier detection typically starts with flagging values beyond some threshold, but a high-pass filter can first be applied to remove the relatively low-frequency signals produced by physical processes so that spikes stand out more clearly. An additional issue arises with gliders. While temperature and pressure measurements obtained by the CTD on a glider are generally reliable, the measured salinity can require customized post-processing. In some glider models, there can be a significant thermal lag in the conductivity sensor when traversing rapid temperature changes at the thermocline during a dive or climb. Issues associated with thermal lag can be identified for individual glider models through the manufacturer.
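A minimal sketch of these screening steps follows. The valid range, window length, and spike criterion are placeholder values; as noted above, real thresholds would be tuned per instrument and site with expert input, and flagged points would be reviewed rather than silently discarded.

```python
# Illustrative screening for a scalar time series (e.g., temperature, deg C).
# Thresholds are placeholders for demonstration only.
import numpy as np

def qc_series(t, x, valid_range=(-2.0, 40.0), spike_std=4.0, window=25):
    """Return a boolean 'good' mask; flags duplicate times, out-of-range
    values, and spikes on a high-passed (running-median-removed) series."""
    t = np.asarray(t, dtype=float)
    x = np.asarray(x, dtype=float)
    good = np.ones(x.size, dtype=bool)
    # Duplicate or non-increasing timestamps: keep the first occurrence
    dup = np.zeros(t.size, dtype=bool)
    dup[1:] = np.diff(t) <= 0.0
    good &= ~dup
    # Gross range check against instrumental/physical limits
    good &= (x >= valid_range[0]) & (x <= valid_range[1])
    # High-pass by removing a running median, then flag large residuals
    half = window // 2
    med = np.array([np.median(x[max(0, i - half):i + half + 1])
                    for i in range(x.size)])
    resid = x - med
    mad = np.median(np.abs(resid - np.median(resid))) * 1.4826 + 1e-12
    good &= np.abs(resid) <= spike_std * mad
    return good

# Demo: a constant record with one duplicated timestamp and one in-range spike
t = np.arange(100.0)
t[10] = t[9]           # duplicated time stamp
x = np.full(100, 20.0)
x[40] = 25.0           # within the valid range, but a spike
mask = qc_series(t, x)  # flags indices 10 and 40
```

Note that the spike at index 40 passes the range check and is only caught by the high-pass step, which is exactly the failure mode of purely range-based automated screening described above.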


Due to the sampling method of ADCPs, the quality control procedure for current data has some initial steps that are not required for temperature and salinity data. ADCPs measure ocean current speed and direction, generally at multiple depths, by emitting ultrasonic sound waves at a fixed frequency from a collection of emitters. As the sound propagates, it is reflected off suspended particles moving with the currents back to a receiver co-located with the emitters. The movement of the particles creates a Doppler shift, altering the frequency of the reflected waves, which is interpreted by the receiver and transformed into current speed and direction. Multiple emitters are required to calculate the three-dimensional current. The return time translates to distance and then to depth in the water column; averages of return signals are taken over discrete ‘bins’ of depth, providing current speed and direction at multiple depths. Time series of a velocity component are then used to detect, and sometimes correct, potential problems with the measured values. As mentioned above, no standard protocol exists, and every institution or operator can adopt alternate methods. Interference due to reflections from the surface, from the bottom, or from any mooring cables or instruments between the ADCP and the bottom must be identified and flagged or removed. This is generally done by examining the reflection amplitude or the computed current speeds for unrealistic values or decreased signal-to-noise ratios.
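The time-to-depth conversion and the boundary-interference zone can be sketched as follows. The sound speed and beam angle here are nominal assumed values, not settings from any particular instrument; in practice sound speed varies with temperature, salinity, and pressure.

```python
# Illustrative geometry for an ADCP with tilted beams. Assumed nominal
# values; real deployments use measured sound speed and the instrument's
# actual beam angle.
import numpy as np

SOUND_SPEED = 1500.0           # m/s, nominal
BEAM_ANGLE = np.radians(20.0)  # typical beam tilt from vertical

def bin_range(return_time_s):
    """Two-way travel time -> vertical distance from the transducer."""
    along_beam = SOUND_SPEED * np.asarray(return_time_s, dtype=float) / 2.0
    return along_beam * np.cos(BEAM_ANGLE)

def sidelobe_cutoff(distance_to_boundary_m):
    """Bins farther than about cos(beam angle) times the distance to the
    surface (or bottom) are contaminated by sidelobe reflection off that
    boundary and should be flagged."""
    return distance_to_boundary_m * np.cos(BEAM_ANGLE)

# Demo: a 0.02 s two-way return corresponds to about 14.1 m vertically; for
# an upward-looking ADCP 50 m below the surface, bins beyond about 47 m
# would be flagged for surface interference.
depth_20ms = bin_range(0.02)
cutoff_50m = sidelobe_cutoff(50.0)
```

This is why the bins nearest the surface (for an upward-looking instrument) or the bottom (for a downward-looking one) are routinely discarded during quality control.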

Finally, hourly HFR radial data are processed using manufacturer-supplied software.

Data Management

There are many data formats useful for archiving and distributing oceanic data, including comma-separated values (.csv), Microsoft Excel (.xlsx), dBase (.dbf), NetCDF (.cdf or .nc), Matlab (.mat), and plain text (.txt). NetCDF is often preferred for processed data because the format is suitable for submission to federal archives such as the National Centers for Environmental Information. NetCDF is a widely used, machine-independent binary format that is self-describing and does not require proprietary software to access. The University Corporation for Atmospheric Research (UCAR) leads NetCDF standards development and updates. The set of NetCDF conventions called the Climate and Forecast (CF) data and metadata standards is commonly used for oceanographic data from the West Florida Shelf and many other locations. This standard generally divides a file into two sections: a header containing metadata such as data dimensions, names, types, valid ranges, reference time, and funding or program information, followed by a data section containing the values of the variables described in the header, which may include arrays for the times of measurement and the spatial coordinates of the data. IOOS provides a free CF-compliance checker (IOOS Compliance Checker; https://compliance.ioos.us/index.html).
