Error Analysis Procedures Used by the National Ocean Service to Compute Estimated Error Bounds for Tidal Datums in the Arctic Ocean

Error Analysis Procedures Used by the National Ocean Service to Compute Estimated Error Bounds for Tidal Datums in the Arctic Ocean PDF Author: Michael F. P. Michalski
Publisher:
ISBN:
Category : Error analysis (Mathematics)
Languages : en
Pages : 20

Book Description
"NOAA has an established National Water Level Observation Network (NWLON) along all U.S. coastlines. One purpose of the NWLON is to provide control for tidal datum determination at short-term water level stations installed for hydrographic and shoreline mapping surveys. There are significant gaps in NWLON coverage in Alaska. When short-term (1-12 months) water level stations are installed outside of an NWLON coverage area, a First Reduction (FRED) or arithmetic mean is used for datum determination instead of the preferred simultaneous comparison method that uses a nearby NWLON station to compute a 19-year equivalent National Tidal Datum Epoch (NTDE) datum. The datum error of a FRED is typically greater than the error computed by a simultaneous comparison procedure with an NWLON station. This report describes one method used by NOAA to establish error bounds on FRED tidal datums computed at short-term stations. The standard deviation of monthly Mean Tide level (MTL) at 29 operating and historical water level stations in Alaska with varying time series lengths was used to infer FRED datum errors within the study region. The combined results show that FRED datum errors decrease from 0.120 m, to 0.040 m and 0.008m (one-sigma) for 1, 12, and 228 month time series, respectively. Comparisons across the region show only minor statistical differences, supporting the use of combined values as representing FRED datum errors for the entire study area. These results will help facilitate better estimates of total tide-propagated error and better planning of required subordinate installation time series length in support of hydrographic and shoreline mapping surveys in Alaska"--Executive Summary.

Error Analysis Procedures Used by the National Ocean Service to Compute Estimated Error Bounds for Tidal Datums in the Beaufort Sea, Arctic Ocean

Error Analysis Procedures Used by the National Ocean Service to Compute Estimated Error Bounds for Tidal Datums in the Beaufort Sea, Arctic Ocean PDF Author: William M. Stoney
Publisher:
ISBN:
Category : Beaufort Sea
Languages : en
Pages : 36

Book Description


The Proceedings Of The Coastal Sediments 2015

The Proceedings Of The Coastal Sediments 2015 PDF Author: Jun Cheng
Publisher: World Scientific
ISBN: 981468998X
Category : Technology & Engineering
Languages : en
Pages : 3129

Book Description
This Proceedings contains over 260 papers on cutting-edge research presented at the eighth international Symposium on Coastal Sediment Processes, held May 11-15, 2015, in San Diego, California, USA. This technical specialty conference was devoted to promoting an interdisciplinary exchange of state-of-the-art knowledge among researchers in the fields of coastal engineering, geology, oceanography, and related disciplines, with the theme of Understanding and Working with Nature. Focusing on the physical aspects of sediment processes in various coastal environments, this Proceedings provides findings from the latest research and newest engineering applications. Sessions covered a wide range of topics, including barrier islands, beaches, climate and sea level, cohesive and noncohesive sediments, coastal bluffs, coastal marsh, dredged sediments, inlet and navigation channels, regional sediment management, river deltas, shore protection, tsunamis, and vegetation-sediment interaction. Several special sessions are also featured: relevant science for changing coastlines (a tribute to Gary Griggs); the North Atlantic Coast Comprehensive Study and post-Superstorm Sandy work; long-term coastal evolution; barrier islands of Louisiana; sea-level rise and super storms in a warming world; predicting decadal coastal geomorphic evolution; and contrasting Pacific coastal behavior with the El Niño Southern Oscillation (ENSO).

Implementation of Procedures for Computation of Tidal Datums in Areas with Anomalous Trends in Relative Mean Sea Level

Implementation of Procedures for Computation of Tidal Datums in Areas with Anomalous Trends in Relative Mean Sea Level PDF Author: Stephen K. Gill
Publisher:
ISBN:
Category : Sea level
Languages : en
Pages : 58

Book Description
"NOAA has typically updated tidal datum elevations for the nation to new National Tidal Datum Epoch (NTDE) time periods every 20-25 years. Updates at this frequency are necessary due to long-term global sea level change. In 1998, NOS recognized the need for a modified procedure that utilized more frequent time period updates, for determination of tidal datums for regions with anomalously high rates of local relative sea level change. These localized effects in relative sea level trends are typically due to different forces other than those responsible for global trends which can vary significantly from global trends in both time scales and magnitude. This modified procedure is necessary at selected stations to ensure that the tidal datums accurately represent the existing stand of sea level relative to land on which these datums are held fixed. Bench mark monuments are typically used as reference points for numerous applications requiring tidal datum references. The modified procedure is limited only to those stations with documented anomalous relative sea level trends due to high rates of vertical land motion. Anomalous relative sea level trends are seen along the central Louisiana, the southern Cook Inlet, and the southeastern Alaska coasts. For example, the magnitude of the sea level trends in these areas are +9.24 mm/yr at Grand Isle, LA; -9.45 mm/yr at Seldovia, AK; and -12.92 mm/yr at Juneau, AK. Following the first implementation of the modified procedure in 1998, using the time series for tidal datum computation of 1990-1994, sea level analyses in these anomalous regions are now conducted approximately every five (5) years to identify stations that require datum updates using the modified procedure...The purpose of this technical report is to document the Modified Procedure that has been used by CO-OPS to compute accepted tidal datums for selected regions having anomalously high rates of local relative sea level change. 
Additionally, the report provides an update on near-term plans for continuing to implement the procedure"--Executive Summary.
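The screening step implied above, estimating a station's relative sea level trend and flagging anomalously large values, can be sketched as a simple linear fit. The synthetic series, its noise level, and the 5 mm/yr flagging threshold are assumptions for illustration, not CO-OPS criteria; only the -9.45 mm/yr Seldovia-like rate comes from the summary.

```python
import numpy as np

def rsl_trend_mm_per_yr(years, msl_m):
    """Least-squares linear trend of sea level (m) vs. decimal years, in mm/yr."""
    slope_m_per_yr = np.polyfit(years, msl_m, 1)[0]
    return slope_m_per_yr * 1000.0

# Synthetic monthly mean sea level falling at ~-9.45 mm/yr, roughly the
# Seldovia, AK rate quoted above, plus small random noise.
years = np.arange(1990.0, 2000.0, 1.0 / 12.0)
msl = (-0.00945 * (years - years[0])
       + 0.002 * np.random.default_rng(0).standard_normal(years.size))

trend = rsl_trend_mm_per_yr(years, msl)
anomalous = abs(trend) > 5.0   # hypothetical flagging threshold, mm/yr
```

Stations flagged this way would then be candidates for the more frequent datum updates the report describes.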

A Comparison of Datums Derived from CO-OPS Verified Data Products and Tidal Analysis Datum Calculator

A Comparison of Datums Derived from CO-OPS Verified Data Products and Tidal Analysis Datum Calculator PDF Author: Louis A. Licate
Publisher:
ISBN:
Category : Mathematical analysis
Languages : en
Pages : 18

Book Description
"The NOAA National Ocean Service Center for Operational Oceanographic Products and Services (CO-OPS) has developed a publicly accessible tool to compute tidal datums from water level data with a variety of tidal signals. The Tidal Analysis and Datums Calculator (TAD) uses a Butterworth digital filter to remove high frequency (> 4 cycles/day) water level variability in order to identify tidal high and low waters from observed water level data. Present CO-OPS procedure uses a Curve Fit Manual Verification (CFMV) approach to identify tidal high and low waters. A comparison of high and low water selections at eight long-term NOAA water level stations shows that the mean difference between selections made by TAD and CFMV have a mean bias of 0 at the 1 mm level, and the standard deviations of the differences are all within CO-OPS-accepted data processing error bounds. Instances of major differences (> 0.02 m) between individual high and low water selections are rare and have no significant influence on the resulting datums. The difference in errors associated with tidal datums computed by TAD and CFMV is less than 0.002 m when compared to the published tidal datums at the eight stations. The results here demonstrate that TAD is able to efficiently determine accurate high and low water values without manual verification. Therefore, users of this new tool will be able to generate consistent and reproducible tidal datums that are useful for coastal planning and restoration"--Executive summary.

An Error Analysis of Range-Azimuth Positioning

An Error Analysis of Range-Azimuth Positioning PDF Author: David A. Waltz
Publisher:
ISBN:
Category : Oceanography
Languages : en
Pages : 104

Book Description
Pointing error standard deviations for two theodolites, the Wild T-2 and Odom Aztrac, were determined under conditions closely approximating those of range-azimuth or azimuth-azimuth hydrographic surveys. Pointing errors found for both instruments were about 1.3 meters and were independent of distance. No statistical difference between the errors of the two instruments was found. The accuracy of the interpolation methods used by the National Ocean Service (NOS) for range-azimuth positioning was investigated, and an average inverse distance of about 2.5 meters was observed between interpolated positions and corresponding observed positions. The overall range-azimuth position errors of the two theodolites were then compared to positioning standards of NOS and the International Hydrographic Organization, using assumed ranging standard deviations of 1.0 and 3.0 meters. Both instruments met all standards except the NOS range-azimuth standard for 1:5,000 scale surveys. Interpolated positions may fail to meet more of the standards because of additional inherent error. (Author).
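A back-of-the-envelope combination of the errors quoted above: treat the ~1.3 m pointing error as a cross-track component and combine it with each assumed ranging standard deviation by root-sum-of-squares. The RSS rule is an illustration of how such components are commonly combined, not the NOS/IHO evaluation procedure from the report.

```python
import math

sigma_pointing = 1.3   # m, distance-independent pointing error (both theodolites)

# Combine with the two assumed ranging standard deviations from the summary.
sigma_position = {sigma_range: math.hypot(sigma_range, sigma_pointing)
                  for sigma_range in (1.0, 3.0)}
```

This gives roughly 1.6 m and 3.3 m overall one-sigma position errors for the 1.0 m and 3.0 m ranging cases, respectively.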

Adaptive Error Estimation in Linearized Ocean General Circulation Models

Adaptive Error Estimation in Linearized Ocean General Circulation Models PDF Author: Michael Yurievich Chechelnitsky
Publisher:
ISBN:
Category : Analysis of covariance
Languages : en
Pages : 232

Book Description
Data assimilation methods are routinely used in oceanography. The statistics of the model and measurement errors need to be specified a priori. This study addresses the problem of estimating model and measurement error statistics from observations. We start by applying innovation-based methods of adaptive error estimation with low-dimensional models in the North Pacific (5-60 deg N, 132-252 deg E) to TOPEX/POSEIDON (T/P) sea level anomaly data, acoustic tomography data from the ATOC project, and the MIT General Circulation Model (GCM). A reduced-state linear model that describes large-scale internal (baroclinic) error dynamics is used. The methods are shown to be sensitive to the initial guess for the error statistics and the type of observations. A new off-line approach is developed, the covariance matching approach (CMA), in which covariance matrices of model-data residuals are "matched" to their theoretical expectations using familiar least squares methods. This method uses observations directly instead of the innovations sequence and is shown to be related to the MT method and the method of Fu et al. (1993). Twin experiments using the same linearized MIT GCM suggest that altimetric data are ill-suited to the estimation of internal GCM errors, but that such estimates can in theory be obtained using acoustic data. The CMA is then applied to T/P sea level anomaly data and a linearization of a global GFDL GCM which uses two vertical modes. We show that the CMA can be used with a global model and a global data set, and that the estimates of the error statistics are robust. We show that the fraction of the GCM-T/P residual variance explained by the model error is larger than that derived in Fukumori et al. (1999) with the method of Fu et al. (1993). Most of the model error is explained by the barotropic mode. However, we find that the impact of the change in the error statistics on the data assimilation estimates is very small. This is explained by the large representation error.
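The covariance matching idea can be illustrated in miniature: write the residual covariance as a linear combination of known structure matrices and fit the unknown model- and measurement-error variances by least squares. The exponential structure matrix and the variances below are fabricated for illustration and are not taken from the thesis.

```python
import numpy as np

n = 50
i, j = np.indices((n, n))
A = np.exp(-np.abs(i - j) / 5.0)       # assumed model-error covariance structure

# "Observed" residual covariance built from known variances (toy data).
true_alpha, true_beta = 2.0, 0.5
C_obs = true_alpha * A + true_beta * np.eye(n)

# Match C_obs = alpha*A + beta*I in the least-squares sense.
G = np.column_stack([A.ravel(), np.eye(n).ravel()])
alpha, beta = np.linalg.lstsq(G, C_obs.ravel(), rcond=None)[0]
```

Because the toy covariance is an exact linear combination of the two structure matrices, the fit recovers the variances exactly; with real model-data residuals the match is only approximate, which is the point of the least-squares formulation.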

To Err Is Normable: The Computation of Frequency-Domain Error Bounds from Time-Domain Data

To Err Is Normable: The Computation of Frequency-Domain Error Bounds from Time-Domain Data PDF Author: National Aeronautics and Space Administration (NASA)
Publisher: Independently Published
ISBN: 9781729195796
Category : Science
Languages : en
Pages : 46

Book Description
This paper exploits the relationships among the time-domain and frequency-domain system norms to derive information useful for modeling and control design, given only the system step response data. A discussion of system and signal norms is included. The proposed procedures involve only simple numerical operations, such as the discrete approximation of derivatives and integrals, and the calculation of matrix singular values. The resulting frequency-domain and Hankel-operator norm approximations may be used to evaluate the accuracy of a given model, and to determine model corrections to decrease the modeling errors. Authors: Hartley, Tom T.; Veillette, Robert J.; DeAbreuGarcia, J. Alexis; Chicatelli, Amy; Hartmann, Richard. Glenn Research Center; NCC3-508; RTOP 519-30-53.
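The kind of computation described above can be sketched under simple assumptions: sample the step response of a known system, approximate the impulse response by a discrete derivative, form a finite Hankel matrix from the impulse samples, and take its singular values. The first-order system G(s) = 1/(s+1), the sample step, and the matrix size are illustrative choices, not the paper's examples.

```python
import numpy as np

dt = 0.01
t = np.arange(0.0, 5.0, dt)
step_resp = 1.0 - np.exp(-t)            # step response of G(s) = 1/(s+1)

impulse = np.diff(step_resp) / dt       # discrete derivative -> impulse response

# Finite Hankel matrix H[i, j] ~ h((i + j) dt) dt built from impulse samples.
idx = np.add.outer(np.arange(100), np.arange(100))
H = impulse[idx] * dt

hankel_svals = np.linalg.svd(H, compute_uv=False)
largest = hankel_svals[0]               # approximates a Hankel-operator norm
```

The largest singular value here approximates the Hankel norm over the sampled window; refining dt and enlarging the matrix tightens the approximation, mirroring the paper's point that only derivatives, integrals, and singular values are needed.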

Sea-Level Rise for the Coasts of California, Oregon, and Washington

Sea-Level Rise for the Coasts of California, Oregon, and Washington PDF Author: National Research Council
Publisher: National Academies Press
ISBN: 0309255945
Category : Science
Languages : en
Pages : 274

Book Description
Tide gauges show that global sea level has risen about 7 inches during the 20th century, and recent satellite data show that the rate of sea-level rise is accelerating. As Earth warms, sea levels are rising mainly because ocean water expands as it warms and water from melting glaciers and ice sheets is flowing into the ocean. Sea-level rise poses enormous risks to the valuable infrastructure, development, and wetlands that line much of the 1,600-mile shoreline of California, Oregon, and Washington. As those states seek to incorporate projections of sea-level rise into coastal planning, they asked the National Research Council to make independent projections of sea-level rise along their coasts for the years 2030, 2050, and 2100, taking into account regional factors that affect sea level. Sea-Level Rise for the Coasts of California, Oregon, and Washington: Past, Present, and Future explains that sea level along the U.S. west coast is affected by a number of factors. These include climate patterns such as El Niño, effects from the melting of modern and ancient ice sheets, and geologic processes such as plate tectonics. Regional projections for California, Oregon, and Washington show a sharp distinction at Cape Mendocino in northern California. South of that point, sea-level rise is expected to be very close to global projections. However, projections are lower north of Cape Mendocino because the land is being pushed upward as the ocean plate moves under the continental plate along the Cascadia Subduction Zone. Yet an earthquake of magnitude 8 or larger, which occurs in the region every few hundred to 1,000 years, would cause the land to drop and sea level to rise suddenly.

Physics Briefs

Physics Briefs PDF Author:
Publisher:
ISBN:
Category : Physics
Languages : en
Pages : 770

Book Description