IOC/IASC/IHO Editorial Board for the International Bathymetric Chart of the Arctic Ocean
Report of Meeting Copenhagen, Denmark October 19-20, 1998

APPENDIX E

PROPOSED STRATEGIC PLAN FOR DEVELOPING A MODERN
DATA BASE AND MAP OF ARCTIC BATHYMETRY

The IASC/IOC/IHO Editorial Board for the
International Bathymetric Chart of the Arctic Ocean

October, 1998

INTRODUCTION

This document outlines a general approach for a project to assemble all available bathymetric data north of 64°N, for the purpose of constructing a coherent digital data base and an accurate map of the sea floor. The ideas presented here were developed from information presented at the Arctic Bathymetry Workshop held September 18-19, 1997 in St. Petersburg, Russia (Macnab and Grikurov, 1997) and from subsequent exchanges between members of the Editorial Board.

CANDIDATE DATA SETS

Appendix 4 of the Workshop Report contains encapsulated descriptions of known data sets that were identified as potential components of the proposed data base. Collected, compiled, and archived by various agencies in the countries represented at the Workshop, these data sets for the most part are physically stored in numerous separate locations. They define ocean depths across a broad range of continental shelf and deep sea localities, having been acquired with varying levels of accuracy as point soundings taken through the ice cover, as sequential single-beam observations collected by surface vessels and submarines, and to a lesser extent as surface multibeam measurements.

The combined holdings cover a time span that ranges over the past several decades. With some data sets preserved in digital form and others in analog form, these holdings comprise a substantial mix of storage media and formats: paper sounding records; paper maps inscribed with point observations or contour lines; and digital files holding original observations, processed observations, contours, or grids. Levels of data treatment are highly variable and inconsistent, with some observations having been thoroughly processed for mapmaking and research purposes, and others remaining, to all intents and purposes, in their raw, original form. Some data sets have been placed in the public domain and are readily accessible, while others remain proprietary with restrictions on their release and distribution.

A particular instance of information in the public domain is the published map that represents the best efforts of a group or of an individual to assemble, combine, and interpret all available data. Part or all of the map's constituent data sets may not be releasable into the public domain; however, if sufficient information is available to qualify the map and to provide a measure of confidence in its overall reliability, then the map might still be acceptable as a valid data source for the project.

In addition to the data sets that have already been collected, the US Navy's unclassified SCICEX program has this year mobilized a swath mapping capability, which can be expected to increase significantly the quantity and quality of unclassified observations that will be incorporated eventually in the Arctic data base.

PARTITIONING THE PROJECT ALONG GEOGRAPHIC LINES

Under ideal conditions, a project such as the one envisaged here would be the responsibility of a single organization that possessed both the capability and the resources to perform effectively all the constituent tasks. This approach would have two major advantages: (1) simplified project management and control; (2) consistent treatment of the data sets.

Realistically, however, there are three significant constraints to implementing this approach: (1) while most if not all of the participating organizations possess the technical capacity to undertake the entire project, few if any appear to have at their disposal all the necessary human and financial resources, or to enjoy a formal mandate for assuming total responsibility for an activity of such broad international scope; (2) in the form of original observations, certain data sets have restricted mobility, and are unlikely to be released for free and unrestricted exchange; (3) in certain areas, the results of the undertaking could have significant implications with respect to continental shelf delimitation, giving prospective claimant states a strong incentive to assess original data sets, and to be directly involved in their handling and interpretation.

[Figure 1: generalized Zones of National Interest (ZNIs) and the High Seas]

A proposed solution is to partition the project between the international High Seas and the Zones of National Interest (ZNIs) that pertain to each participating coastal state. These are shown in generalized form in Figure 1. In essence, each coastal state would assume the primary responsibility for assembling and treating proprietary and public data within its ZNI; the High Seas would be treated on a collective basis, using public data only. Where their ZNIs were contiguous, states would be encouraged to exchange data with their neighbours, and to work together closely to ensure a seamless portrayal of the sea floor from one zone to another - hence Canada would share data with the USA and Denmark, Denmark would share with Canada and Norway, and so on. The precise methodology for handling and treating data in the central portion of the Arctic Ocean would require further discussion and definition.

The above plan would respect regional data sensitivities within ZNIs, while promoting the free exchange of information in the regions that fall within the High Seas.

PRODUCTS

As envisaged, the project will develop a range of products subject to varying degrees of distribution:

  • Digital data bases comprising a mix of original observations (public and previously unreleased measurements) collected within each national zone of interest; depending on national policies, these may or may not be released into the public domain for unlimited distribution.
  • A digital data base containing original public observations from the High Seas areas; this will be released for unlimited distribution and periodic updating.
  • Digital bathymetric values distributed over a uniform grid that covers the entire project area (the spacing between grid values and the technique for their derivation will be determined by consultation among all project participants); this information will likely be distributed on CD-ROM, along with original data sets that are deemed releasable for unlimited circulation.
  • Printed map(s) that portray bathymetry in isobath and/or shaded relief form, preferably at a scale of 1:6 million to replicate GEBCO Sheet 5.17, although other scales are possible.
  • Documentation that describes the data sets, the distribution of observations, and the treatments applied to develop the gridded data set. The documentation will be released for unlimited circulation in printed form and as a text/graphic file on the CD-ROM bearing the gridded data.

GENERAL PROCEDURES

As a general rule, observations in digital form are preferable to analog information. However, it is recognized that numerous legacy data sets are likely to exist in analog form only (original sounding records, posted depths, contour maps, etc.), and these should be converted to digital form at an early stage of the operation. Thereafter, a consistent suite of digital techniques should be used to treat data sets at all subsequent stages. These operations include: converting all data sets to a common format; identifying and correcting errors; analyzing observational discrepancies at track intersections; adjusting and re-levelling data sets to achieve agreement where they abut or overlap; and deriving general statistical parameters for qualifying the overall data base.
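
The track-intersection (crossover) analysis mentioned above can be illustrated with a minimal sketch. This is not the Board's adopted procedure, only an assumed simple model: each track is a sequence of (x, y, depth) points in projected coordinates, and the discrepancy at a crossover is the difference between the depths that each track reports, linearly interpolated at the point where two straight segments cross.

```python
def crossover_discrepancy(p1, p2, q1, q2):
    """Depth discrepancy where track segment p1->p2 crosses q1->q2.

    Each endpoint is an (x, y, depth) tuple in projected coordinates.
    Returns depth_a - depth_b at the crossing, or None if the
    segments are parallel or do not intersect within their extents.
    """
    (x1, y1, d1), (x2, y2, d2) = p1, p2
    (x3, y3, d3), (x4, y4, d4) = q1, q2

    # Solve p1 + t*(p2 - p1) == q1 + u*(q4 - q3) for t and u
    denom = (x2 - x1) * (y4 - y3) - (y2 - y1) * (x4 - x3)
    if denom == 0:
        return None  # parallel or collinear segments: no single crossing
    t = ((x3 - x1) * (y4 - y3) - (y3 - y1) * (x4 - x3)) / denom
    u = ((x3 - x1) * (y2 - y1) - (y3 - y1) * (x2 - x1)) / denom
    if not (0 <= t <= 1 and 0 <= u <= 1):
        return None  # the lines cross outside one or both segments

    # Linearly interpolate the depth each track reports at the crossing
    depth_a = d1 + t * (d2 - d1)
    depth_b = d3 + u * (d4 - d3)
    return depth_a - depth_b
```

Accumulating these discrepancies over all crossings in a survey area yields the statistics needed to adjust and re-level data sets where they abut or overlap.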

Depending on the density and other characteristics of the data, one or more gridding algorithms may be needed to construct surfaces that contain a statistically significant percentage of all original data points within specified limits. For instance, where original data points are numerous and accurate, grids will be derived directly from these observations; where data points are sparse or of poor quality, it may prove necessary to derive grids from hand-drawn contours that incorporate varying levels of human judgement and interpretation. Members of the Editorial Board will collectively review gridding algorithms with a view to selecting those that are best suited to the demands of the application.
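
As a concrete illustration of gridding scattered soundings onto a uniform grid, the following sketch uses inverse-distance weighting. This is only one of many candidate algorithms, and not necessarily one the Editorial Board would select; the function name and parameters are illustrative assumptions.

```python
def idw_grid(points, xs, ys, power=2.0, eps=1e-12):
    """Interpolate scattered soundings onto a uniform grid by
    inverse-distance weighting.

    points: iterable of (x, y, depth) soundings
    xs, ys: coordinates of the grid nodes along each axis
    Returns grid[j][i] = interpolated depth at (xs[i], ys[j]).
    """
    grid = []
    for y in ys:
        row = []
        for x in xs:
            wsum = vsum = 0.0
            for (px, py, pd) in points:
                d2 = (px - x) ** 2 + (py - y) ** 2
                if d2 < eps:
                    # Node coincides with a sounding: use it directly
                    wsum, vsum = 1.0, pd
                    break
                w = 1.0 / d2 ** (power / 2.0)
                wsum += w
                vsum += w * pd
            row.append(vsum / wsum)
        grid.append(row)
    return grid
```

Where soundings are dense, such a surface honours the observations closely; where they are sparse, the interpolated values smooth heavily, which is one reason hand-drawn contours may be the better source of grid values in poorly surveyed areas.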

RELATED TOPICS

Standard coastline. While it is probably the most readily available global coastline in the public domain, the digital World Vector Shoreline (WVS) shows significant disagreement in many Arctic locations with shorelines derived from other sources, such as national maps and publications. Accordingly, members of the Project Group are encouraged to consider ways of rendering the Arctic shoreline at the highest possible accuracy. This will likely entail efforts to obtain the latest and most reliable information from national mapping authorities, and to composite that information in a manner that yields a significant improvement over existing public domain portrayals.

Digital Terrain Model (DTM). Land areas occupy a sizeable portion of the proposed map. For uniformity of presentation and to facilitate correlation between marine and continental features, it will be desirable to portray morphology both above and below sea level at comparable levels of resolution. Land elevations in several regions of the proposed map are already defined by DTMs that exist in the public domain; elevations for the remaining regions may be obtainable from national sources.

PROVISIONAL TIMETABLE

Anticipated progress will depend upon several factors; however, a desirable objective for 1999 is to have project components completed, at least to a preliminary stage, in the High Seas and in the zones of national interest. Following an internal review by members of the Project Group, these components will be consolidated with a view to creating final products for public distribution by the year 2000.

Following is a provisional outline of milestones and operations:

October 1998. First meeting of Project Group/Editorial Board: define specifications; establish work plan; identify individuals who will assume responsibility for specific project components.

November 1998 to September 1999. Participants assemble information and develop components for which they have accepted responsibility.

October 1999. Second meeting of Project Group/Editorial Board: review completed components; identify problem areas and devise solutions; develop plan for merging components and for developing final products.

November 1999 to September 2000. Refine and combine components, construct final products, document data sets and procedures.

October 2000. Third meeting of Project Group/Editorial Board: review and approve final products; initiate their distribution; develop a long-term strategy for ongoing maintenance of the data base, and for regular updates to the grid and map.

MEMBERS OF THE EDITORIAL BOARD
     (group alias: arctic-bathy@ldeo.columbia.edu)

Harald BREKKE, Norwegian Petroleum Directorate
     <Harald.Brekke@npd.no>

Norm CHERKIS, US Naval Research Laboratory
     <cherkis@qur.nrl.navy.mil>

Bernie COAKLEY, Lamont-Doherty Earth Observatory
     <bjc@ldeo.columbia.edu>

Valeriy FOMCHENKO, Head Dep't of Navigation and Oceanography (GUNiO)
     <gunio@g-ocean.spb.su>

Garrik GRIKUROV, VNIIOkeangeologia
     <garrik@g-ocean.spb.su>

Hilmar HELGASON, Icelandic Hydrographic Service
     <hilmar@lhg.is>

Martin JAKOBSSON, Stockholm University
     <martin.jakobsson@geo.su.se>

Ron MACNAB, Geological Survey of Canada
     <macnab@agc.bio.ns.ca>

Sergei MASCHENKOV, VNIIOkeangeologia
     <mascha@vniio.nw.ru>

Hans-Werner SCHENKE, Alfred Wegener Institute
     <schenke@awi-bremerhaven.de>

John WOODWARD, Royal Danish Adm'n of Navigation and Hydrography
     <jjw@fomfrv.dk>
