Background: Read more about the NetCDF Attribute Convention for Dataset Discovery and how it is used in ncISO here.

Updates: August 16th, 2012

NEW DEVELOPMENTS: Desktop JAR code updated to reflect changes initially implemented for THREDDS.

ncISO: A command-line utility for automating metadata analysis and ISO metadata generation for THREDDS Catalogs


The ncISO tool traverses THREDDS catalogs, reads dataset documentation, and translates that documentation into different views using Extensible Stylesheet Language Transformations (XSLT). Two stylesheets are currently supported:
  1. A graphical comparison (rubric) of existing documentation with the Unidata Data Discovery Conventions with guidance on using those conventions to facilitate data discovery; and
  2. Translation of discovery elements from NcML into ISO 19115.
To run ncISO:
  1. Download ncISO Version 2.3.
  2. Extract the jar into a local directory.
  3. Open a shell window or command prompt from the directory that contains the extracted files.
  4. Enter the command java -jar ncISO.jar to see available ncISO arguments and descriptions of how to use them.
  5. Enter the ncISO command with appropriate arguments.
Prior versions of ncISO are available.


-Xms1024m and -Xmx1024m: standard Java options for specifying the amount of memory to allocate to the ncISO utility. In this case 1024 megabytes are specified for both initial and maximum memory.

-ts THREDDS_CATALOG_URL: specifies the URL of the THREDDS catalog to process.

-num N: specifies the number of datasets to process per branch. Specifying a small number of datasets per branch, as in this case, results in a fast sample scan that is representative for THREDDS catalogs with generally homogeneous content in each branch. Specify a large number to translate all content.

-depth 20: limits the crawler's descent into the catalog.

-iso: signals the crawler to generate ISO metadata.

-waf ROOT_WAF_FOLDER: signals the crawler to write files to a flat Web Accessible Folder (WAF) structure.

-custom: signals the crawler to translate the NcML using a custom stylesheet.

-xslt XSLT_FILENAME: names the custom stylesheet, which must be located in an xslt subfolder.
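Taken together, the flags above compose an invocation like the following sketch, which simply assembles and prints the command. The catalog URL and WAF folder here are placeholders, not real endpoints; substitute your own values:

```shell
#!/bin/sh
# Assemble an ncISO invocation from the flags described above.
# CATALOG and WAF are placeholder values for illustration only.
CATALOG="http://localhost:8080/thredds/catalog.xml"
WAF="./waf"

CMD="java -Xms1024m -Xmx1024m -jar ncISO.jar"
CMD="$CMD -ts $CATALOG"   # THREDDS catalog to crawl
CMD="$CMD -num 1"         # one dataset per branch: fast sample scan
CMD="$CMD -depth 20"      # limit the crawler's descent
CMD="$CMD -iso true"      # generate ISO 19115 records
CMD="$CMD -waf $WAF"      # write output into a flat WAF structure
echo "$CMD"
```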


Crawl NOAA's NGDC THREDDS catalog and generate metadata:
java -Xms1024m -Xmx1024m -jar ncISO-2.2.2.jar -ts THREDDS_CATALOG_URL -num 1 -depth 20 -iso true

After the utility finishes, a thredds directory is created containing NcML, a metadata report, and ISO XML for each NetCDF dataset that was located. A thredds.json file is also generated, enabling a tree-based display in your browser. To view these results, copy nciso.html, thredds.json, and the directory structure to a web-accessible location, then open nciso.html in your browser, for example: http://localhost/yourpath/nciso.html.
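The copy step above can be sketched as follows. WEBROOT stands in for whatever web-accessible directory you use; the first two lines only create stand-ins for the real crawl outputs so the sketch is self-contained:

```shell
#!/bin/sh
# Stand-ins for the files the crawl produces (remove these two lines
# when working with real crawl output).
touch nciso.html thredds.json
mkdir -p thredds

# Copy the viewer page, the JSON index, and the results directory
# into a web-accessible folder.
WEBROOT="./webroot"   # stand-in for e.g. /var/www/html/nciso
mkdir -p "$WEBROOT"
cp nciso.html thredds.json "$WEBROOT"/
cp -r thredds "$WEBROOT"/
# Then open http://<host>/<path>/nciso.html in a browser.
```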

To see results from some sample THREDDS-based datasets that we crawled in Spring 2010, click the links below:

UAF metadata assessment
OceanSITES metadata assessment
ESRL metadata assessment
NGDC metadata assessment


Please send bug reports or feature requests to david.neufeld<at>

threddsISO: A THREDDS Data Server extension which generates NCML, a metadata rubric, and ISO 19115


threddsISO adds three new service types to your existing THREDDS server:
  1. NCML: An NCML service that adds geospatial extent information if it has not already been documented.
  2. UDDC: A metadata analysis service which scores the quality of your NCML compared to the Unidata Data Discovery Conventions.
  3. ISO: An ISO 19115 metadata service.
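On a running TDS with threddsISO installed, the three services appear as per-dataset URL endpoints. The following sketch only prints the URL shapes; the host, port, and dataset path are assumptions for illustration:

```shell
#!/bin/sh
# URL patterns for the three threddsISO services.
# Host, port, and dataset path are illustrative assumptions.
TDS="http://localhost:8080/thredds"
DATASET="testAll/testData.nc"

echo "$TDS/ncml/$DATASET"   # NcML view of the dataset
echo "$TDS/uddc/$DATASET"   # rubric score against the Unidata Data Discovery Conventions
echo "$TDS/iso/$DATASET"    # ISO 19115 record
```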
To install threddsISO:
  1. Follow the TDS instructions available here.
For incremental or bug-fix releases:
  1. A bug-fix release is available here.
  2. Copy the latest threddsIso-*.jar into the thredds/WEB-INF/lib directory.
  3. Copy the latest UnidataDD2MI.xsl from here into the thredds/WEB-INF/classes/resources/xsl/nciso directory.
  4. Restart THREDDS or the web application container.
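In TDS releases where ncISO support is bundled, the three services are toggled in threddsConfig.xml rather than by dropping in the jar. A sketch of the relevant fragment, assuming a TDS version that recognizes the NCISO element:

```xml
<!-- In threddsConfig.xml, on TDS releases that recognize the NCISO
     element: enables the NCML, UDDC, and ISO services. -->
<NCISO>
  <ncmlAllow>true</ncmlAllow>
  <uddcAllow>true</uddcAllow>
  <isoAllow>true</isoAllow>
</NCISO>
```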
Prior versions of threddsISO are available.


To see an example implementation of the threddsISO extension, click a link below:

SST Aerosol data
Unidata samples data


Source code for the library is available here.