ESDIS Standards Office - Developing and Emerging Standards and Practices

 


Abstract/Agenda: 

The NASA ESDIS Standards Office (ESO) coordinates and advises on standards activities within ESDIS and across the ESDS-funded data systems.  In addition to reviewing community EO data system standards and practices and recommending them for endorsement within NASA-funded missions and data systems, the ESO also assists in reviewing and evaluating emerging and developing standards for potential future use.
The ESDIS Standards Office’s current Standards Process[1] requires complete documentation of a candidate standard, as well as examples of implementations, before the standard is reviewed for approval as a NASA ESDS standard.
 
But what happens when NASA wants to evaluate and support a developing or emerging standard?  Quite often the documentation is not yet complete and experience with implementations is very limited.  Instead of waiting for the developing standard to “firm up” or for multiple implementations to accumulate over time (which can take years), the ESDIS project sometimes has an interest in accelerating this process.  Timely evaluations of a developing standard would benefit its development, both by improving its technical quality and by helping it “firm up” sooner.
 
In this session, we will look at several examples of developing standards, hear a technical presentation on each, and then discuss how they can be evaluated during the development process.  Additional examples of developing and emerging standards offered by session attendees will also be discussed.  The goal is to gather information that will help us define a process by which ESDIS can identify and foster promising developing standards.

Introduction to ESO - Yonsook Enloe

Case 1:  Standard specification is out of date (GeoTIFF) – Ted Habermann

  • The new specification will be updated to match existing implementations (similar to the situation with the OPeNDAP spec we approved a few years ago)
  • Once the specification has been updated, a "classic" Standards Process review (complete documentation, multiple implementations) can be initiated
  • But in this case, how can we ensure the spec matches the existing implementations?
    • Target reviewers who are familiar with current implementation software and can provide detailed technical review of the new spec?
    • Should ESDIS develop a test suite? Wait for/rely on someone else (e.g., OGC) to do so? (A sketch of one possible conformance check follows this list.)
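
As a concrete illustration of the test-suite question, below is a minimal sketch of a conformance check written with GDAL's Python bindings. The specific checks (presence of a coordinate reference system and a georeferencing transform) are illustrative assumptions; a real suite would enumerate the tags and GeoKey constraints defined in the updated specification.

    # Minimal sketch of a GeoTIFF conformance check using GDAL's Python bindings.
    # The individual checks are illustrative; a real test suite would cover the
    # tags and GeoKey constraints defined in the updated specification.
    from osgeo import gdal, osr

    def check_geotiff(path):
        problems = []
        ds = gdal.Open(path)
        if ds is None:
            return ["file could not be opened as a GDAL dataset"]
        wkt = ds.GetProjection()
        if not wkt:
            problems.append("no coordinate reference system (GeoKeys) found")
        elif osr.SpatialReference().ImportFromWkt(wkt) != 0:
            problems.append("CRS present but could not be parsed")
        if ds.GetGeoTransform() == (0.0, 1.0, 0.0, 0.0, 0.0, 1.0):
            problems.append("no georeferencing transform (default identity returned)")
        return problems

    if __name__ == "__main__":
        import sys
        for issue in check_geotiff(sys.argv[1]):
            print("FAIL:", issue)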

 
Case 2:  Complex and evolving specification (NASA flavor of ISO geospatial metadata: MENDS III) – Andy Mitchell

  • Not a true “specification,” rather an ECHO to ISO mapping of collection & granule metadata
  • Documentation review is difficult and there are few experts.  In general, experts who know ISO do not know the details of the product metadata and vice versa
  • Prototype the ECHO to ISO mapping in ECHO.  Ask data center staff who are very knowledgeable about their own satellite data products to view the metadata in both ECHO and ISO formats and provide comments (see the rendering sketch after this list)
  • Data center testers:  best detailed review of the ECHO to ISO mapping to date.
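
One way to support that kind of data center review is to render the same record both ways. The sketch below shows how a few ECHO-style collection fields might be turned into ISO elements for side-by-side comparison; the field names are hypothetical and the ISO fragment is deliberately flattened, so this is not the actual MENDS mapping.

    # Illustrative sketch: render a few ECHO-style collection fields as ISO 19115
    # elements so reviewers can compare the two views side by side. Field names
    # and element placement are hypothetical, not the actual MENDS mapping.
    import xml.etree.ElementTree as ET

    GMD = "http://www.isotc211.org/2005/gmd"
    GCO = "http://www.isotc211.org/2005/gco"
    ET.register_namespace("gmd", GMD)
    ET.register_namespace("gco", GCO)

    # Hypothetical ECHO collection record (illustrative field names).
    echo_collection = {
        "ShortName": "MOD09GA",
        "LongName": "MODIS/Terra Surface Reflectance Daily L2G Global",
        "InsertTime": "2013-01-01T00:00:00Z",
    }

    def echo_to_iso(rec):
        """Build a flattened, fragmentary ISO-style tree from an ECHO-style record."""
        md = ET.Element(f"{{{GMD}}}MD_Metadata")
        ident = ET.SubElement(md, f"{{{GMD}}}fileIdentifier")
        ET.SubElement(ident, f"{{{GCO}}}CharacterString").text = rec["ShortName"]
        title = ET.SubElement(md, f"{{{GMD}}}title")
        ET.SubElement(title, f"{{{GCO}}}CharacterString").text = rec["LongName"]
        stamp = ET.SubElement(md, f"{{{GMD}}}dateStamp")
        ET.SubElement(stamp, f"{{{GCO}}}Date").text = rec["InsertTime"][:10]
        return md

    print(ET.tostring(echo_to_iso(echo_collection), encoding="unicode"))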

 
Case 3:  Multiple competing specifications (OpenSearch) – Doug Newman

  • Basic OpenSearch specification from opensearch.org
  • ESIP OpenSearch Best Practices document with geospatial, temporal, and parameter extensions and hierarchical dataset/granule search (a sketch of a query using these extensions follows this list)
  • OGC specs for geospatial and temporal extensions & spec for parameter extensions.
  • CWIC OpenSearch Best Practices (uses concepts from all docs) & implemented by GCMD, CWIC, & CWIC-Smart (client).
  • TBD CEOS OpenSearch Best Practices (based on the OGC specs, the basic OpenSearch spec, and the two-level search pioneered by the ESIP OpenSearch Best Practices) – expected to be supported by the data systems of many major space agencies as well as by GCMD, CWIC, and the CWIC-Smart client.
  • How to speed up the “firming” of the CEOS OpenSearch Best Practices spec?
  • What do we do with all the OpenSearch variations, many of which have already been implemented?
  • Can we foster some interoperability or at least disambiguation among OpenSearch variations to cut down on confusion by casual users?
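
To make the discussion concrete, here is a small sketch of how a client might fill an OpenSearch URL template carrying geospatial and temporal extension parameters, of the kind used in a two-level dataset/granule search. The endpoint and dataset identifier are hypothetical; the parameter names follow the OpenSearch Geo and Time extension conventions.

    # Sketch of filling an OpenSearch URL template with geospatial and temporal
    # extension parameters. The endpoint and dataset identifier are hypothetical.
    from urllib.parse import urlencode

    # Template as it might appear in an OpenSearch Description Document (OSDD).
    TEMPLATE = ("https://example.gov/opensearch/granules.atom"
                "?datasetId={dataset}&bbox={geo:box?}&start={time:start?}"
                "&end={time:end?}&count={count?}")

    def fill_template(template, values):
        """Substitute {name} placeholders; parameters without values are dropped."""
        url, query = template.split("?", 1)
        params = {}
        for pair in query.split("&"):
            key, placeholder = pair.split("=", 1)
            name = placeholder.strip("{}").rstrip("?")
            if name in values:
                params[key] = values[name]
        return url + "?" + urlencode(params)

    # Two-level search: a dataset is found first, then its granules are queried.
    print(fill_template(TEMPLATE, {
        "dataset": "C1234-EXAMPLE",
        "geo:box": "-180,-90,180,90",
        "time:start": "2013-01-01T00:00:00Z",
        "time:end": "2013-12-31T23:59:59Z",
        "count": "10",
    }))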

 
Case 4:  Webification Science (w10n-sci) – Zhangfan Xing

  • Developing an enabling web service technology
  • A defined API and existing implementations (a hypothetical client-access sketch follows this list)
  • What are good ways to evaluate enabling web service technology?
  • What does it mean to make an enabling web service technology a standard or best practice?
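
As a point of reference for the evaluation question, below is a hypothetical sketch of what client-side access to a w10n-style service could look like, assuming that navigation inside a data file is expressed in the URL and that the response format is selected with a query parameter. The host, file path, variable name, and parameter names are all illustrative assumptions, not the w10n-sci API itself.

    # Hypothetical sketch of client access to a w10n-style web service, where
    # navigation inside a data file is expressed in the URL and the response
    # format is selected with a query parameter. Host, paths, and parameter
    # names are illustrative assumptions, not the w10n-sci API itself.
    import json
    import urllib.request

    BASE = "https://example.gov/w10n/data/MOD09GA.A2013001.hdf"

    def get_json(url):
        with urllib.request.urlopen(url) as resp:
            return json.loads(resp.read().decode("utf-8"))

    # 1. List the nodes (groups/variables) inside the file.
    listing = get_json(BASE + "/?output=json")

    # 2. Read a slice of one variable as JSON.
    subset = get_json(BASE + "/sur_refl_b01[0:10,0:10]?output=json")

    print(listing)
    print(subset)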

 
Open Discussion – All

  • What type of evaluation makes sense for each case?
  • What type of evaluation can help speed up the development process or improve the technical content of the standard?
  • These cases – are there commonalities we can turn into any kind of process?
  • Or are these cases too different for any common process, with each one important in its own right and fitting its own pattern?
  • How can ESO best foster these? ESO can’t write code or host implementations. Is information exchange/awareness raising enough?
  • Are there other things out there that people would like to see getting some attention?

[1] http://earthdata.nasa.gov/data/standards-and-references/standards-proces...
