Validation-related use cases

This work is still unfinished!!

Primary users

INSPIRE data provider / user (national gov. level administration)

UC-1: Provide conformant metadata for a new View Service

Background: The User is responsible for ensuring that the INSPIRE services provided by the Organization are compliant with the INSPIRE IR and TG requirements. A new View Service for Administrative Units is about to be published by the organization's IT department next week.

User goal to accomplish: Create and publish an INSPIRE compliant metadata record for the new service and the provided AU data sets for the Netherlands.

Scenario 1: Using INSPIRE Geoportal

Initial state: The metadata record has been created and is stored as an XML file on the local computer. There is an error in the coupled resources of the record: the link to one of the data sets provided by the service points to a non-existent CSW record. The User has registered with the INSPIRE Geoportal and has provided an email address.

Workflow steps:

  1. The User opens the validator application page of the Geoportal.
  2. The User selects a View Service as the validation target. The Application shows the available test suites for validating View Services.
  3. The User selects the ETS for service metadata (the most recent MIG-approved version).
  4. The Application prompts the User to select either uploading the metadata file or providing a URL for accessing it.
  5. The User selects the file upload option and uploads the metadata file from the local disk.
  6. The Application gives an estimate of how long the validation is currently expected to take, and asks for an email address to which the validation results will be sent when they are ready. The email address of the registered User is pre-filled as the default.
  7. The User confirms the email address and starts the test run.
  8. The Application sends an email about the results of the test run to the User as soon as the test run has completed. The email contains an overview of the test results in textual form, and a direct link to the Geoportal page with the detailed test run results.

Final state: The User has received a "fail" result for the Service metadata test run, together with the information needed to locate the error: the row number in the original document, the name of the failing element, the reason for the failure (no data set MD record found using the given URL), and the exact reference to the original requirement and the specification document containing it.
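The final state above boils down to a small, fixed set of fields per reported failure. A minimal sketch of how one such result entry could be represented is shown below; the class name, field names and the placeholder values are purely illustrative assumptions and are not prescribed by the INSPIRE Geoportal or any ETS.

```python
from dataclasses import dataclass

# Hypothetical shape of one failed test entry in the validation report; the class
# and field names are illustrative only, not defined by INSPIRE or the Geoportal.
@dataclass
class MetadataTestFailure:
    row_number: int        # row in the original XML document
    element_name: str      # name of the failing element, e.g. "srv:coupledResource"
    reason: str            # human-readable reason for the failure
    requirement_ref: str   # exact reference to the original requirement
    specification: str     # specification document containing the requirement

# Placeholder values illustrating the failure described in this scenario.
failure = MetadataTestFailure(
    row_number=142,
    element_name="srv:coupledResource",
    reason="no data set MD record found using the given URL",
    requirement_ref="<requirement identifier>",
    specification="<technical guidelines document>",
)
```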

Scenario 2: Using a desktop metadata editor

Initial state: The User is editing the service metadata record with a desktop application able to do on-the-fly INSPIRE compliance validation on the record structure and content. The desktop application contains a stand-alone software Library with the INSPIRE validation framework code and the latest and previous MIG-approved metadata ETS modules. The Library is capable of executing the test runs requested by the editor on the local computer and returning the results to the editor code component, without accessing a remote server for the test content or for remote execution. The Library is also capable of fetching the external resources necessary for running the tests (such as linked metadata records or the service end points).

The User has almost completed editing the metadata record. She is just about to add the coupled resource links for the data sets provided by the service.

Workflow steps:

  1. The User adds an element for the coupled resource, assisted by the editor application. The Application indicates that there is an error in it due to a missing URL for the data set metadata record.
  2. The User types in the URL of the metadata record, but makes a typo in the record ID.
  3. The Application asks the Library to run the relevant tests for the record's compliance in the background, and receives the result with the rows, the names of the failed elements, and the error codes for each error.
  4. The Application translates the error codes into error messages in the User's language, and updates its user interface to indicate the failed rows and elements, as well as the error messages for each error.

Final state: The User is able to clearly see that there is something wrong with the URL of the added data set metadata record, but that otherwise the metadata record is compliant with the INSPIRE IR and TG requirements for View Service metadata records.
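As a rough illustration of the editor-Library interaction in this scenario, the sketch below assumes a hypothetical `library.run_tests(...)` call that returns structured errors (row, element, error code) and a hypothetical `editor_ui.mark_error(...)` hook; neither is an existing API of the INSPIRE validation framework, and the error codes shown are invented for the example.

```python
from concurrent.futures import ThreadPoolExecutor

# Error code -> message in the User's language; the codes are assumed for illustration.
ERROR_MESSAGES = {
    "md.coupledResource.recordNotFound":
        "The linked data set metadata record could not be found at the given URL.",
}

_executor = ThreadPoolExecutor(max_workers=1)

def validate_in_background(library, editor_ui, metadata_xml: bytes) -> None:
    """Run the relevant ETS tests in the background and annotate the editor UI."""
    def run():
        # The embedded Library executes the tests locally, fetching any linked
        # resources (e.g. the referenced data set metadata record) itself.
        result = library.run_tests(test_suite="service-metadata-latest",
                                   document=metadata_xml)
        for error in result.errors:
            message = ERROR_MESSAGES.get(error.code, error.code)
            editor_ui.mark_error(row=error.row,
                                 element=error.element,
                                 message=message)

    _executor.submit(run)
```

Running the tests on a worker thread keeps the editor responsive while the Library resolves linked resources and executes the test suite.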

Scenario 3: Using a remote INSPIRE validation service integrated into an internal publishing workflow

Initial state: There is an automatic workflow process for the publication and verification of metadata records. Part of the workflow is a step that submits the test object (the metadata record) to a remotely running INSPIRE validator service by calling its API operations.

Workflow steps:

  1. The User creates a draft of a new service metadata record and submits it to the workflow.
  2. The workflow process sends the metadata record to the remote INSPIRE validator service using an API.
  3. The validator service returns a notification endpoint/token for checking the status of the test run.
  4. When the validator service has execution time available, it runs the most recent ETS tests for service metadata on the submitted document.
  5. The workflow process waits for the final status of the test run from the remote validator service.
  6. When the workflow process receives the results from the API operation (in machine-readable format), it turns them into a human-readable, easy-to-understand report and sends that to the User.

Final state: The User has received an error report from the automatic workflow process describing the "fail" result of the Service metadata test run, with the information needed to locate the error: the row number in the original document, the name of the failing element, the reason for the failure (no data set MD record found using the given URL), and the exact reference to the original requirement and the specification document containing it.
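A sketch of the workflow step described in this scenario is given below, assuming a generic REST-style validator API. The base URL, endpoint paths, JSON fields and status values are hypothetical; a real integration would follow the documentation of the validator service actually used.

```python
import time
import requests

# Placeholder base URL; the endpoint paths, JSON fields and polling protocol below
# are assumptions for illustration, not the API of any specific validator product.
VALIDATOR_URL = "https://validator.example.org/api"

def validate_remotely(metadata_xml: bytes,
                      test_suite: str = "service-metadata-latest") -> dict:
    """Submit a metadata record for validation and wait for the final result."""
    # Submit the test object and request a test run.
    response = requests.post(f"{VALIDATOR_URL}/test-runs",
                             files={"document": ("metadata.xml", metadata_xml)},
                             data={"testSuite": test_suite})
    response.raise_for_status()
    status_url = response.json()["statusUrl"]   # endpoint/token for polling the run

    # Poll until the validator has had execution time to run the ETS tests.
    while True:
        status = requests.get(status_url).json()
        if status["state"] in ("PASSED", "FAILED"):
            return status                       # machine-readable result
        time.sleep(30)

def to_report(result: dict) -> str:
    """Turn the machine-readable result into a human-readable report for the User."""
    lines = [f"Service metadata test run: {result['state']}"]
    for error in result.get("errors", []):
        lines.append(f"  row {error['row']}, element {error['element']}: "
                     f"{error['reason']} (see {error['requirement']})")
    return "\n".join(lines)
```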

UC-2: Check that a View Service is still INSPIRE conformant after server software update

UC-3: Collect and merge up-to-date INSPIRE Protected Sites data sets for Nordic countries

Relation to validation: the ability to ensure that the services and the data sets are INSPIRE conformant before including them as data sources. TODO

UC-4: Publish a predefined, INSPIRE compliant Land Cover dataset for Finland

TODO

UC-5: Verify that the potential new WFS server software can be configured correctly to provide a compliant INSPIRE Download Service

TODO

National INSPIRE contact point

UC-6: Monthly check for Finnish INSPIRE Network Services

TODO

UC-7: Metadata, services and data sets compliance check for the yearly INSPIRE Reporting

TODO

Secondary users

Test developer

UC-8: Create a missing test for View Services WMTS ETS

TODO

UC-9: Fix a test in Download Services ETS

TODO

UC-10: Create a new Executable Test Suite based on existing ATS tests

TODO

QoS tool software provider

UC-11: Add support for a new test suite for INSPIRE View Services

TODO

UC-12: Update the validation error messages to match the ETS new version

TODO

MIG-T conformity testing subgroup member

UC-13: Approve a change in a test as part of the next ETS release

TODO

UC-14: Get a status report of the current coverage of the MIG-approved ETSes

TODO

UC-15: Get a report for new and changed tests to be MIG approved for the next release of WMTS ETS

TODO

Network Service software provider

UC-16: Verify that the new product version is able to function as an INSPIRE compliant WMS

TODO