INSPIRE testing framework - concepts & components¶
The diagram below describes a conceptual model of the central data entities of the INSPIRE Testing Framework as a UML class diagram. Note that this model is not intended as an implementation model; the classes therefore do not contain all the attributes needed to provide the required functionality of the described system.
High-level components of the INSPIRE Testing Framework¶
Validation UI¶
This component implements the interactive user interface for the INSPIRE Test Framework Reference Implementation. The most important required functionalities are the following:
- Find suitable test suites to run based on the validated resource type, the specification to conform to (TG, IR), and status (draft, approved, etc.) and the version (latest, specific) of the test suites.
- Submit a resource for validation against a given specification (local file upload or URL reference to an endpoint)
- Execute test suites or individual tests for the given resources.
- Monitor the status of the currently running or queued tests (login required).
- Receive notifications when executed tests complete and retrieve the validation results.
- Find results of previously executed tests by resource identifier and re-run the tests using currently available or user-provided versions of the resources to validate (login required).
- Register new Specifications, Conformance Classes and Abstract Test Suites (see concept diagram above) and their versions.
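The "find suitable test suites" requirement above can be sketched as a simple catalog query. This is a minimal illustration; the `TestSuite` record and filter criteria are assumptions, not the actual data model of the framework.

```java
// Hedged sketch: filtering a catalog of test suites by resource type,
// target specification and status, as the Validation UI would need to do.
import java.util.List;

public class TestSuiteLookup {
    // Illustrative record; the real catalog model has more attributes.
    record TestSuite(String resourceType, String specification,
                     String status, String version) {}

    static List<TestSuite> find(List<TestSuite> catalog, String resourceType,
                                String specification, String status) {
        return catalog.stream()
                .filter(ts -> ts.resourceType().equals(resourceType))
                .filter(ts -> ts.specification().equals(specification))
                .filter(ts -> ts.status().equals(status))
                .toList();
    }

    public static void main(String[] args) {
        List<TestSuite> catalog = List.of(
            new TestSuite("dataset", "TG MD 2.0", "approved", "1.1"),
            new TestSuite("dataset", "TG MD 2.0", "draft", "2.0-beta"),
            new TestSuite("service", "TG SDS 1.0", "approved", "1.0"));
        for (TestSuite ts : find(catalog, "dataset", "TG MD 2.0", "approved")) {
            System.out.println("match: " + ts.version());
        }
    }
}
```

A real implementation would also support the "latest vs. specific version" selection mentioned above, e.g. by sorting matches by version.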
The Validation UI delegates the operations to the Validation Controller and reflects the results of these operations back to the user. As the Validation API must provide the same functionality as the Validation UI, it may be feasible to use the Validation API as an intermediate layer for the operations between the Validation UI and the Validation Controller.
Validation API¶
This component provides the machine-accessible interface for running the tests provided by the framework and retrieving the validation reports. The provided functionality is the same as in the Validation UI, but the interface is intended for use by remote validation applications.
The API must be provided as HTTP service endpoints using JSON data structures.
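To make the HTTP/JSON requirement concrete, a possible exchange for submitting a resource for validation could look as follows. The endpoint path, field names and identifiers are purely hypothetical, not part of the specification.

```java
// Hedged sketch of a possible Validation API exchange. Everything shown
// (path, field names, ids) is an assumption for illustration only.
public class ValidationApiExample {
    public static void main(String[] args) {
        String request = """
            POST /api/v1/validations
            {
              "resource": {"href": "https://example.org/md/dataset-123.xml"},
              "testSuite": {"id": "tg-md-2.0-ats", "version": "latest"}
            }""";
        String response = """
            {"validationId": "a1b2c3", "status": "queued"}""";
        System.out.println(request);
        System.out.println(response);
    }
}
```

The asynchronous `status` field reflects the queuing behaviour of the Test Harness described below: a submission is accepted immediately and the results are fetched (or pushed as a notification) later.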
Validation Controller¶
Central component for maintaining and running the tests. Responsible for keeping track of the running tests and for providing the command and query interfaces for the Validation API and UI components. Also responsible for coordinating authentication and authorization for the users and remote applications.
Uses the Test Catalog for maintaining and querying information about the known Specifications, Conformance Classes, Abstract Test Suites and available Executable Test Suites.
Uses the Test Harness for providing an isolated, standardized sandbox environment for running and controlling the Executable Tests and getting their results.
Uses the Test Result Store for storing and querying the results of the executed tests for future reference.
Provides a program interface for the ETS Modules to register themselves in the Test Catalog. The Validation Controller must be able to load more than one version of an ETS Module at the same time in such a way that the different versions of the code do not interfere with each other. In practice, for object-oriented programming environments like Java, this means class loading isolation.
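In Java, class loading isolation as described above is typically achieved by giving each module version its own class loader. A minimal sketch, assuming each ETS Module version ships as a separate jar (the file names are hypothetical):

```java
// Hedged sketch of class loading isolation: one URLClassLoader per ETS
// Module version, with the platform loader as parent so that classes from
// one module version cannot see (or clash with) another version's classes.
import java.net.URL;
import java.net.URLClassLoader;
import java.nio.file.Path;

public class ModuleIsolation {
    static ClassLoader loaderFor(Path moduleJar) throws Exception {
        return new URLClassLoader(
                new URL[] { moduleJar.toUri().toURL() },
                // Parent is the platform loader, not the application
                // classpath, so module versions stay isolated from each other.
                ClassLoader.getPlatformClassLoader());
    }

    public static void main(String[] args) throws Exception {
        ClassLoader v1 = loaderFor(Path.of("ets-md-module-1.0.jar"));
        ClassLoader v2 = loaderFor(Path.of("ets-md-module-2.0.jar"));
        // Distinct loaders: the same class name resolves independently
        // in each, so two versions of the same ETS class can coexist.
        System.out.println("isolated loaders: " + (v1 != v2));
    }
}
```

A class loaded through `v1` and the "same" class loaded through `v2` would be distinct `Class` objects to the JVM, which is exactly the non-interference property required here.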
ETS Module¶
Contains the program code for running an Executable Test Suite within the sandbox environment provided by the Test Harness. The code must not access system resources or external resources by any means other than the public program interface of the Test Harness. This restriction provides the code isolation necessary for developing, testing and running the ETS Modules in different execution environments during their lifecycle. It also ensures the portability of the ETS code, as the code is guaranteed to run in any environment implementing the Test Harness public program interface. ETS Module code must not require any direct interaction with the software user, and must not contain any process or thread management. It must provide a standardized way for the Test Harness to execute the test code for each Executable Test using the resources and execution environment provided by the Test Harness.
The ETS Modules register themselves to the Validation Controller which in turn registers the provided Executable Test Suites in the Test Catalog.
One ETS Module contains the ETS code for a single ETS version corresponding to a single ATS version.
ETS Module code may depend programmatically only on the Test Harness public program interface and the registration interface provided by the Validation Controller. ETS Module code must not depend on code provided by other ETS Modules.
Test Catalog¶
Keeps track of the known Specifications, Conformance Classes, Abstract Test Suites (including individual Abstract Tests) and available Executable Test Suites (including individual Executable Tests) and their relationships. Provides a program interface for querying the catalog. Stores the catalog data permanently and keeps the data model consistent.
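The registration flow described above (ETS Module → Validation Controller → Test Catalog) can be sketched in a few lines. All type and method names are illustrative assumptions, not the real interfaces:

```java
// Hedged sketch of the registration flow: an ETS Module announces itself
// to the Validation Controller, which records the Executable Test Suite in
// the Test Catalog under the Abstract Test Suite it implements.
import java.util.HashMap;
import java.util.Map;

public class RegistrationSketch {
    // One ETS version implementing one ATS version, as required above.
    record Ets(String id, String version, String atsId) {}

    static class TestCatalog {
        private final Map<String, Ets> byId = new HashMap<>();
        void register(Ets ets) { byId.put(ets.id() + ":" + ets.version(), ets); }
        int size() { return byId.size(); }
    }

    static class ValidationController {
        private final TestCatalog catalog;
        ValidationController(TestCatalog catalog) { this.catalog = catalog; }
        // Called by an ETS Module through the registration interface.
        void registerModule(Ets ets) { catalog.register(ets); }
    }

    public static void main(String[] args) {
        TestCatalog catalog = new TestCatalog();
        ValidationController controller = new ValidationController(catalog);
        // Two versions of the same ETS may be registered side by side.
        controller.registerModule(new Ets("tg-md-ets", "1.0", "tg-md-ats"));
        controller.registerModule(new Ets("tg-md-ets", "2.0", "tg-md-ats"));
        System.out.println("registered: " + catalog.size());
    }
}
```

Keying the catalog on id plus version is one simple way to let multiple versions of the same ETS coexist, matching the versioning requirement on the Validation Controller.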
Test Harness¶
Provides the ETS Module code with the operations needed to query external resources by URI and to run commonly used validation operations (such as XML Schema validation). Consists of the public Test Harness program interface and its implementation code. Takes commands from the Validation Controller to start executable tests for the given resources, provides it with information on the currently executing tests, and sends notifications on finished tests containing the test results produced by the ETS Module code.
The Test Harness should ensure a reasonable level of runtime code isolation for the code provided by the ETS Modules, so that the ETS code cannot access external resources or system resources other than through the Test Harness public interface functionality. The Test Harness must be able to manage and execute test code provided by more than one version of an ETS Module.
The Test Harness controls the processes and/or threads used for running the tests and is responsible for queuing tests for later execution if the current system resources do not allow all the requested tests to run at the same time.
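Since the ETS Module code itself must not manage processes or threads, all concurrency lives in the harness. A minimal sketch of the queuing behaviour, using a bounded thread pool whose internal queue holds tests that cannot run yet (pool size and test count are arbitrary):

```java
// Hedged sketch of Test Harness queuing: a fixed-size pool runs a limited
// number of tests concurrently; further submissions wait in the executor's
// queue instead of being rejected.
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

public class HarnessQueueSketch {
    public static void main(String[] args) throws InterruptedException {
        // At most 2 tests execute at once; the other submissions queue up.
        ExecutorService pool = Executors.newFixedThreadPool(2);
        AtomicInteger finished = new AtomicInteger();
        for (int i = 0; i < 5; i++) {
            pool.submit(() -> {
                // A real harness would run ETS code here, inside the
                // sandbox, and collect its results for the controller.
                finished.incrementAndGet();
            });
        }
        pool.shutdown();
        pool.awaitTermination(10, TimeUnit.SECONDS);
        System.out.println("tests finished: " + finished.get());
    }
}
```

A production harness would likely use per-module process isolation or a more elaborate scheduler, but the contract is the same: accept all requests, run what resources allow, queue the rest.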
Resource Resolver¶
This component is responsible for resolving the resource identifiers of the external resources required by the Executable Tests and retrieving them from the Internet as requested by the Test Harness. The Resource Resolver may keep cached copies of the retrieved resources for quick access. It must, however, fetch the resources from the original source if asked to do so by the Test Harness.
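The cache-with-forced-refresh contract above can be captured in a small sketch. The fetch function here is a counting stand-in for real HTTP retrieval; the class and method names are assumptions:

```java
// Hedged sketch of the Resource Resolver contract: cached copies are
// served by default, but a forced fetch bypasses the cache and goes back
// to the original source.
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

public class ResourceResolverSketch {
    private final Map<String, String> cache = new HashMap<>();
    private final Function<String, String> fetcher;

    ResourceResolverSketch(Function<String, String> fetcher) {
        this.fetcher = fetcher;
    }

    String resolve(String uri, boolean forceFetch) {
        if (forceFetch) cache.remove(uri); // Test Harness asked for a fresh copy
        return cache.computeIfAbsent(uri, fetcher);
    }

    public static void main(String[] args) {
        int[] fetches = {0};
        ResourceResolverSketch resolver = new ResourceResolverSketch(uri -> {
            fetches[0]++; // count simulated network retrievals
            return "content of " + uri;
        });
        resolver.resolve("https://example.org/schema.xsd", false); // fetch #1
        resolver.resolve("https://example.org/schema.xsd", false); // cached
        resolver.resolve("https://example.org/schema.xsd", true);  // fetch #2
        System.out.println("network fetches: " + fetches[0]);
    }
}
```

A real resolver would also need cache expiry and content-type handling, but the essential behaviour is the two code paths shown: cached by default, fresh on demand.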
Test Result Store¶
Keeps the results of the executed tests stored permanently and provides a program interface for querying them.
MIWP-5 review questions¶
- Peter Parslow: I am concerned that by specifying the design in any detail, we exclude potential existing solutions, and therefore increase development time & cost. In particular, would the CITE engine fit this spec? And if not, why? Both 'what is different?' and 'why would we specify something that CITE doesn't need?'
- Ilkka Rinne: your point of over-specifying the system is of course valid Peter. However, in my experience the very fact that customers do not give enough detail about what they want to contract is a typical reason for failed software projects. Evaluating the OGC Team Engine (which is used for running the CITE tests) against these requirements would certainly be a good exercise.
- Sven Böhme: The GDI-DE Testsuite uses the OGC TEAM Engine with CTL and TestNG. But we just test against services and XML files (metadata). No data schema tests are available. I don't know if CTL supports these test cases.
- Peter Parslow: (potentially an instance of the above) why specify that the API communicates using JSON structures, when everything about which it communicates is standardised on XML? (the metadata records, the datasets, the GetCapabilities responses & most of the other service responses). This is not a complaint about JSON per se, but about insisting on JSON in this context.
- Ilkka Rinne: XML could certainly be used, but these days it is much easier to find existing tools and libraries for building modern web applications on top of JSON APIs than XML APIs.