Discussion #2599

QoS: GetCoverage output size

Added by Jari Reini almost 5 years ago. Updated almost 5 years ago.

Status: Feedback
Priority: Normal
Assignee: -

Description

How big must the output be?

History

#1 Updated by Peter Baumann almost 5 years ago

Likely this can vary greatly: from 1-D time series extracted from 3-D x/y/t image time-series "datacubes", through single 2-D x/y areas, to larger or multiple requests from people who want to generate maps themselves. Generally, the more processing that is possible on the server, the smaller the data returned can be ("what you get is what you need"); one example is a band ratio computation such as NDVI: get back just one layer instead of the full hyperspectral stack.
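Just to make that concrete, a minimal client-side sketch; the endpoint, coverage id and band names (red, nir) are placeholders, and the second variant assumes the server supports the WCS Processing extension (WCPS):

    import requests  # third-party HTTP client

    WCS_ENDPOINT = "https://example.org/wcs"   # placeholder endpoint

    # Option 1: plain WCS 2.0 range subsetting -- fetch only the two bands
    # needed for an NDVI computation instead of the full hyperspectral stack.
    subset_params = {
        "SERVICE": "WCS",
        "VERSION": "2.0.1",
        "REQUEST": "GetCoverage",
        "COVERAGEID": "example_scene",         # placeholder coverage id
        "RANGESUBSET": "red,nir",              # placeholder band names
        "FORMAT": "image/tiff",
    }
    r = requests.get(WCS_ENDPOINT, params=subset_params, timeout=300)

    # Option 2: where WCPS is supported, compute the band ratio server-side
    # and get back a single layer.
    wcps_params = {
        "SERVICE": "WCS",
        "VERSION": "2.0.1",
        "REQUEST": "ProcessCoverages",
        "query": (
            "for c in (example_scene) "
            'return encode((c.nir - c.red) / (c.nir + c.red), "image/tiff")'
        ),
    }
    r = requests.get(WCS_ENDPOINT, params=wcps_params, timeout=300)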

BTW, a caveat: our experience has been that after a short familiarization period people lose their sense of download sizes and start to request data freely, without regard for the amount of data to be transported. It might be useful for providers to limit download sizes and to standardize a canonical error message for this case.
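On the provider side, a size cap could be as simple as the sketch below; the limit, the size estimate and the exception wording are all illustrative, not a proposal for the canonical message itself:

    # Illustrative server-side guard; the limit and names are assumptions.
    MAX_RESPONSE_BYTES = 2 * 1024**3    # e.g. a provider-chosen 2 GB cap

    def estimated_size(width, height, bands, bytes_per_sample):
        """Rough uncompressed size of a gridded GetCoverage response."""
        return width * height * bands * bytes_per_sample

    def check_size(width, height, bands, bytes_per_sample):
        """Reject over-sized requests before any data is read from storage."""
        size = estimated_size(width, height, bands, bytes_per_sample)
        if size > MAX_RESPONSE_BYTES:
            # In a real service this would be mapped to an OWS ExceptionReport.
            raise ValueError(
                f"Requested subset is ~{size / 1024**2:.0f} MB, which exceeds "
                f"the provider limit of {MAX_RESPONSE_BYTES / 1024**2:.0f} MB; "
                "please trim the subset or reduce the number of bands."
            )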

 

#2 Updated by James Passmore almost 5 years ago

  • Status changed from New to Feedback

I'm not sure if the question here is related to a minimum size, a maximum size, or both.

We did discuss some time back that for some large coverages (and coverage collections), trying to retrieve an entire coverage with a single GetCoverage request would be impossible, so the provider may reasonably block such a request. As far as a maximum size goes, it is therefore entirely dependent on what the service provider deems reasonable.

As far as a minimum size goes, I think we can request the value of a single pixel / voxel / ...
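For illustration, a GetCoverage narrowed down to one grid cell with trim subsets; the endpoint, coverage id, axis labels and coordinates are placeholders:

    import requests

    params = {
        "SERVICE": "WCS",
        "VERSION": "2.0.1",
        "REQUEST": "GetCoverage",
        "COVERAGEID": "example_coverage",                  # placeholder
        # Two trims that bound a single cell; the axis labels and coordinates
        # are placeholders for whatever DescribeCoverage reports.
        "SUBSET": ["E(500000,500010)", "N(6700000,6700010)"],
        "FORMAT": "image/tiff",
    }
    # requests repeats the SUBSET parameter once per list element,
    # which is the KVP form WCS 2.0 expects for multiple subsets.
    r = requests.get("https://example.org/wcs", params=params, timeout=60)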

 

#3 Updated by Mikko Visa almost 5 years ago

Currently the biggest data set we have is the HIRLAM NWP model. When requested with all parameters, levels (pressure) and times, it is several gigabytes. There are also some grid points interpolated in between so that the result is a regular grid. Still, we haven't seen the need to limit the size, i.e. one can download the whole model at once.

Just wondering what capabilities are available in WCS to ask whether the needed data is available, so that people wouldn't download "just in case"?
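GetCapabilities lists which coverages a server offers, and DescribeCoverage returns the envelope, grid axes and range fields of one coverage without transferring any pixel data, so a client can check whether the needed data is there before downloading. A small sketch (endpoint and coverage id are placeholders):

    import requests
    import xml.etree.ElementTree as ET

    WCS_ENDPOINT = "https://example.org/wcs"     # placeholder endpoint

    resp = requests.get(WCS_ENDPOINT, params={
        "SERVICE": "WCS",
        "VERSION": "2.0.1",
        "REQUEST": "DescribeCoverage",
        "COVERAGEID": "hirlam_example",          # placeholder coverage id
    }, timeout=60)
    resp.raise_for_status()

    # Print the GML envelope corners so a user can see the extent before
    # deciding whether to issue a (possibly multi-gigabyte) GetCoverage.
    root = ET.fromstring(resp.content)
    for elem in root.iter():
        if elem.tag.endswith("lowerCorner") or elem.tag.endswith("upperCorner"):
            print(elem.tag, elem.text)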

#4 Updated by Jukka Rahkonen almost 5 years ago

Based on preliminary tests I have done with MapServer and GeoServer, it seems that those servers can handle individual 1-2 gigabyte requests quite well. Both will probably have problems if the request size is much bigger. Also, the total throughput with GeoServer was decreasing when I made several concurrent requests (1.6 GB each): from 640 MB per minute with one concurrent request to 160 MB per minute with four. I am sure that hardware, and especially the amount of memory, has a very big influence, so do not look at the absolute numbers.
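For anyone who wants to repeat this kind of measurement, a rough timing sketch; the URL and concurrency level are placeholders:

    import time
    from concurrent.futures import ThreadPoolExecutor

    import requests

    # Placeholder: a GetCoverage URL that returns roughly 1-2 GB.
    COVERAGE_URL = ("https://example.org/wcs?SERVICE=WCS&VERSION=2.0.1"
                    "&REQUEST=GetCoverage&COVERAGEID=big_coverage")
    CONCURRENCY = 4

    def download(url):
        """Stream one response and return the number of bytes received."""
        total = 0
        with requests.get(url, stream=True, timeout=3600) as r:
            r.raise_for_status()
            for chunk in r.iter_content(chunk_size=1024 * 1024):
                total += len(chunk)
        return total

    start = time.monotonic()
    with ThreadPoolExecutor(max_workers=CONCURRENCY) as pool:
        sizes = list(pool.map(download, [COVERAGE_URL] * CONCURRENCY))
    elapsed = time.monotonic() - start

    # Aggregate throughput in MB per minute, comparable to the figures above.
    print(sum(sizes) / 1024**2 / (elapsed / 60), "MB/min")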

The requirement "the download service shall maintain a sustained response greater than 0,5 Megabytes per second or greater than 500 Spatial Objects per second", as defined in the Quality of Service requirements, amounts to only 30 MB per minute, so it is not a hard requirement to meet if we use the same for WCS. What makes the servers jam is a large number of concurrent, large requests that eat all the available memory.

 
