Benchmarking of GIS software on OGC Services and Large Datasets

Idea #103

Stage: Active

Campaign: Data Availability, Information Quality, Accountability...

It would be useful to have regular benchmarking exercises performed on both commercial and open source GIS software, on 32-bit and 64-bit Windows and Linux platforms. Such benchmarks would include serving data via OGC Web Services (http://www.opengeospatial.org/standards) such as WMS, WFS, WCS, and SOS. This exercise could be based on the techniques used in the WMS Shootout at FOSS4G 2009.
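To make the service benchmark concrete, here is a minimal timing sketch in the spirit of that exercise, assuming a hypothetical WMS endpoint; the URL, layer name, bounding box, and concurrency level below are placeholders, not details from the original Shootout.

    # Minimal WMS GetMap load-test sketch (hypothetical endpoint and layer).
    # Reports average latency and rough throughput for N concurrent requests.
    import time
    from concurrent.futures import ThreadPoolExecutor

    import requests

    WMS_URL = "http://example.gov/geoserver/wms"  # placeholder endpoint
    PARAMS = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": "demo:naip",            # placeholder layer name
        "STYLES": "",
        "SRS": "EPSG:4326",
        "BBOX": "-84.0,33.0,-83.0,34.0",  # placeholder extent
        "WIDTH": "800",
        "HEIGHT": "600",
        "FORMAT": "image/png",
    }

    def one_request(_):
        # Time a single GetMap request; return (seconds, bytes received).
        start = time.perf_counter()
        resp = requests.get(WMS_URL, params=PARAMS, timeout=60)
        resp.raise_for_status()
        return time.perf_counter() - start, len(resp.content)

    def run_benchmark(total_requests=100, concurrency=8):
        wall_start = time.perf_counter()
        with ThreadPoolExecutor(max_workers=concurrency) as pool:
            results = list(pool.map(one_request, range(total_requests)))
        wall = time.perf_counter() - wall_start
        latencies = [seconds for seconds, _ in results]
        print(f"requests: {total_requests}  concurrency: {concurrency}")
        print(f"avg latency: {sum(latencies) / len(latencies):.3f} s")
        print(f"throughput:  {total_requests / wall:.1f} requests/s")

    if __name__ == "__main__":
        run_benchmark()

A full exercise would add standardized datasets, mixed request sizes, and warm- versus cold-cache runs against each server, but the core measurement is the same: identical requests timed against each package.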

Desktop GIS systems could also be benchmarked on processing large datasets, such as a 400 GB ASCII LiDAR dataset, or on merging and reprojecting 1 m NAIP imagery for an entire state.
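As a sketch of the large-dataset side, the following times a mosaic-and-reproject job using GDAL's Python bindings; the input tile directory, output names, and target projection (EPSG:5070, CONUS Albers) are illustrative assumptions rather than part of the proposal.

    # Timing sketch for merging and reprojecting imagery tiles (e.g., NAIP)
    # with GDAL's Python bindings. Paths and the target SRS are placeholders.
    import glob
    import time

    from osgeo import gdal

    gdal.UseExceptions()

    tiles = sorted(glob.glob("naip_tiles/*.tif"))  # placeholder input tiles

    start = time.perf_counter()

    # Build a virtual mosaic so the tiles can be treated as one dataset.
    vrt = gdal.BuildVRT("naip_mosaic.vrt", tiles)
    vrt = None  # close/flush the VRT before warping from it

    # Reproject the whole mosaic in one pass.
    gdal.Warp(
        "naip_state_albers.tif",
        "naip_mosaic.vrt",
        dstSRS="EPSG:5070",          # example target projection
        resampleAlg="bilinear",
        multithread=True,
        creationOptions=["TILED=YES", "COMPRESS=DEFLATE", "BIGTIFF=YES"],
    )

    print(f"merge + reproject of {len(tiles)} tiles took "
          f"{time.perf_counter() - start:.1f} s")

Recording wall-clock time, peak memory, and whether the job completes at all would give the comparison points described above.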

The results of these benchmarks could be made publicly available or, where software license terms restrict publication of benchmark results, shared across the Department of the Interior.

Submitted by: doug_newcomb

Feedback Score: 6 votes (voting disabled)

Vote Activity: 7 upvotes, 1 downvote
Comments

  1. Comment
    Linda

    I'm starting to sound like a broken record (yes, I'm part of the "vinyl" generation). I really want to consider your IT ideas, but first I need to understand the terms in "old foggy speak". So far my IT questions have been answered with more IT language, and I'm frustrated.

  2. Comment
    Linda

    Okay, I'm a "fogy" who's "foggy", and still confused!

  3. Comment
    doug_newcomb (Idea Submitter)

    Linda,

    All I'm proposing is that we test:

    1) How fast different GIS server software packages can perform the same task (serving up data via the open standards that DOI has already endorsed in the 2007 Geospatial Blueprint).

    2) How well the software can digest large geospatial datasets. Think landscape analysis at a multi-state scale: can it be processed as one dataset, or does it need to be broken into smaller pieces first? How fast can the task be accomplished? (A rough chunked-processing sketch follows this list.)

    3) Make the results available, at least to the bureaus, if not publicly.
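
    As a rough illustration of point 2, the sketch below walks a large raster block by block instead of loading it whole, and reports the elapsed time; the file name, block size, and per-block statistic are hypothetical stand-ins for a real analysis step.

        # Chunked-processing sketch: scan a large raster in blocks and time it.
        # The input path and block size are placeholders; the running maximum
        # stands in for a real analysis step.
        import time

        import numpy as np
        from osgeo import gdal

        gdal.UseExceptions()

        BLOCK = 2048  # pixels per side of each processing window (placeholder)

        ds = gdal.Open("statewide_mosaic.tif")  # placeholder large raster
        band = ds.GetRasterBand(1)
        xsize, ysize = band.XSize, band.YSize

        start = time.perf_counter()
        running_max = -np.inf

        for yoff in range(0, ysize, BLOCK):
            rows = min(BLOCK, ysize - yoff)
            for xoff in range(0, xsize, BLOCK):
                cols = min(BLOCK, xsize - xoff)
                block = band.ReadAsArray(xoff, yoff, cols, rows)
                running_max = max(running_max, float(block.max()))

        print(f"scanned {xsize} x {ysize} pixels in "
              f"{time.perf_counter() - start:.1f} s; max = {running_max}")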

    An analogy would be the same function that Consumer Reports performs for commodity products, from toothpaste to automobiles; the tests in this case would be for geospatial software.

    You cannot know what the best tool for a job is unless you test the tools that are available for it. If you only know about one tool, your science and/or productivity will be limited by the limits of that one tool. In other words, if all you have is a hammer, everything looks like a nail.

    I'm from the "vinyl" generation as well :-)

  4. Comment
    Linda

    Got it. Good to know "My Generation" hasn't faded away!

  5. Comment
    doug_newcomb (Idea Submitter)

    This is a comment passed along to me by Arnulf Christl from the OSGeo discuss mailing list:

    ---------------

    Doug,

    Thank you for this information.

    One minor clarification on terminology (I will never tire): using the wording "commercial and open source" to differentiate proprietary from free/open license models is misleading, as all Open Source software can also be used in commercial contexts and is thus also "commercial software".

    This has recently been clarified by the US Department of Defense in a document [1], attachment 2, page 5, §2 a):

    "In almost all cases, OSS meets the definition of “commercial computer software” and shall be given appropriate statutory preference in accordance with 10 USC 2377 (reference (b)) (see also FAR 2.101(b), 12.000, 12.101 (reference (c)); and DFARS 212.212, and 252.227-7014(a)(1) (reference (d)))."

    (I love to cite those guys, they manage to make everything look dead serious :-)

    The correct term for differentiating free and open source license models from proprietary ones is "proprietary", and nothing but.

    Best regards,

    Arnulf.

    [1] http://cio-nii.defense.gov/sites/oss/2009OSS.pdf

    ----------------------------------