Re: [CF-metadata] Extension of Discrete Sampling Geometries for Simple Features

2017-02-22 Thread Chris Barker
>> [...] already uses (a). However, working with it, I'm pretty convinced that
>> it's the "wrong" choice:
>>
>> If you want to know how long a given row is, that is really easy with
>> (a), and almost as easy with (b) (involves two indexes and a subtraction).
>>
>> However, if you want to extract a particular row, (b) makes this really
>> easy -- you simply access the slice of the array you want. With (a) you
>> need to loop through the entire "length_of_rows" array (up to the row of
>> interest) and add up the values to find the slice you need. Not a huge
>> issue, but it is an issue. In fact, in my code to read ragged arrays in
>> netcdf, the first thing I do is pre-compute the index-to-each-row, so I can
>> then use that to access individual rows later -- if you are accessing via
>> OpenDAP, that's particularly helpful.
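>>
>> Roughly what I mean by that pre-computation, as an illustrative numpy
>> sketch (the array values here are made up):
>>
>>     import numpy as np
>>
>>     # Approach (a): a per-row length ("length_of_rows") variable plus a
>>     # flat data array holding all rows back to back.
>>     lengths = np.array([3, 5, 2, 4])
>>     data = np.arange(lengths.sum())
>>
>>     # Pre-compute the start index of each row once; this is effectively
>>     # approach (b), so any row can then be sliced directly.
>>     starts = np.concatenate(([0], np.cumsum(lengths)))
>>
>>     def row(i):
>>         # Slice row i out of the flat array without summing earlier rows.
>>         return data[starts[i]:starts[i + 1]]
>>
>>     print(row(2))  # the 2 values belonging to row 2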
>>
>> So -- (b) is clearly (to me) the "best" way to do it -- but is it worth
>> introducing a second way to handle ragged arrays in CF? I would think yes,
>> but that would be offset if:
>>
>>  - There is a bunch of existing library code that transparently handles
>> ragged arrays in netcdf (does netcdfJava have something? I'm pretty sure
>> Python doesn't -- certainly not in netCDF4)
>>
>>  - That existing lib code would be advantageous to leverage for code
>> reading features: I suspect that there will have to be enough custom code
>> that the ragged array bits are going to be the least of it.
>>
>> So I'm for the "new" way of representing ragged arrays.
>>
>> -CHB
>>
>> On Fri, Feb 3, 2017 at 11:41 AM, Bob Simons - NOAA Federal <
>> bob.sim...@noaa.gov> wrote:
>>
>> Then, isn't this proposal just the first step in the creation of a new
>> model and a new encoding of Simple Features, one that is "align[ed] ...
>> with as many other encoding standards in this space as is practical"? In
>> other words, yet another standard for Simple Features?
>>
>> If so, it seems risky to me to take just the first (easy?) step "to
>> support the use cases that have a compelling need today" and not solve the
>> entire problem. I know the CF way is to just solve real, current needs, but
>> in this case it seems to risk a head slap moment in the future when we
>> realize that, in order to deal with some new simple feature variant, we
>> should have done things differently from the beginning?
>>
>> And it seems odd to reject existing standards that have been so
>> painstakingly hammered out, in favor of starting the process all over
>> again.  We follow existing standards for other things (e.g., IEEE-754 for
>> representing floating point numbers in binary files), why can't we follow
>> an existing Simple Features standard?
>>
>> ---
>>
>> Rather than just be a naysayer, let me suggest a very different
>> alternative:
>>
>> There are several projects in the CF realm (e.g., this Simple Features
>> project, Discrete Sampling Geometry (DSG), true variable-length Strings,
>> ugrid(?)) which share a common underlying problem: how to deal with
>> variable-length multidimensional arrays: a[b][c], where the length of the c
>> dimension may be different for different b indices.
>>
>> DSG solved this (5 different ways!), but only for DSG.
>>
>> The Simple Features proposal seeks to solve the problem for Simple
>> Features.
>>
>> We still have no support for Unicode variable-length Strings.
>>
>> Instead of continuing to solve the variable-length problem a different
>> way every time we confront it, shouldn't we solve it once, with one small
>> addition to the standard, and then use that solution repeatedly?
>>
>> The solution could be a simple variant of one of the DSG solutions, but
>> generalized so that it could be used in different situations.
>>
>> An encoding standard and built-in support for variable-length data arrays
>> in netcdf-java/c would solve a lot of problems, now and in the future.
>>
>> Some work on this is already done: I think the netcdf-java API already
>> supports variable-length arrays when reading netcdf-4 files.
>>
>> For Simple Features, the problem would reduce to: store the feature
>> (using some specified existing standard like WKT or WKB) in a
>> variable-length array.
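>>
>> As a rough, purely illustrative sketch (not something from the proposal),
>> the netCDF4 Python library can already store WKT like that in a netCDF-4
>> file as a variable-length string variable (file and variable names here
>> are made up):
>>
>>     from netCDF4 import Dataset
>>
>>     with Dataset("features.nc", "w", format="NETCDF4") as nc:
>>         nc.createDimension("instance", 2)
>>         # The str dtype gives a variable-length (VLEN) string per instance.
>>         wkt = nc.createVariable("geometry_wkt", str, ("instance",))
>>         wkt[0] = "POINT (30 10)"
>>         wkt[1] = "LINESTRING (30 10, 10 30, 40 40)"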
>>

Re: [CF-metadata] Extension of Discrete Sampling Geometries for Simple Features

2017-02-18 Thread Blodgett, David
 Let's try this again. Sorry to spam everyone.

http://doodle.com/poll/yaherucx2w3cd9y6

On February 17, 2017 8:20:11 PM CST, "Whiteaker, Timothy L" <
whitea...@utexas.edu> wrote:
>
> Hi Dave,
>
> I get "This poll does not exist anymore" for both polls you sent.
>
>
>
> Tim Whiteaker
>
> Research Scientist
>
> The University of Texas at Austin
>
>
>
> *From:* CF-metadata [mailto:cf-metadata-boun...@cgd.ucar.edu] *On Behalf
> Of *David Blodgett
> *Sent:* Friday, February 17, 2017 2:08 PM
> *To:* CF Metadata
> *Subject:* Re: [CF-metadata] Extension of Discrete Sampling Geometries
> for Simple Features
>
>
>
> My apologies, I forgot to turn on time zone support in the poll below.
> Please use this one instead. http://doodle.com/poll/eikarnt35tdm7igd
>
>
>
> On Feb 17, 2017, at 1:22 PM, David Blodgett <dblodg...@usgs.gov> wrote:
>
>
>
> All,
>
>
>
> I haven’t heard much follow up, but here’s a doodle to coordinate a phone
> conversation about this. I think we have west-coast US participants and EU
> participants, so I chose times mid to late morning for me (midwest US).
>
>
>
> http://doodle.com/poll/eikarnt35tdm7igd
>
>
>
> Will make a call once a few people have expressed interest and we have a
> clear day/time.
>
>
>
> Regards,
>
>
>
> - Dave
>
>
>
> On Feb 6, 2017, at 11:29 AM, David Blodgett <dblodg...@usgs.gov> wrote:
>
>
>
> Dear CF,
>
>
>
> I want to follow up on the conversation here with an alternative approach
> suggested off list, primarily between Jonathan and me. For this, I’m going
> to focus on which use cases are satisfied and on how the proposal can be
> simplified by not supporting certain use cases. The changes below are
> largely driven by a desire to better align this proposal with the technical
> details of the prior art that is CF.
>
>
>
> If we:
>
> 1) don’t support node sharing, we can remove the complication of
> node-to-coordinate indexing / indirection, simplifying the proposal pretty
> significantly.
>
> 2) don’t use “break values” to indicate the separation between multi-part
> geometries and polygon holes, we end up with a data model with an extra
> dimension, but the NetCDF dimensions align with the natural dimensions of
> the data.
>
> 3) use “count” instead of a “start pointer” approach, we are better
> aligned with the existing DSG contiguous ragged array approach.
>
>
>
> Coming back to the three directions we could take this proposal, from my
> cover letter on February 2nd:
>
> 1. Direct use of Well-Known Text (WKT). In this approach, well-known
> text strings would be encoded using character arrays, following a contiguous
> ragged array approach to index the character array by geometry (or instance
> in DSG parlance).
>
> 2. Implement the WKT approach using a NetCDF binary array. In this
> approach, the well-known text separators (brackets, commas, and spaces) for
> multipoint, multiline, multipolygon, and polygon holes would be encoded as
> break-type separator values, e.g. -1 for multi-part boundaries and -2 for
> holes.
>
> 3. Implement the fundamental dimensions of geometry data in NetCDF.
> In this approach, additional dimensions and variables along those
> dimensions would be introduced to represent geometries, geometry parts,
> geometry nodes, and unique (potentially shared) coordinate locations for
> nodes to reference.
>
> The alternative I’m outlining here moves in the direction of 3. We had
> originally discounted it because it becomes very verbose and seems overly
> complicated if support for coordinate sharing is a requirement. If the
> three simplifications described above are used, then the third approach
> seems more tenable.
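>
> To make that concrete, here is a rough numpy sketch of how such counts
> could be used to pull one geometry back out of flat node arrays (the names
> and values are illustrative only, not the proposed encoding):
>
>     import numpy as np
>
>     # Two geometries: the first has two parts, the second has one.
>     parts_per_geometry = np.array([2, 1])   # count along a geometry dimension
>     nodes_per_part = np.array([4, 4, 5])    # count along a part dimension
>     node_x = np.arange(13, dtype=float)     # flat node coordinate arrays
>     node_y = np.arange(13, dtype=float)
>
>     # Contiguous-ragged-array style: turn counts into start indexes, twice.
>     part_start = np.concatenate(([0], np.cumsum(nodes_per_part)))
>     geom_start = np.concatenate(([0], np.cumsum(parts_per_geometry)))
>
>     def geometry(i):
>         # Return geometry i as a list of (x, y) node arrays, one per part.
>         return [(node_x[part_start[p]:part_start[p + 1]],
>                  node_y[part_start[p]:part_start[p + 1]])
>                 for p in range(geom_start[i], geom_start[i + 1])]
>
>     print(len(geometry(0)))  # -> 2 parts in the first geometry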
>
>
>
> Jonathan has also suggested the following (these are in reaction to the CDL
> in my letter from February 2nd):
>
> 1) Rename geom_coordinates as node_coordinates, for consistency with UGRID.
>
> 2) Omit node_dimension. This is redundant, since the dimension can be
> found by examining the node coordinate variables.
>
> 3) Prescribe numerous “codes” and assumptions in the specification instead
> of letting them be described with attribute values.
>
> 4) It would be more consistent with CF and UGRID to use a single container
> variable to hang all the topology/geometry information from.
>
>
>
> All of which I, personally, am happy to accept if others don’t object.
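>
> As a purely illustrative sketch of what suggestions 1 and 4 could look like
> when written out with the netCDF4 Python library (none of these variable or
> attribute names are settled; they are stand-ins for discussion):
>
>     from netCDF4 import Dataset
>
>     with Dataset("geometry_sketch.nc", "w", format="NETCDF4") as nc:
>         nc.createDimension("instance", 2)
>         nc.createDimension("node", 7)
>
>         node_count = nc.createVariable("node_count", "i4", ("instance",))
>         node_count[:] = [3, 4]   # "count" style: nodes per geometry
>
>         x = nc.createVariable("x", "f8", ("node",))
>         y = nc.createVariable("y", "f8", ("node",))
>         x[:] = [0, 1, 2, 10, 11, 12, 13]
>         y[:] = [0, 1, 0, 10, 11, 10, 9]
>
>         # A single container variable to hang the geometry information from,
>         # with a UGRID-style node_coordinates attribute (suggestion 1).
>         container = nc.createVariable("geometry_container", "i4", ())
>         container.geom_type = "multiline"
>         container.node_coordinates = "x y"
>         container.node_count = "node_count"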
>
>
>
> A couple other suggestions from Jonathan I want to discuss a bit more:
>
> 1) Rename geometry as topology and geom_type as topology_type.
>
> While I’d be open to something other than geom, topology is
> od

Re: [CF-metadata] Extension of Discrete Sampling Geometries for Simple Features

2017-02-03 Thread David Blodgett
Dear Bob,

I’ll just take these in line.

1) noted. We have been trying to figure out what to do with the point 
featureType and I think leaving it more or less alone is a viable path forward. 

2) This is not an exact replica of WKT, but rather a similar approach to WKT. 
As I stated, we have followed the ISO simple features data model and well known 
text feature types in concept, but have not used the same standardization 
formalisms. We aren’t advocating for supporting “all of” any standard but are 
rather attempting to support the use cases that have a compelling need today 
while aligning this with as many other encoding standards in this space as is 
practical. Hopefully that answers your question, sorry if it’s vague.

3) The google doc linked in my response contains the encoding we are proposing 
as a starting point for conversation: http://goo.gl/Kq9ASq. I want to stress 
that this is a starting point for discussion; I expect that this proposal will 
change drastically before we’re done.

4) We absolutely envision tools doing what you say: converting to/from standard 
spatial formats and NetCDF-CF geometries. We intend to introduce an R and a 
Python implementation that do exactly that, in whatever form this standard 
takes in the end. R and Python were chosen because the team that brought this 
together is familiar with those two languages; additional implementations 
would be more than welcome.

5) We do include a “geometry” featureType similar to the “point” featureType, 
hence our difficulty with what to do with the “point” featureType. You are 
correct: there are lots of non-timeSeries applications to be solved, and this 
proposal does intend to support them (within the existing DSG constructs).

Thanks for your questions, hopefully my answers close some gaps for you.

- Dave

> On Feb 3, 2017, at 10:47 AM, Bob Simons - NOAA Federal  
> wrote:
> 
> 1) There is a vague comment in the proposal about possibly changing the point 
> featureType. Please don't, unless the changes don't affect current uses of 
> Point. There are already 1000's of files that use it. If this new system 
> offers an alternative, then fine, it's an alternative. One of the most 
> important and useful features of a good standard is backwards compatibility. 
> 
> 2) You advocate "Implement the WKT approach using a NetCDF binary array." Is 
> this system then an exact encoding of WKT, neither a subset nor a superset?  
> "Simple Features" are often not simple. 
> If it is WKT (or something else), what is the standard you are following to 
> describe the Simple Features (e.g.,  ISO/IEC 13249-3:2016 and ISO 19162:2015)?
> Does your proposal deviate in any way from the standard's capabilities?
> Do you advocate following the entire WKT standard, e.g., supporting all the 
> feature types that WKT supports?
> 
> 3) Since you are not using the WKT encoding, but creating your own, where is 
> the definition of the encoding system you are using? 
> 
> 4) This is a little out of CF scope, but:
> Do you envision tools, notably, netcdf-c/java, having a writer function that 
> takes in WKT and encodes the information in a file, and having a reader 
> function that reads the file and returns WKT? Or is it your plan that the 
> encoding/ decoding is left to the user?  
> 
> 5) This proposal is for "Simple Features plus Time Series" (my phrase not 
> yours). But aren't there lots of other uses of Simple Features? Will there be 
> other proposals in the future for "Simple Features plus X" and "Simple 
> Features plus Y"? If so, will CF eventually become a massive document where 
> Simple Features are defined over and over again, but in different contexts? 
> If so, wouldn't a better solution be to deal with Simple Features separately 
> (as Postgres does by making a geometric data type?), and then add "Simple 
> Features plus Time Series" as the first use of it?
> 
> Thanks for answering these questions.
> Please forgive me if I missed parts of your proposal that answer these 
> questions. 
> 
> 
> On Thu, Feb 2, 2017 at 5:57 AM,  > wrote:
> Date: Thu, 2 Feb 2017 07:57:36 -0600
> From: David Blodgett >
> To: >
> Subject: [CF-metadata] Extension of Discrete Sampling Geometries for
> Simple  Features
> Message-ID: <224c2828-7212-449f-8c2c-97d903f6b...@usgs.gov 
> >
> Content-Type: text/plain; charset="utf-8"
> 
> Dear CF Community,
> 
> We are pleased to submit this proposal for your consideration and review. The 
> cover letter we've prepared below provides some background and explanation 
> for the proposed approach. The google doc here <http://goo.gl/Kq9ASq> is an 
> excerpt of the CF specification with track changes turned on.

Re: [CF-metadata] Extension of Discrete Sampling Geometries for Simple Features

2017-02-03 Thread Bob Simons - NOAA Federal
1) There is a vague comment in the proposal about possibly changing the
point featureType. Please don't, unless the changes don't affect current
uses of Point. There are already 1000's of files that use it. If this new
system offers an alternative, then fine, it's an alternative. One of the
most important and useful features of a good standard is backwards
compatibility.

2) You advocate "Implement the WKT approach using a NetCDF binary array."
Is this system then an exact encoding of WKT, neither a subset nor a
superset?  "Simple Features" are often not simple.
If it is WKT (or something else), what is the standard you are following to
describe the Simple Features (e.g.,  ISO/IEC 13249-3:2016 and ISO
19162:2015)?
Does your proposal deviate in any way from the standard's capabilities?
Do you advocate following the entire WKT standard, e.g., supporting all the
feature types that WKT supports?

3) Since you are not using the WKT encoding, but creating your own, where
is the definition of the encoding system you are using?

4) This is a little out of CF scope, but:
Do you envision tools, notably, netcdf-c/java, having a writer function
that takes in WKT and encodes the information in a file, and having a
reader function that reads the file and returns WKT? Or is it your plan
that the encoding/ decoding is left to the user?

5) This proposal is for "Simple Features plus Time Series" (my phrase not
yours). But aren't there lots of other uses of Simple Features? Will there
be other proposals in the future for "Simple Features plus X" and "Simple
Features plus Y"? If so, will CF eventually become a massive document where
Simple Features are defined over and over again, but in different contexts?
If so, wouldn't a better solution be to deal with Simple Features
separately (as Postgres does by making a geometric data type?), and then
add "Simple Features plus Time Series" as the first use of it?

Thanks for answering these questions.
Please forgive me if I missed parts of your proposal that answer these
questions.


On Thu, Feb 2, 2017 at 5:57 AM,  wrote:

> Date: Thu, 2 Feb 2017 07:57:36 -0600
> From: David Blodgett 
> To: 
> Subject: [CF-metadata] Extension of Discrete Sampling Geometries for
> Simple  Features
> Message-ID: <224c2828-7212-449f-8c2c-97d903f6b...@usgs.gov>
> Content-Type: text/plain; charset="utf-8"
>
> Dear CF Community,
>
> We are pleased to submit this proposal for your consideration and review.
> The cover letter we've prepared below provides some background and
> explanation for the proposed approach. The google doc here <
> http://goo.gl/Kq9ASq> is an excerpt of the CF specification with track
> changes turned on. Permissions for the document allow any google user to
> comment, so feel free to comment and ask questions in line.
>
> Note that I’m sharing this with you with one issue unresolved. What to do
> with the point featureType? Our draft suggests that it is part of a new
> geometry featureType, but it could be that we leave it alone and introduce
> a geometry featureType. This may be a minor point of discussion, but we
> need to be clear that this is an issue that still needs to be resolved in
> the proposal.
>
> Thank you for your time and consideration.
>
> Best Regards,
>
> David Blodgett, Tim Whiteaker, and Ben Koziol
>
> Proposed Extension to NetCDF-CF for Simple Geometries
>
> Preface
>
> The proposed addition to NetCDF-CF introduced below is inspired by a
> pre-existing data model governed by OGC and ISO as ISO 19125-1. More
> information on Simple Features may be found here:
> https://en.wikipedia.org/wiki/Simple_Features. To the knowledge of the
> authors, it is consistent with ISO 19125-1 but has not been specified using
> the formalisms of OGC or ISO. The language used attempts to hold true to
> NetCDF-CF semantics while not conflicting with the existing standards
> baseline. While this proposal does not support the entire scope of the
> simple features ecosystem, it does support the core data types in most
> common use around the community.
>
> The other existing standard to mention is the UGRID convention <
> http://ugrid-conventions.github.io/ugrid-conventions/>. The authors have
> experience reading and writing UGRID and have designed the proposed
> structure in a way that is inspired by and consistent with it.
>
> Terms and Definitions
>
> (Taken from OGC 06-103r4 OpenGIS Implementation Specification for
> Geographic information - Simple feature access - Part 1: Common
> architecture.)
>
> Feature: Abstraction of real world phenomena - typically a geospatial
> abstraction with associated descriptive attributes.
> Simple Feature: A feature with all geometric attributes described
> piecewise by straight line or planar interpolation between point sets.
> Geometry (geometric complex): A set of disjoint geometric primitives - one
> or more