> >... but computing min & max on the fly can also be very expensive.
> >We have aggregated model output datasets where each variable is more
> >than 1TB!
> Sure, I can see that that's useful metadata about the dataset, and that
> there's value in caching it somewhere. I just don't think it belongs with
> the metadata inside the netcdf file. What's the use case for storing it
> there?
Dear all,
that may be an issue of "style", or, more technically speaking, of the way you
set up your system(s). I do think this information is useful as soon as you take
a file out of an interoperable context. However, it's a very good and valid
point that this information can (very) easily become corrupted. Thus it may be
good to define some way of marking "fragile" metadata, i.e. metadata that can
be invalidated by slicing or aggregating data from a file -- maybe even from
several files. In fact, this is related to the issue of tracking metadata
through the data model -- something that was brought up in the Trac ticket
but deferred to the implementation...
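[Editor's note: the "fragile metadata" idea above could be sketched roughly as
follows. This is a hypothetical illustration only, not an existing netCDF or CF
feature; the attribute names and helper functions are invented for the sketch.]

```python
# Sketch: attributes whose values depend on the full data (e.g. a cached
# min/max) are marked "fragile" and dropped whenever the data is subset,
# so a stale range can never be carried along silently.
import numpy as np

# Hypothetical set of attribute names considered fragile under subsetting.
FRAGILE_ATTRS = {"actual_min", "actual_max"}

def make_variable(data):
    """Bundle an array with cached range attributes (the expensive-to-
    recompute metadata discussed in the thread)."""
    arr = np.asarray(data, dtype=float)
    return {"data": arr,
            "attrs": {"actual_min": float(arr.min()),
                      "actual_max": float(arr.max())}}

def subset(var, sl):
    """Slice a variable; fragile attributes are discarded because they
    may no longer describe the sliced data."""
    kept = {k: v for k, v in var["attrs"].items() if k not in FRAGILE_ATTRS}
    return {"data": var["data"][sl], "attrs": kept}

var = make_variable([3.0, -1.0, 7.0, 2.0])
part = subset(var, slice(0, 2))
# part["attrs"] no longer claims a (possibly wrong) min/max for the slice.
```

The point of the sketch is only the policy: a writer tool that knows which
attributes are fragile can drop or recompute them instead of copying them
verbatim into a derived file.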
Cheers,
Martin
------------------------------------------------------------------------------------------------
Forschungszentrum Juelich GmbH
52425 Juelich
------------------------------------------------------------------------------------------------
_______________________________________________
CF-metadata mailing list
[email protected]
http://mailman.cgd.ucar.edu/mailman/listinfo/cf-metadata