Ok. /Stefan

> On 26 Aug 2016, at 16:34, Jana Mendrok <[email protected]> wrote:
> 
> Hi,
> 
> thanks for the input, Stefan & Patrick!
> 
> I favor Patrick's suggestion to make more rigid scat_data checks part of 
> cloudbox_checkedCalc (mainly because scat_data along with pnd_field and 
> cloudbox_limits defines the cloudbox or, better said, the interface to the 
> scatt solvers. Having some tests already done and not breaking controlfiles 
> are nice additions.). If we really want a separate 
> scat_data_checkedCalc (or so), we can still split that off later anyway.
> 
> So, i'll add some code, with default NaN/negative-value and scat matrix 
> norm checks, but with a "know-what-i'm-doing" switch-off option.
> 
> thanks & wishes,
> Jana
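[Editorial aside: a minimal sketch of what such a check could look like, in plain Python with hypothetical names; the actual ARTS implementation is in C++ and not shown here. The idea is to scan extinction and absorption data for NaN and negative entries, with a switch-off option for users who know what they are doing.]

```python
# Hypothetical sketch (plain Python, not the ARTS C++ implementation):
# scan extinction and absorption data for NaN and negative entries,
# with a "know-what-i'm-doing" switch to turn the check off.
import math

def check_scat_data(ext_data, abs_data, do_check=True):
    """Raise ValueError on NaN or negative entries unless do_check is False."""
    if not do_check:  # expert switch-off option
        return
    for name, values in (("ext_data", ext_data), ("abs_data", abs_data)):
        for i, v in enumerate(values):
            if math.isnan(v):
                raise ValueError(f"{name}[{i}] is NaN")
            if v < 0:
                raise ValueError(f"{name}[{i}] is negative ({v})")

# A particle whose T-matrix computation partly failed:
bad_ext = [1.2e-9, float("nan"), -3.0e-11]
try:
    check_scat_data(bad_ext, [1.0e-10])
except ValueError as err:
    print("check failed:", err)
```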
> 
> On Fri, Aug 26, 2016 at 3:30 PM, Patrick Eriksson 
> <[email protected]> wrote:
> Hi Stefan,
> 
> In general I agree to not put all checks in the same function. But this is 
> not a clear-cut case. The size of scat_data shall be consistent with 
> pnd_field, which in turn should be consistent with cloudbox_limits. So there 
> is no clear place to separate scat_data checks from the other cloudbox checks.
> 
> In fact cloudbox_checkedCalc already contains some checks of scat_data, 
> which makes sense to me. scat_data, pnd_field and cloudbox_limits are the 
> main "cloudbox variables".
> 
> Another reason is that if we introduce scat_data_check, we must add it to a 
> number of ARTS WSMs, and modify many, many cfiles. I personally would 
> probably need to modify about 50 scripts if scat_dataCheck became mandatory.
> 
> Bye,
> 
> Patrick
> 
> On 2016-08-26 13:38, Stefan Buehler wrote:
> Hi all,
> 
> I agree with Patrick that a mandatory check method is the best
> solution. :-)
> 
> But modernising scat_dataCheck and making it mandatory (via a
> _checked flag) seems cleaner to me than including it in the cloudbox
> check. Better not to entangle issues that can be regarded separately.
> 
> 
> These check functions seem to have evolved as a general strategy, I
> think they are quite intuitive and user friendly. More so if their
> scope is limited and clear, less if there is a single huge check
> function that does all kinds of things.
> 
> All the best,
> 
> Stefan
> 
> P.s.: Jana, I think this is really great work, and very timely.
> Oliver and I spent some time yesterday looking at your test case and
> debugging the scat data merging method. So, the comparison of the
> different solvers has already been fruitful to squash bugs.
> 
> On 26 Aug 2016, at 08:59, Patrick Eriksson
> <[email protected]> wrote:
> 
> Hi Jana,
> 
> I did not have scat_dataCheck active in my memory. I think this is
> a lesson that non-mandatory stuff will be forgotten. To avoid nasty
> errors and incorrect results, we should make most basic checks
> mandatory.
> 
> My suggestion is then to extend cloudbox_checkedCalc. My view on
> cloudbox_checkedCalc is that the tests we discuss are in fact
> inside the scope of the WSM. So just strange that they are not
> already there!
> 
> (If some tests turn out to be computationally demanding, then I
> prefer to have option flags of type "I know what I am doing" to
> deactivate the checks.)
> 
> Regarding normalisation, how big is the difference between
> quadrature rules? 1%, 10% or 100%? It seems reasonable to at least
> check that the normalisation is OK within a factor of 2. (With an
> option to deactivate this, if you use a solver that checks this
> anyhow.)
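[Editorial aside: a factor-of-2 tolerance check of the kind described here could look roughly like the following illustrative Python sketch. All names are made up, not ARTS code; it integrates an azimuthally symmetric phase function over zenith angle and compares the result against the scattering cross section.]

```python
# Illustrative sketch only (hypothetical names, not ARTS code): check
# that the angular integral of the phase function reproduces the
# scattering cross section within a loose tolerance, default a factor of 2.
import math

def norm_check(za_grid_deg, phase_fn, ext, abs_, tol_factor=2.0, active=True):
    """Trapezoidal integral of 2*pi * p(theta) * sin(theta) d(theta),
    compared against the scattering cross section ext - abs_."""
    if not active:  # option to deactivate, e.g. if the solver checks itself
        return
    sca_ref = ext - abs_
    integral = 0.0
    for k in range(len(za_grid_deg) - 1):
        t0 = math.radians(za_grid_deg[k])
        t1 = math.radians(za_grid_deg[k + 1])
        f0 = phase_fn[k] * math.sin(t0)
        f1 = phase_fn[k + 1] * math.sin(t1)
        integral += 0.5 * (f0 + f1) * (t1 - t0)
    integral *= 2.0 * math.pi
    ratio = integral / sca_ref
    if not (1.0 / tol_factor < ratio < tol_factor):
        raise ValueError(f"normalisation off by a factor of {ratio:.3g}")

# An isotropic phase function p = sca / (4*pi) passes the check:
grid = list(range(0, 181, 10))
sca = 2.0e-12
iso = [sca / (4.0 * math.pi)] * len(grid)
norm_check(grid, iso, ext=3.0e-12, abs_=1.0e-12)
```

The factor-of-2 tolerance deliberately ignores quadrature-dependent errors of a few percent; only grossly wrong data (e.g. NaN-contaminated or badly scaled) would trip it.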
> 
> Bye,
> 
> Patrick
> 
> 
> 
> On 2016-08-25 20:10, Jana Mendrok wrote:
> Hi,
> 
> i'm currently implementing an interface to the RT4 solver and am
> testing it. That was at least the original plan. Partly it has
> turned out to be more a matter of "falling into the traps" and
> "stumbling upon issues" with the other solvers (which i intended to
> use as reference)...
> 
> The current issue i stumbled upon is that there seems to be no
> (sufficiently) rigid test on the validity (or at least
> eligibility/adequacy/proper qualification) of the scattering
> data (scat_data).
> 
> i've created my scat_data from ARTS' TMatrix interface. It happened
> that one of the particles was too challenging for TMatrix and
> produced a couple of NaNs and also negative extinction and
> absorption coefficients (K11 and alpha1). While the NaNs could be
> avoided (equivalent to a TMatrix fail), that's hardly possible for the
> negative values (they are "regular" TMatrix output).
> 
> The ARTS scatt solvers reacted very differently to the presence of
> these invalid data:
> - RT4 gave a runtime error due to the scattering matrix
> normalization being too far off (i guess Disort would do the same;
> it wasn't tested here as I used oriented particles, which aren't
> handled by Disort).
> - DOIT ran through, providing results that did not look immediately
> suspicious :-O
> - MC ran into an assertion within an interpolation.
> 
> That's quite unsatisfactory, i think, and should be handled in
> some consistent manner. The question is how. Some ideas below.
> Do you have further ideas, suggestions or comments?
> 
> appreciate any input. wishes, Jana
> 
> 
> my thoughts/ideas:
> 
> - leave it to each solver to check for that (but then we need to
> go over all of them to add that)?
> 
> - make an additional check method and another check variable for
> the scat_data? There is already a scat_dataCheck WSM, which is
> rarely used. It e.g. checks that scat_data covers the required
> frequencies, but also the scattering matrix normalization, the latter
> only available for random orientation, though. In my experience
> it hasn't been too helpful (data coming from the atmlab Mie interface
> - as well as from ARTS' TMatrix interface, as i learned these days
> - frequently don't pass the normalisation check, which is, among
> other things, due to the type of quadrature used. In my
> experience, such norm issues are better handled by each solver
> separately), and since it's not mandatory, i avoid it. But it
> would be an option to modify this (make the norm check optional,
> and instead check for NaN and negative values) and make it mandatory
> (through a checked flag).
> 
> - an issue is, of course, that one does not really want to re-check
> frequently used data (e.g. the arts-xml-data contents, or data
> from the future ice/snow SSP database we are creating...) for
> those invalid entries every time. So maybe give the data structure
> itself a flag, and provide a WSM that does the checking and sets
> the flag? The above check method could then e.g. look for this flag
> and apply the whole check suite only on so-far-unchecked data.
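[Editorial aside: the flag-in-the-data idea could be sketched as follows, in hypothetical Python rather than ARTS code. The data carries a checked flag that is cleared whenever the data changes, and the check routine runs the full (possibly expensive) suite only when the flag is not set.]

```python
# Hypothetical sketch of the "flag inside the data" idea: the full
# check suite runs only on data that has not been validated yet.
import math

class ScatData:
    def __init__(self, values):
        self.values = values
        self.checked = False  # cleared whenever the values change

    def set_values(self, values):
        self.values = values
        self.checked = False

def ensure_checked(data):
    """Run the check suite only if the data is not already flagged."""
    if data.checked:
        return  # e.g. arts-xml-data validated once, then reused many times
    if any(math.isnan(v) or v < 0 for v in data.values):
        raise ValueError("invalid scattering data (NaN or negative entries)")
    data.checked = True

d = ScatData([1.0, 2.0, 0.5])
ensure_checked(d)  # runs the full check and sets the flag
ensure_checked(d)  # no-op: flag already set
```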
> 
> 
> 
> --
> =====================================================================
> Jana Mendrok, Ph.D. (Project Assistent)
> Chalmers University of Technology
> Earth and Space Sciences
> SE-412 96 Gothenburg, Sweden
> 
> Phone : +46 (0)31 772 1883
> =====================================================================
> 
> 
> 
> 
> _______________________________________________
> arts_dev.mi mailing list [email protected]
> https://mailman.rrz.uni-hamburg.de/mailman/listinfo/arts_dev.mi
> 
> 
> 
> 

