Re: [arts-dev] About error checking

2019-03-25 Thread Patrick Eriksson

Hi Richard,

I agree that this is not always critical for efficiency as long as
the check is a simple comparison. But some checks are much more
demanding. For example, the altitudes in z_field should be strictly
increasing. If you have a large 3D atmosphere, it will be very costly
to repeat this check for every single ppath calculation. And should
this also be checked in other places where z_field is used? For
example, if you use iyIndependentBeamApproximation you will repeat
the check, since the DISORT and RT4 methods should also check this,
as they can be called without providing a ppath.
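
To make the cost concrete, roughly what such a check involves (just a
sketch, not the actual ARTS code; the flat indexing and function name
are illustrative):

  #include <cstddef>
  #include <vector>

  // Checks that altitudes increase strictly along the pressure
  // dimension for every (lat, lon) column of a 3D field stored flat
  // with shape (np, nlat, nlon). Cost is O(np * nlat * nlon) per call.
  bool z_field_strictly_increasing(const std::vector<double>& z,
                                   std::size_t np, std::size_t nlat,
                                   std::size_t nlon) {
    for (std::size_t ilat = 0; ilat < nlat; ++ilat)
      for (std::size_t ilon = 0; ilon < nlon; ++ilon)
        for (std::size_t ip = 1; ip < np; ++ip) {
          const double below = z[((ip - 1) * nlat + ilat) * nlon + ilon];
          const double above = z[(ip * nlat + ilat) * nlon + ilon];
          if (above <= below) return false;  // column not increasing
        }
    return true;
  }

Every element is visited, so repeating this per ppath calculation
scales with the full grid size.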


Further, I don't follow what strategy you propose. The discussion
around planck indicated that you wanted the checks as far down as
possible. But your last email seems to indicate that you also want
checks higher up, e.g. before entering interpolation. I assume we
don't want checks on every level. So we need to be clear about at
what level the checks shall be placed. If not, everybody will be lazy
and hope that a check somewhere else catches the problem.


In any case, it should be easier to provide informative error messages
if problems are identified early on. That is, it is easier to pinpoint
the reason for the problem.


Bye,

Patrick



Re: [arts-dev] About error checking

2019-03-25 Thread Richard Larsson
Hi Patrick,

Just some quick points.

On Sun, 24 Mar 2019 at 10:29, Patrick Eriksson <
patrick.eriks...@chalmers.se> wrote:

> Hi Richard,
>
> A great initiative. How errors are thrown can for sure be improved.
> We are both lacking such checks (still too many cases where an
> assert shows up instead of a proper error message), and the errors
> are probably implemented inconsistently.
>
> When it comes to using try/catch, I leave the discussion to others.
>
>
> But I must bring up another aspect here: on what level to apply
> asserts and errors. My view is that we have decided that such basic
> functions as planck should only contain asserts, for efficiency
> reasons.
>

Two things.

First, Oliver tested the speeds here, on two functions in
physics_funcs.cc:

number_density (100 million calls, averaged over 5 runs):

with assert:    0.484s
with try/catch: 0.502s, 3.8% slower than assert
no checks:      0.437s, 9.8% faster than assert

dinvplanckdI (20 million calls, averaged over 5 runs):

with assert:    0.576s
with try/catch: 0.563s, 2.3% faster than assert
no checks:      0.561s, 2.7% faster than assert

There are no notable differences.  (We are not spending any of our
time in these functions really, so ±10% is nothing.)  One thing that
is nicer about asserts is that they are completely gone when NDEBUG
is set.  We might therefore want to wrap the deeper function calls in
something that removes these errors from the compiler's view.  We
have the DEBUG_ONLY environments for that, but a negative temperature
is not a debug thing.  I suggested to Oliver that we introduce a flag
that allows us to remove some or all parts of the error-checking code
at the behest of the user.  I do not know what to name said flag so
that the code stays readable.  "IGNORABLE()" in ARTS and
"-DIGNORE_ERRORS=1" in cmake, so that everything inside the
parentheses is not passed to the compiler?  This could be used to
generate your 'faster' code, but errors would just be completely
ignored; of course, users would have to be warned that any OS error
or memory error could still follow...
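
Roughly what I have in mind (an untested sketch; the number_density
body is simplified):

  #include <stdexcept>
  #include <string>

  // With -DIGNORE_ERRORS=1 the wrapped check vanishes from the
  // compiler's view, like an assert under NDEBUG.
  #ifdef IGNORE_ERRORS
    #define IGNORABLE(check)         // compiled out: errors ignored
  #else
    #define IGNORABLE(check) check   // normal builds keep the check
  #endif

  double number_density(double p, double t) {
    IGNORABLE(
      if (t <= 0)
        throw std::runtime_error(
            "Non-positive temperature in number_density: " +
            std::to_string(t));
    )
    return p / (1.380649e-23 * t);  // n = p / (k_B * T)
  }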

The second point I have is that I really do not see the point of the
asserts at all.  Had they allowed the compiler to make guesses, that
would be somewhat nice.  But in practice, they just barely indicate
what the issue is by comparing some numbers or indices before
terminating the program.  They don't offer any solutions, and they
should really never ever occur.  I would simply ban them from use in
ARTS, switch to throws, and allow the user to tell the compiler to
build a properly non-debuggable version of ARTS where all errors are
ignored, as above.
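
To be concrete about the switch (my sketch, using the textbook Planck
law, not the actual ARTS implementation):

  #include <cmath>
  #include <sstream>
  #include <stdexcept>

  // Instead of: assert(f > 0 && t > 0);
  // the throw says what is wrong and can be caught higher up.
  double planck(double f, double t) {
    if (f <= 0 || t <= 0) {
      std::ostringstream os;
      os << "planck: need f > 0 and t > 0, got f = " << f
         << " Hz, t = " << t << " K";
      throw std::runtime_error(os.str());
    }
    constexpr double h = 6.62607015e-34;  // Planck constant [J s]
    constexpr double k = 1.380649e-23;    // Boltzmann constant [J/K]
    constexpr double c = 2.99792458e8;    // speed of light [m/s]
    return 2 * h * f * f * f / (c * c) / std::expm1(h * f / (k * t));
  }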


>
> For a pure forward model run, a negative frequency or temperature would
> come from f_grid and t_field, respectively. We decided to introduce
> special check methods, such as atmfields_checkedCalc, to e.g. catch
> negative temperatures in input.
>

I think it would be better if we simply removed the *_checkedCalc
functions entirely (as a requirement for executing code; they are
still good for sanity checks).  I think they mess up the logic of
many things.  Agendas take these outputs even when they don't need
them, and the methods have to check the input manually anyway because
you cannot allow segfaults.  It is not the agendas that need these
checks.  It is the methods calling these agendas.  And they only need
the checks to ensure they have understood what they want to do.  And
even if the checked value is positive when you reach a function, you
cannot say in that method whether the check was for the data you have
now, for data from sometime long ago in the program flow, or just a
bit someone set before calling the code.  So all the inputs have to
be checked anyway.
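
Schematically (hypothetical names, not real ARTS methods):

  #include <stdexcept>
  #include <vector>

  struct Field { std::vector<double> data; };

  // A method cannot know whether a "checked" flag was computed for
  // the data it now holds, so it validates the inputs it actually
  // uses.
  void iy_method(const Field& t_field, int atmfields_checked) {
    (void)atmfields_checked;  // proves nothing about *this* t_field
    for (double t : t_field.data)
      if (t <= 0)
        throw std::runtime_error(
            "iy_method: non-positive temperature in t_field");
    // ... the actual radiative transfer work ...
  }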

For the atmfields_checkedCalc call in particular, it would make more
sense to make this part of the iy-methods than anything else.  Since
ppath is generated externally now (compared to when the iy-methods
last had their error handling updated), ppath also needs to be
checked against the fields themselves, or you can get segfaults in
the interpolations.
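
For instance, something like this before interpolating (again a
sketch with made-up names):

  #include <cstddef>
  #include <stdexcept>
  #include <vector>

  // An externally generated ppath must be checked against the field
  // grids, or the interpolation can index out of bounds
  // (interpolation touches both index i and i + 1).
  void check_ppath_grid_positions(const std::vector<int>& gp,
                                  std::size_t np) {
    for (int i : gp)
      if (i < 0 || static_cast<std::size_t>(i) + 1 >= np)
        throw std::out_of_range(
            "ppath grid position outside the atmospheric field grid");
  }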


>
> When doing OEM, negative temperatures can pop up after each
> iteration, and this should be checked. But not by planck; this
> should happen at a higher level.
>
> A simple solution here is to include a call of atmfields_checkedCalc
> etc. in inversion_iterate_agenda. The drawback is that some data
> will be checked over and over again despite not being changed.
>
> So it could be discussed whether checks should be added to the OEM
> part: data changed in an iteration should be checked for unphysical
> values.
>
>
> That is, I think there are more things to discuss than you bring up
> in your email. So don't start anything big before we have reached a
> common view here.
>

Right, I won't.

With hope,
//Richard