Hi, I'm currently implementing an interface to the RT4 solver and am testing it. That was at least the original plan. Partly, however, it turns out to be more of "fall into the traps" and "stumble upon issues" with the other solvers (which I intended to use as references)...
The current issue I stumbled upon is that there seems to be no (sufficiently) rigid test of the validity (or at least eligibility/adequacy/proper qualification) of the scattering data (scat_data). I created my scat_data with ARTS' TMatrix interface. It happened that one of the particles was too challenging for TMatrix and produced a couple of NaNs as well as negative extinction and absorption coefficients (K11 and alpha1). While the NaNs could be avoided (equivalent to a TMatrix failure), that is hardly possible for the negative values: they are "regular" TMatrix output.

The ARTS scattering solvers reacted very differently to the presence of these invalid data:
- RT4 gave a runtime error due to the scattering matrix normalization being too far off (I guess Disort would do the same; it wasn't tested here since I used oriented particles, which Disort doesn't handle).
- DOIT ran through, providing results that did not look immediately suspicious :-O
- MC ran into an assertion within an interpolation.

That's quite unsatisfactory, I think, and should be handled in some consistent manner. The question is how. Some ideas are below. Do you have further ideas, suggestions, or comments? I appreciate any input.

Wishes,
Jana

My thoughts/ideas:
- Leave it to each solver to check for that (but then we need to go over all of them to add such checks)?
- Make an additional check method and another check variable for scat_data? There is already a scat_dataCheck WSM, which is rarely used. It checks, e.g., that scat_data covers the required frequencies, but also the scattering matrix normalization; the latter is only available for random orientation, though. In my experience it hasn't been too helpful (data from the atmlab Mie interface, as well as from ARTS' TMatrix interface as I learned these days, frequently fail the normalization check, which is, among other things, due to the type of quadrature used; in my experience, such normalization issues are better handled by each solver separately), and since it's not mandatory, I avoid it.
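To illustrate the kind of check I have in mind, here is a minimal sketch in Python/numpy. The names (check_scat_data, ScatData, the checked flag) are hypothetical illustrations of the idea, not the actual ARTS WSM/WSV interface:

```python
import numpy as np

def check_scat_data(ext_mat_k11, abs_vec_alpha1):
    """Scan single-scattering data for NaNs and negative extinction (K11)
    or absorption (alpha1) coefficients. Returns a list of problem
    descriptions; an empty list means the data passed."""
    problems = []
    for name, arr in (("K11", ext_mat_k11), ("alpha1", abs_vec_alpha1)):
        arr = np.asarray(arr, dtype=float)
        if np.isnan(arr).any():
            problems.append(f"{name} contains NaN")
        if (arr < 0).any():
            problems.append(f"{name} contains negative values")
    return problems

class ScatData:
    """Toy container sketching the 'checked flag on the data structure'
    idea: the full check suite runs only once per dataset."""
    def __init__(self, ext_mat_k11, abs_vec_alpha1):
        self.ext_mat_k11 = ext_mat_k11
        self.abs_vec_alpha1 = abs_vec_alpha1
        self.checked = False  # hypothetical flag set after a successful check

    def ensure_valid(self):
        if self.checked:
            return  # skip re-checking frequently used data
        problems = check_scat_data(self.ext_mat_k11, self.abs_vec_alpha1)
        if problems:
            raise ValueError("invalid scat_data: " + "; ".join(problems))
        self.checked = True
```

A solver (or a mandatory checked-flag WSM) could then call ensure_valid() once up front and fail with a clear message, instead of each solver reacting differently deep inside its own numerics.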
  It would be an option, though, to modify scat_dataCheck (make the normalization check optional and instead check for NaNs and negative values) and make it mandatory (through a checked flag).
- An issue is, of course, that one does not really want to re-check frequently used data (e.g., the arts-xml-data contents, or data from the future ice/snow SSP database we are creating) for such invalid entries every time. So maybe give the data structure itself a flag and provide a WSM that does the checking and sets the flag? The checked method above could, e.g., look for this flag and apply the full check suite only to data that has not been checked so far.

--
=====================================================================
Jana Mendrok, Ph.D. (Project Assistant)
Chalmers University of Technology
Earth and Space Sciences
SE-412 96 Gothenburg, Sweden

Phone : +46 (0)31 772 1883
=====================================================================
_______________________________________________
arts_dev.mi mailing list
[email protected]
https://mailman.rrz.uni-hamburg.de/mailman/listinfo/arts_dev.mi
