Victoria,

No idea about this one. Does this happen for a case that worked with robust=0? Or does robust=1 make some other problem appear?

Freddy: You had some problems with RT4? Do you remember if it was this one?

Bye,

Patrick



On 2019-02-11 20:55, Victoria Sol Galligani wrote:
Thanks Patrick!!! When I use robust=1 I get

The number of pages in *surface_reflectivity* should
match length of *f_grid* or be 1.
  length of *f_grid* : 1
  dimension of *surface_reflectivity* : 4

What is the robust condition that can trigger this error?
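
For context, the check above concerns the page dimension of the *surface_reflectivity* Tensor3: it must either be 1 or equal to the length of *f_grid*. A minimal controlfile sketch of a consistent setup (the frequency value and the file name are only illustrative):

  # f_grid with a single frequency, matching "length of f_grid : 1" above
  VectorSet( f_grid, [ 166e9 ] )
  # surface_reflectivity is a Tensor3 with dimensions (f_grid, stokes_dim, stokes_dim);
  # for the f_grid above it must have exactly 1 page (in general, 1 page or one
  # page per frequency). "my_surface_reflectivity.xml" is a hypothetical file name.
  ReadXML( surface_reflectivity, "my_surface_reflectivity.xml" )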

Thanks Patrick!

On Sat, Feb 9, 2019 at 5:22 AM Patrick Eriksson <patrick.eriks...@chalmers.se> wrote:

    Victoria,

    The warning about normalization is common, so that one is known. But not
    the error. The error should in principle be tracked down, but I have no
    time for that. In addition, there should hopefully be a new version of
    the RT4 interface at some point where the normalization will not be an
    issue (the people in Hamburg are working on this, but it will take time
    before the work is done).

    So let's see if there is a workaround. Are you using the robust option?
    If not, try it.

    It has been some time since I ran RT4, but I think I had nstreams=8,
    robust=1 and auto_inc_nstreams=16. Try something like that. But note
    that the normalization warning is then swept under the carpet and the
    accuracy of some calculations could be a bit poor.
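
    As a rough sketch, such a call could look like this in a controlfile,
    assuming the solver is invoked via the RT4Calc workspace method and
    taking the keyword names from the messages in this thread (exact names
    and defaults may differ between ARTS versions):

    RT4Calc( nstreams = 8, robust = 1, auto_inc_nstreams = 16 )
    # For randomly oriented particles, pfct_aa_grid_size can also be
    # increased, as the normalization warning itself suggests.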

    Bye,

    Patrick





    On 2019-02-08 20:31, Victoria Sol Galligani wrote:
     > Hi! I'm having some trouble running RT4 with 2 Stokes dimensions and
     > azimuthally random frozen particles (with dielectric properties
     > calculated from the Maxwell Garnett formula) for frequencies up to
     > 170 GHz.
     >
     > For small particles there are no problems, but when I increase the
     > particle size (still fairly small particles, diameter = 300 um), I
     > start getting errors about the scattering matrix normalization:
     >
     > Bulk scattering matrix normalization deviates significantly
     > from expected value (13.4591%, resulting in albedo deviation of
     > 0.105869).
     > Something seems wrong with your scattering data  (did you run
     > *scat_dataCheck*?)
     > or your RT4 setup (try increasing *nstreams* and in case of randomly
     > oriented particles possibly also pfct_aa_grid_size).
     >
     > Could not increase nstreams sufficiently (current: 18)
     > to satisfy scattering matrix norm at f[3]=166 GHz.
     > Try higher maximum number of allowed streams (ie. higher
     > auto_inc_nstreams than 19).
     >
     > Trying with a higher number of nstreams, or trying to use
     > auto_inc_nstreams, leads me to the following error:
     >
     > Assertion failed: (p < mpr.mextent), function operator(), file
     > /Users/victoria.galligani/Work/Software/ARTS/arts_DEC2018/src/matpackV.h,
     > line 374.
     > Playing with the angular grids yields the same errors. Does anyone
     > have any comments or suggestions? Has anyone encountered this before?
     > I have also been trying different versions of ARTS, mainly
     > arts-2-3-1171, but also arts-2-3-849.
     >
     > Thank you in advance!
     >
     > Victoria
     >
     >

_______________________________________________
arts_users.mi mailing list
arts_users.mi@lists.uni-hamburg.de
https://mailman.rrz.uni-hamburg.de/mailman/listinfo/arts_users.mi
