Dear All,

I have a persistent problem that some of you might have encountered, and I 
am hoping someone here can help me.

I am developing a relatively simple cellular automaton. The model runs fine 
except when I ask patches to calculate the mean (or even the sum) of more 
than 8 neighbors. The problem is that the decimal digits, usually from the 
10th decimal place onwards, keep changing, EVEN after the simulation has 
stopped running! In fact, when I calculate the mean just after 
initialization (after SETUP but before GO, at tick = 0), I get 
constantly changing values for the calculation output. This is quite weird 
to me, and I have reproduced it on a few other PCs. I have not yet tried 
Mac or Linux, but I do not expect the problem to be platform-specific.
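
To illustrate, here is a minimal sketch of the kind of calculation I mean 
(the patch variable state and the random initialization are just 
placeholders for my actual model):

  patches-own [ state ]

  to setup
    clear-all
    ask patches [ set state random-float 1 ]
    reset-ticks
    ;; tick = 0, before GO: a mean over more than 8 neighbors
    ask patches [
      show mean [ state ] of other patches in-radius 2
    ]
  end

Re-checking the same patch's mean in the Command Center after setup gives 
slightly different trailing digits each time.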

Naturally, I used the native in-radius reporter to increase the 
neighborhood size (in a radial fashion), and also workarounds such as 
patch-at to build a larger Von Neumann neighborhood, but the same problem 
occurs. The problem is avoided when the native neighbors and neighbors4 
reporters are used.
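
For completeness, this is roughly how I built the larger Von Neumann 
neighborhood (radius 2, excluding the patch itself) with patch-at:

  to-report von-neumann-2
    report (patch-set
      patch-at  0  1   patch-at  0  2
      patch-at  0 -1   patch-at  0 -2
      patch-at  1  0   patch-at  2  0
      patch-at -1  0   patch-at -2  0
      patch-at  1  1   patch-at  1 -1
      patch-at -1  1   patch-at -1 -1)
  end

  ;; changing trailing digits:
  ;; show mean [ state ] of von-neumann-2
  ;; stable output:
  ;; show mean [ state ] of neighbors4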

I know I could use precision to constrain the values, but when I checked, 
the underlying values are still changing behind the scenes.
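
For example, something like this rounds what is displayed, but the 
unrounded value underneath still drifts between checks:

  ;; round the reported mean to 6 decimal places
  show precision (mean [ state ] of other patches in-radius 2) 6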


I would deeply appreciate any insight on this.

Best Regards,

Phang
