I sense an opportunity for some non-trivial mathematics to be applied to optimally setting these limits.
The obviously, horribly wrong approach would be to set a ceiling for all script memory use in a region and apportion it among parcel and avatar allotments such that no over-allocation could ever occur. Almost all the time, this would impose limits far lower than sub-ceiling operation actually requires. Instead, the total script memory that the limits permit could be two or five or ten times that ceiling and still only hit the ceiling once every millennium or century or decade, depending on the distribution of transient demand for the capacity being limited. So a Poisson or Erlang or some such distribution is relevant here.

What's interesting is that there are (at least) two identifiable distributions: scripts in avatar attachments, and scripts in parcel-resident objects. The former is much, much more transient, of course. It all feels a bit like engineering fibre capacity to optimally handle predicted demand for different telecom applications.

Ignoring the possibility that new scripting functions may systematically change these demand distributions, this seems an interesting problem for somebody with the right background (not me!). Even if solving the optimization problem is judged overkill, I wanted to at least prevent that "obviously, horribly wrong approach."
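
To make the tradeoff concrete, here is a rough Monte Carlo sketch of the kind of calculation involved. Every number in it (region ceiling, avatar and parcel counts, per-slot demand distributions, per-slot limits) is invented purely for illustration and is not an actual Second Life figure; the real exercise would be fitting the two demand distributions from measured attachment and parcel usage.

    import random

    # All figures below are invented for illustration; they are not actual
    # Second Life limits or measurements.
    REGION_CEILING_MB = 300.0   # hard ceiling on script memory for the region
    NUM_AVATARS       = 40      # avatars with scripted attachments (bursty demand)
    NUM_PARCEL_SLOTS  = 60      # parcel-resident scripted objects (steadier demand)

    def sample_total_demand():
        """One random snapshot of aggregate script memory demand, in MB."""
        # Attachment scripts: short-lived, bursty -> exponential, mean 2 MB each.
        avatars = sum(random.expovariate(1 / 2.0) for _ in range(NUM_AVATARS))
        # Parcel objects: steadier -> normal, mean 3 MB, sd 1 MB, clamped at 0.
        parcels = sum(max(0.0, random.gauss(3.0, 1.0)) for _ in range(NUM_PARCEL_SLOTS))
        return avatars + parcels

    def overflow_probability(trials=100_000):
        """Estimate how often aggregate demand would exceed the hard ceiling."""
        hits = sum(sample_total_demand() > REGION_CEILING_MB for _ in range(trials))
        return hits / trials

    if __name__ == "__main__":
        # Per-slot limits set well above typical use: 6 MB per avatar, 6 MB per parcel slot.
        sum_of_limits = NUM_AVATARS * 6.0 + NUM_PARCEL_SLOTS * 6.0
        print(f"sum of per-slot limits: {sum_of_limits:.0f} MB "
              f"({sum_of_limits / REGION_CEILING_MB:.1f}x the ceiling)")
        print(f"estimated P(aggregate demand > ceiling): {overflow_probability():.4%}")

With these made-up distributions the per-slot limits sum to twice the ceiling, yet the estimated probability of aggregate demand actually exceeding the ceiling comes out to a fraction of a percent. Picking the limits for a target overflow rate, given the two fitted distributions, is the optimization problem being described.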