just FYI, but I tried to champion DateTime during a massive DT rewrite at
work and lost.  the biggest gripe was that the objects were "insanely
large": large in terms of memory per object and (more importantly to
them, apparently) large enough that frequent Data::Dumper dumps were
difficult to parse during debugging.
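
for the dump-parsing half of the gripe, Data::Dumper's own Maxdepth knob at
least keeps the output readable.  the "fat" object below is just a mock
standing in for a DT object, not the real internals:

```perl
use strict;
use warnings;
use Data::Dumper;

# mock "object": a couple of scalar fields plus a big embedded cache,
# standing in for the zone data that made the real dumps so long
my $obj = {
    epoch => 1_000_000_000,
    zone  => {
        name  => 'America/New_York',
        spans => [ map { [ $_, $_ + 1 ] } 1 .. 50 ],
    },
};

# full dump: pages of span data
my $full = Dumper($obj);

# capped dump: anything deeper than two levels collapses to 'ARRAY(0x...)'
local $Data::Dumper::Maxdepth = 2;
my $capped = Dumper($obj);

print length($full), " chars uncapped vs ", length($capped), " capped\n";
```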

after tracking down the "insanely large" issue, it turned out that
initially the objects weren't what _I_ might consider large (maybe half a
screen with Dumper) but that any subsequent manipulation sent things
spiraling out of control.  IIRC it was all the zone data caching embedded
within the object.
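
to illustrate the pattern with a toy mock (NOT DateTime's actual code): the
object starts lean, the first zone-aware operation stashes lookup data
inside it, and every dump after that balloons:

```perl
use strict;
use warnings;
use Data::Dumper;

# toy stand-in for the behavior we saw, not DateTime's real code
my $dt = { epoch => 1_000_000_000 };

# freshly created: the dump is a few lines
my $before = length Dumper($dt);

# pretend this is the first zone-aware manipulation: it caches a pile
# of lookup data inside the object itself
$dt->{zone_cache}{$_} = [ ($_) x 10 ] for 1 .. 100;

# every dump from here on carries all of that cached data
my $after = length Dumper($dt);

printf "dump grew from %d to %d chars\n", $before, $after;
```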

now, I won't argue one way or the other about which is better (larger
objects versus more lookups) especially since I have no benchmarks to back
anything up :)  but if you're interested in what prevented "us" from using
DT across the board, that was probably at the top of the list.

> The first question to answer is what are people doing with these
> objects? 

in our case, creating many, many, many of them.  more than you can possibly
imagine.  think one object with a few dozen "time" representations held
within it, then a few thousand of those floating around at any given second.
all within a mod_perl process where memory, not lookups, was the main
(perceived) concern.

again, not arguing over which is better, but having _all_ the zone data
cached had the appearance of being wasteful for us - for the most part we
would be converting between _maybe_ a dozen time zones tops, more likely
around the 6 or so surrounding the US and UK.  so being able to cache only
the data we knew we needed would have gone a long way toward making people
_feel_ like DT was lean.
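
the sort of thing we had in mind, sketched here with a hypothetical
process-wide cache (not an actual DateTime API): load a zone's data once
and let every object hold a reference to the shared entry, instead of each
object carrying its own embedded copy:

```perl
use strict;
use warnings;

# hypothetical process-wide cache, keyed by zone name
my %ZONE_CACHE;

sub zone_data {
    my ($name) = @_;
    # load (here: fake) the zone's data only the first time it's asked for
    $ZONE_CACHE{$name} ||= { name => $name, spans => [ 1 .. 50 ] };
    return $ZONE_CACHE{$name};
}

# a thousand "time" objects, each holding only a reference to the one
# shared zone entry rather than its own private copy
my @objects =
    map { +{ epoch => $_, zone => zone_data('Europe/London') } } 1 .. 1000;

print scalar(@objects), " objects, ",
      scalar(keys %ZONE_CACHE), " zone cached\n";
```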

just FYI, really - nothing to get in any kind of flame war about :)

--Geoff
