In talking to a Google engineer a while back, he mentioned offhand
that, although he had done his particular Google project in Python,
they were strongly considering moving it to Java due to scalability
issues. This isn't the first time I have heard the implication that
for truly massive applications, Java is really the standard.

I have been reading a bunch lately, everything from sharding and
Hibernate to YouTube scalability (they have a Python app under the
hood), and I can't seem to find a simple explanation of what makes
Java better at scaling. I think a lot of Java-based sites tend to be
too verbose and acronym-happy, and I am pretty certain there is a
simple way to explain it. There are plenty of articles (mostly by the
RoR folks) on why Java isn't better at scaling, but none that I have
found so far deal with systems on the Google scale.

I build applications in Django, and, combined with caching and load
balancing, I can see it handling quite a bit. My question is, does
anyone know of (or can write) a good article or explanation of why so
many people are so adamant about Java's ability to scale?

Thanks!
-Nikolaj

-- 
[email protected]
http://www.kernel-panic.org/cgi-bin/mailman/listinfo/kplug-lpsg
