Frank Warmerdam wrote:
Pascal,
You haven't provided enough information to respond to the question.
..
For a complex map with a lot of vector information coming out of
PostGIS on modest hardware it isn't bad.  But the missing variable
is the vector density in these map views.

Definitely.

It'd also be helpful, Pascal, to know whether and how you've indexed the PostGIS data, whether you've run ANALYZE, VACUUM, or CLUSTER on it, and how much memory you've given PostgreSQL. The PostGIS end of your setup is an entire subsystem in itself, worthy of examination and optimization.
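For what it's worth, a typical first pass looks something like the sketch below. The table and column names ("roads", "the_geom") are placeholders, so substitute your own:

  -- Spatial index on the geometry column (names here are hypothetical)
  CREATE INDEX roads_the_geom_idx ON roads USING GIST (the_geom);

  -- Reclaim dead space and refresh the planner's statistics
  VACUUM ANALYZE roads;

  -- Physically reorder the table to match the spatial index, so features
  -- that are near each other on the map are also near each other on disk;
  -- re-analyze afterwards, since CLUSTER rewrites the table.
  CLUSTER roads USING roads_the_geom_idx;
  ANALYZE roads;

  -- In postgresql.conf, shared_buffers and work_mem are the usual knobs
  -- for "how much memory you've thrown at it".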

Then there's the setup of your classes: what sort of expressions you're using to classify, whether you have labels enabled, and so on. The number of layers isn't necessarily as important as the total number of classes, combined with how many classes must be examined for each vector feature before a match is found and MapServer moves on to the next feature. (e.g. put the class that matches the most records at the top; a rough sketch follows)
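To make the ordering point concrete, here's a mapfile fragment. The layer name, connection string, and attribute values are made up, so treat it as a sketch rather than a drop-in config:

  LAYER
    NAME "roads"
    TYPE LINE
    CONNECTIONTYPE POSTGIS
    CONNECTION "host=localhost dbname=gis user=mapserver"  # placeholder
    DATA "the_geom from roads"
    CLASSITEM "type"   # plain string matches are cheaper than logical EXPRESSIONs
    CLASS              # most common value first: most features stop at this test
      EXPRESSION "residential"
      STYLE COLOR 200 200 200 END
    END
    CLASS              # rarer values later
      EXPRESSION "motorway"
      STYLE COLOR 255 128 0 END
    END
  END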


Personally, I think that benchmark-style stats such as "3.9 requests per second" are rarely meaningful on their own, since you haven't said how you got the figure. If it takes 0.25 seconds to render a map, and your end users see maps reloading in half a second, that's pretty nice. It doesn't *necessarily* mean you can only serve 4 requests in one second, either, unless your tests really were doing parallel requests and averaging the times; concurrent requests overlap, so per-request latency doesn't directly cap throughput. And if the stat is accurate, I'd say that 4 requests per second, or roughly 345,000 requests per day, is pretty nice for what is now considered an entry-level PC!
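If you want a throughput number that actually accounts for concurrency, a tool like ApacheBench will fire parallel requests and report requests/second for you. Something like the line below, where the URL and map path are placeholders for your own mapserv request:

  # 200 total requests, 4 at a time; ab reports mean latency and requests/sec
  ab -n 200 -c 4 "http://localhost/cgi-bin/mapserv?map=/maps/demo.map&mode=map"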

--
Gregor Mosheh / Greg Allensworth
System Administrator, HostGIS cartographic development & hosting services
http://www.HostGIS.com/

"Remember that no one cares if you can back up,
 only if you can restore." - AMANDA
