Yes Erick,
I did create an artificial load test with 30 users concurrently running
searches (around 28,000 samples of actual queries). With 1.4.1, the test
completes within 3 hrs without any failures; SOLR1.2.1 was nowhere near this
performance, i.e., in 3 hrs it could only complete 9,700 samples.
Or another way of saying this is - what is the maximum throughput you
get from the system (qps / indexing speed, etc) since that is what you
really (should) care about - and how does it compare to the previous setup?
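The throughput comparison Mike is asking for can be derived directly from the numbers reported above (28,000 samples in 3 hrs on 1.4.1 vs. 9,700 on 1.2.1). A minimal sketch of that arithmetic; the class and method names are just for illustration:

```java
// Rough throughput (queries per second) from the numbers in this thread:
// Solr 1.4.1: 28000 queries in 3 hours; Solr 1.2.1: 9700 queries in 3 hours.
public class QpsCompare {
    static double qps(long queries, double hours) {
        return queries / (hours * 3600.0);
    }

    public static void main(String[] args) {
        double qpsNew = qps(28000, 3.0); // ~2.59 qps on 1.4.1
        double qpsOld = qps(9700, 3.0);  // ~0.90 qps on 1.2.1
        System.out.printf("1.4.1: %.2f qps, 1.2.1: %.2f qps, speedup %.1fx%n",
                qpsNew, qpsOld, qpsNew / qpsOld);
    }
}
```

By this measure the upgrade is roughly a 2.9x throughput improvement at the same wall-clock budget.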
-Mike
On 6/15/2011 3:52 PM, Erick Erickson wrote:
Yes, 100% CPU utilization will affect other processes, but
you've created an artificial situation with your load testing,
so I don't think it counts...
What kind of CPU utilization do you see when you simulate your
actual load rather than querying as fast as you can? That's a
more relevant number.
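Erick's point about simulating actual load rather than querying flat-out can be sketched as a paced load generator: submit queries at a fixed target rate instead of saturating the CPU. This is a minimal illustration, not from the thread; `runQuery()` is a hypothetical stand-in for a real Solr HTTP request:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

// Sketch: replay queries at a fixed target qps instead of as fast as possible,
// so the measured CPU reflects realistic load rather than a saturation test.
public class PacedLoadTest {
    // Fixed spacing between query submissions for a given target rate.
    static long interArrivalMillis(int targetQps) {
        return 1000L / targetQps;
    }

    static void runQuery() { /* hypothetical placeholder for a search request */ }

    public static void main(String[] args) throws Exception {
        int targetQps = 5;
        ExecutorService pool = Executors.newFixedThreadPool(4);
        AtomicInteger done = new AtomicInteger();
        for (int i = 0; i < 20; i++) {
            pool.submit(() -> { runQuery(); done.incrementAndGet(); });
            Thread.sleep(interArrivalMillis(targetQps)); // pace submissions
        }
        pool.shutdown();
        pool.awaitTermination(10, TimeUnit.SECONDS);
        System.out.println("completed " + done.get() + " queries");
    }
}
```

With pacing, CPU utilization under the test approximates what production traffic would produce, which is the number worth watching.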
Hi Yonik,
Thanks for the prompt reply. This is a relief :)
Just one more question: wouldn't the 100% CPU load affect the system, since
system processes would starve for CPU?
I tried the load test first with 4 cores and then with 8 cores, and the CPU
usage still reached 100%.
We have index of ab
On Wed, Jun 15, 2011 at 2:21 PM, pravesh wrote:
> I would need some help in minimizing the CPU load on the new system. Could
> NIOFSDirectory possibly be contributing to the high CPU?
Yes, it's a feature! The CPU is only higher because the threads
aren't blocked on IO as much.
So the increase in CPU you
Hi,
I'm planning to upgrade my system from SOLR1.2.1 to SOLR1.4.1.
We had done some Lucene-level optimizations on the SOLR slaves in the
earlier system (1.2.1), like:
1. removed the synchronized block from the SegmentReader class's
isDeleted() method
2. removed the synchronized block fro