Hi,
I'm running solr in tomcat. I am trying to upgrade to solr 4.4 but I
can't get it to work. Could someone point me at what I'm doing wrong?
tomcat context:
<Context docBase="/opt/solr4.4/dist/solr-4.4.0.war" debug="0"
crossContext="true">
<Environment name="solr/home" type="java.lang.String"
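For reference, a complete Tomcat context fragment for this kind of setup might look like the sketch below. The original message is truncated before the Environment element closes, so the value path and the override attribute here are assumptions, not taken from the poster's config:

```xml
<!-- sketch of a Tomcat context for a Solr war; value path is an assumption -->
<Context docBase="/opt/solr4.4/dist/solr-4.4.0.war" debug="0" crossContext="true">
  <Environment name="solr/home" type="java.lang.String"
               value="/opt/solr4.4/home" override="true"/>
</Context>
```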
Hi,
I have set up SolrCloud with tomcat. I use solr 4.1.
I have zookeeper running on 192.168.1.10.
A tomcat running solr_myidx on 192.168.1.10 on port 8080.
A tomcat running solr_myidx on 192.168.1.11 on port 8080.
My solr.xml is like this:
<?xml version="1.0" encoding="UTF-8" ?>
<solr persistent="true">
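The message is cut off after the opening solr element. A minimal solr.xml for a SolrCloud setup of this era (the legacy pre-4.4 format used by Solr 4.1) might look like the sketch below; the core name, ZooKeeper port 2181, and the hostContext matching the webapp name are assumptions based on the hosts mentioned above:

```xml
<?xml version="1.0" encoding="UTF-8" ?>
<!-- sketch only: core name, zk port, and hostContext are assumptions -->
<solr persistent="true">
  <cores adminPath="/admin/cores" defaultCoreName="collection1"
         host="${host:}" hostPort="8080" hostContext="solr_myidx"
         zkHost="192.168.1.10:2181">
    <core name="collection1" instanceDir="collection1"/>
  </cores>
</solr>
```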
Hi,
In the solr admin web interface, when looking at the statistics of a
collection (this page: http://{ip}:8080/{index}/#/collection1), there is
"Current" under "Optimized".
What does it mean?
Thanks.
Did you find anything? I have the same problem but it's on update
requests only.
The error comes from the solrj client indeed. It is solrj logging this
error. There is nothing in solr itself and it does the update correctly.
It's fairly small simple documents being updated.
On 04/15/2013
the issue for me (maybe I was hitting a
GET limit somewhere?).
-Luis
On Tue, Apr 16, 2013 at 7:38 AM, Marc des Garets m...@ttux.net wrote:
Did you find anything? I have the same problem but it's on update requests
only.
The error comes from the solrj client indeed. It is solrj logging
config.
On 04/10/2013 07:38 PM, Shawn Heisey wrote:
On 4/10/2013 9:48 AM, Marc Des Garets wrote:
The JVM behavior is now radically different and doesn't seem to make
sense. I was using ConcMarkSweepGC. I am now trying the G1 collector.
The perm gen went from 410Mb to 600Mb.
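For context, switching from CMS to G1 is a JVM flag change along these lines; the heap sizes below are placeholders, not the values from this setup:

```
# sketch only: heap sizes are placeholders
JAVA_OPTS="$JAVA_OPTS -Xms4g -Xmx4g -XX:+UseG1GC"
# previously: -XX:+UseConcMarkSweepGC
```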
The eden space usage
index size and what is your performance measure as query
per second?
2013/4/11 Marc Des Garets marc.desgar...@192.com
Big heap because of a very large number of requests across more than 60 indexes
and hundreds of millions of documents (all indexes together). My problem
is with solr 4.1. All is perfect
making
sure that the update log is not enabled (or make sure you do hard commits
relatively frequently rather than only soft commits.)
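Jack's suggestion about frequent hard commits can be expressed in solrconfig.xml with an autoCommit block; the 60-second interval below is a placeholder, not a recommendation from the thread:

```xml
<!-- sketch: interval value is a placeholder -->
<updateHandler class="solr.DirectUpdateHandler2">
  <autoCommit>
    <maxTime>60000</maxTime>           <!-- hard commit at most every 60s -->
    <openSearcher>false</openSearcher> <!-- flush and roll the update log without reopening searchers -->
  </autoCommit>
</updateHandler>
```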
-- Jack Krupansky
-Original Message-
From: Marc Des Garets
Sent: Thursday, April 11, 2013 3:07 AM
To: solr-user@lucene.apache.org
Subject: Re
Hi,
I run multiple solr indexes in a single tomcat (1 webapp per index). All
the indexes are solr 3.5 and I have upgraded a few of them to solr 4.1
(about half of them).
The JVM behavior is now radically different and doesn't seem to make
sense. I was using ConcMarkSweepGC. I am now trying the G1
Hi,
I have a simple field defined like this:
<fieldtype name="text" class="solr.TextField">
<analyzer class="org.apache.lucene.analysis.standard.StandardAnalyzer"/>
</fieldtype>
Which I use here:
<field name="middlename" type="text" indexed="true" stored="true"
required="false" />
In solr 1.4, I
</fieldtype>
Steve
-Original Message-
From: Marc Des Garets [mailto:marc.desgar...@192.com]
Sent: Friday, September 09, 2011 6:21 AM
To: solr-user@lucene.apache.org
Subject: question about StandardAnalyzer, differences between solr 1.4
and solr 3.3
Hi,
I have a simple field defined
Hi,
I am doing a really simple query on my index (it's running in tomcat):
http://host:8080/solr_er_07_09/select/?q=hash_id:123456
I am getting the following exception:
HTTP Status 500 - null java.lang.IllegalArgumentException at
java.nio.Buffer.limit(Buffer.java:249) at
Hello,
On the solr wiki, here:
http://wiki.apache.org/solr/SolrPerformanceFactors
It is written:
mergeFactor Tradeoffs
High value merge factor (e.g., 25):
Pro: Generally improves indexing speed
Con: Less frequent merges, resulting in a collection with more index
files which may slow
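The mergeFactor tradeoff the wiki describes is set in solrconfig.xml. In Solr of this era the sketch below is roughly how it looks; 10 is the historical default and 25 the "high" value the wiki discusses:

```xml
<!-- sketch: higher values (e.g. 25) favor indexing speed,
     lower values favor search speed via fewer segments -->
<indexDefaults>
  <mergeFactor>10</mergeFactor>
</indexDefaults>
```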
Perfect. Thank you for your help.
-Original Message-
From: Shalin Shekhar Mangar [mailto:shalinman...@gmail.com]
Sent: 08 March 2010 12:57
To: solr-user@lucene.apache.org
Subject: Re: question about mergeFactor
On Mon, Mar 8, 2010 at 5:31 PM, Marc Des Garets
marc.desgar...@192.com wrote:
Just curious, have you checked if the hanging you are experiencing is not
garbage collection related?
-Original Message-
From: Erick Erickson [mailto:erickerick...@gmail.com]
Sent: 13 January 2010 13:33
To: solr-user@lucene.apache.org
Subject: Re: Problem comitting on 40GB index
That's
...@gmail.com]
Sent: 12 January 2010 07:49
To: solr-user@lucene.apache.org
Subject: Re: update solr index
On Mon, Jan 11, 2010 at 7:42 PM, Marc Des Garets
marc.desgar...@192.com wrote:
I am running solr in tomcat and I have about 35 indexes (between 2 and
80 million documents each). Currently if I try
Hi,
I am running solr in tomcat and I have about 35 indexes (between 2 and
80 million documents each). Currently if I try to update a few documents
in an index (let's say the one which contains 80 million documents)
while tomcat is running and therefore receiving requests, I am getting
a few very
Hi,
I am experiencing a problem with an index of about 80 million documents
(41Gb). I am trying to update documents in this index using Solrj.
When I do:
solrServer.add(docs); // docs is a List<SolrInputDocument> that contains
// 1000 SolrInputDocument (takes 36sec)
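One common response to slow bulk adds is to send smaller batches rather than one 1000-document call. The helper below sketches the batching logic in plain Java; `BatchAdd` and `partition` are illustrative names (not SolrJ API), and the integer list stands in for the `SolrInputDocument` list so the example is self-contained:

```java
import java.util.ArrayList;
import java.util.List;

public class BatchAdd {
    // Split a large document list into batches of at most batchSize elements.
    // Each returned sublist could then be passed to solrServer.add(...) in turn.
    public static <T> List<List<T>> partition(List<T> docs, int batchSize) {
        List<List<T>> batches = new ArrayList<>();
        for (int i = 0; i < docs.size(); i += batchSize) {
            batches.add(new ArrayList<>(
                docs.subList(i, Math.min(i + batchSize, docs.size()))));
        }
        return batches;
    }

    public static void main(String[] args) {
        List<Integer> docs = new ArrayList<>();
        for (int i = 0; i < 1000; i++) docs.add(i);
        System.out.println(partition(docs, 100).size()); // prints 10
    }
}
```

Whether smaller batches actually help depends on commit settings and index size; the point is only that add time can then be measured per batch instead of in one opaque 36-second call.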
Subject: Re: very slow add/commit time
How many MB have you set of cache on your solrconfig.xml?
On Tue, Nov 3, 2009 at 12:24 PM, Marc Des Garets
marc.desgar...@192.com wrote:
Hi,
I am experiencing a problem with an index of about 80 million
documents
(41Gb). I am trying to update