Hoss: You're right. It has a version byte written first. This can be
used for any changes that come later. So, when we introduce any
change to the format we can rely on that. If/when we upgrade the
format we must ensure that it is backward compatible.
The format can be used by SolrJ clients as
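The version-byte idea described above can be sketched in Java as follows. This is an illustrative sketch only, not Solr's actual javabin codec; the class name, the `VERSION` constant, and the frame/unframe helpers are all hypothetical:

```java
import java.util.Arrays;

public class VersionedFormat {
    // Hypothetical version constant; Solr's real javabin codec defines its own.
    static final byte VERSION = 1;

    // Prepend the version byte so future format changes can be detected up front.
    static byte[] frame(byte[] payload) {
        byte[] out = new byte[payload.length + 1];
        out[0] = VERSION;
        System.arraycopy(payload, 0, out, 1, payload.length);
        return out;
    }

    // Reject streams written by an unknown (e.g. newer, incompatible) version.
    static byte[] unframe(byte[] framed) {
        if (framed.length == 0 || framed[0] != VERSION) {
            throw new IllegalArgumentException("unsupported format version");
        }
        return Arrays.copyOfRange(framed, 1, framed.length);
    }

    public static void main(String[] args) {
        byte[] framed = frame("hello".getBytes());
        System.out.println(new String(unframe(framed))); // prints "hello"
    }
}
```

Because the version byte is read before anything else, a reader can fail fast on a format it does not understand instead of misparsing the payload.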
Wow, did you notice this:
[row,col]:[1,920837]
Does that strike you as a very looong line? I wonder if you are hitting
some kind of XML parser limit or bug.
Otis
--
Sematext -- http://sematext.com/ -- Lucene - Solr - Nutch
- Original Message
> From: Gaku Mak <[EMAIL PROTECTED]>
Hi all,
I have 2 Solr masters that are only doing indexing and no querying (the index
is replicated to a slave for searching). I noticed that the masters would
occasionally give the "javax.xml.stream.XMLStreamException: ParseError"
error and continue to work, until they started to give a lot of these and
Can I use RAMDirectory in Solr?
Thanks,
S
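For what it's worth, later Solr versions expose a RAM-backed index through the `directoryFactory` setting in solrconfig.xml; whether this is available depends on the Solr version in use, so treat the snippet below as a sketch to check against your release's docs:

```xml
<!-- solrconfig.xml: hold the entire index in heap memory.
     Note: the index is lost on restart and must fit in the JVM heap. -->
<directoryFactory name="DirectoryFactory" class="solr.RAMDirectoryFactory"/>
```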
: This comment for the benefit of who is using distributed search:
: The protocol of communication has been xml for distributed search. For
: a good part of 1.3.
: It is now changed to a custom binary format (SOLR-486 ). So each shard
: participating in a distributed search must be using the same
: Is there any way I can group category information together? So that I
: know the category_id for the specific category_name?
when dealing with stored fields, Solr guarantees that the field values
are returned in the order they were indexed - so if you have a multivalued
"cat_id" field and a mult
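That positional guarantee is what makes it safe to pair up two parallel multivalued fields. A minimal Java sketch of the idea (the field names and values here are hypothetical, not pulled from a real Solr response):

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class CategoryPairing {
    // Pair two parallel multivalued fields by position, relying on Solr's
    // guarantee that stored values come back in index-time order.
    static Map<String, String> pairByPosition(List<String> names, List<String> ids) {
        Map<String, String> byName = new LinkedHashMap<>();
        for (int i = 0; i < names.size(); i++) {
            byName.put(names.get(i), ids.get(i));
        }
        return byName;
    }

    public static void main(String[] args) {
        // Hypothetical values of two parallel multivalued fields on one document
        List<String> catIds   = List.of("10", "20", "30");
        List<String> catNames = List.of("books", "music", "film");

        Map<String, String> byName = pairByPosition(catNames, catIds);
        System.out.println(byName.get("music")); // prints "20"
    }
}
```

The zip-by-index only works if both fields were always added in lockstep at index time; a missing value in one field shifts every later pairing.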
: Thanks for your suggestions. I have now tried installing Solr on two
: different machines. On one machine I installed the Ubuntu solr-tomcat5.5
: package, and on the other I simply dropped "solr.war"
: into /var/lib/tomcat5.5/webapps
This may be a silly question, but did you create a "solr home"
On Sun, Jun 1, 2008 at 11:39 AM, Kevin Xiao <[EMAIL PROTECTED]> wrote:
> Hi,
>
> We have a large amount of data to index; the index size is about 40 G and getting bigger. We
> are trying to figure out a way to speed up indexing.
>
> 1. Single solr server, multiple indexers, which will speed up document
> pars
Hi,
We have a large amount of data to index; the index size is about 40 G and getting bigger. We
are trying to figure out a way to speed up indexing.
1. Single solr server, multiple indexers, which will speed up document
parsing time, but I am not sure if single solr server can handle multiple
requests wit
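Option 1 above (one Solr server fed by several concurrent indexers) can be sketched with a plain thread pool. This is a hedged illustration only: `parseAndSend` is a hypothetical stand-in for the document-parsing and HTTP-update work, not a SolrJ API call:

```java
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

public class ParallelIndexer {
    static final AtomicInteger indexed = new AtomicInteger();

    // Hypothetical stand-in: parse one document and POST it to the single Solr master.
    static void parseAndSend(String doc) {
        // ... XML/JSON parsing and the HTTP update request would go here ...
        indexed.incrementAndGet();
    }

    public static void main(String[] args) throws InterruptedException {
        List<String> docs = List.of("doc1", "doc2", "doc3", "doc4");

        // Several indexer threads parse in parallel; the server sees concurrent updates.
        ExecutorService pool = Executors.newFixedThreadPool(4);
        for (String d : docs) {
            pool.submit(() -> parseAndSend(d));
        }
        pool.shutdown();
        pool.awaitTermination(1, TimeUnit.MINUTES);
        System.out.println("indexed " + indexed.get() + " docs"); // prints "indexed 4 docs"
    }
}
```

Parallelism here only helps with client-side parsing and network overlap; the server's own update handling and commit behavior still bound the total throughput.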
On Sat, May 31, 2008 at 1:03 AM, Chris Hostetter
<[EMAIL PROTECTED]> wrote:
> The only reason *any* existing solr params use comma or white space
> seperated lists is because way, way, WAY back in the day [...]
I dunno... for something like "fl", it still seems a bit verbose to
list every field s
On Sun, Jun 1, 2008 at 5:20 AM, Gaku Mak <[EMAIL PROTECTED]> wrote:
[...]
> I also have some test script to query against the slave server; however,
> whenever a snapinstall runs, an OOM occurs and the server is not very
> responsive (even with autowarming disabled). After a while (like a couple
> mi
On Sun, Jun 1, 2008 at 4:43 AM, Gaku Mak <[EMAIL PROTECTED]> wrote:
> I have tried Yonik's suggestions with the following:
> 1) all autowarming is off
> 2) commented out the firstSearcher and newSearcher event handlers
> 3) increased autocommit interval to 600 docs and 30 minutes (previously 50
> docs a
Hi solr devs/users,
I'm setting up a Solr slave test server now (dual core with 2GB RAM), and I'm
running into some issues with replication. The index being replicated is
about 10G in size now, and snappuller and snapinstaller are
scheduled to run every 30 minutes using crontab.
Upon runn
Hi Yonik and other solr dev/users,
I have tried Yonik's suggestions with the following:
1) all autowarming is off
2) commented out the firstSearcher and newSearcher event handlers
3) increased autocommit interval to 600 docs and 30 minutes (previously 50
docs and 5 minutes)
In addition, I updated the