From: Mike Klaas mike.kl...@gmail.com
To: solr-user@lucene.apache.org
Sent: Monday, November 19, 2007 5:50:19 PM
Subject: Re: Any tips for indexing large amounts of data?
There should be some slowdown in larger indices as occasionally large
segment merge operations must occur. However, this shouldn't really
affect overall speed too much.
Otis
--
Sematext -- http://sematext.com/ -- Lucene - Solr - Nutch
Sent: Wednesday, November 21, 2007 1:24:05 PM
Subject: Re: Any tips for indexing large amounts of data?
Hi Otis,
Thanks for this. Are you using a flavor of Linux, and is it 64-bit? How
much heap are you giving your JVM?
Thanks again
Brendan
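(For context on the heap question: heap size is set with JVM flags when starting Solr. The figures below are purely illustrative, not a recommendation from this thread.)

```sh
# Illustrative only: start the Solr webapp with an explicit heap range
java -Xms512m -Xmx2048m -jar start.jar
```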
On Nov 21, 2007, at 2:03 AM, Otis Gospodnetic wrote:
Mike is right about
- Original Message
From: Eswar K [EMAIL PROTECTED]
To: solr-user@lucene.apache.org
Sent: Wednesday, November 21, 2007 2:11:07 AM
Subject: Re: Any tips for indexing large amounts of data?
Hi Otis,
I understand this is a slightly off-track question, but I am just curious
to know the performance of search on a 20 GB index file. What has been
your observation?
Regards,
Eswar
On Nov 21, 2007 12:33 PM, Otis Gospodnetic wrote:
Hi,
Thanks for answering this question a while back. I have made some of
the suggestions you mentioned, i.e. not committing until I've finished
indexing. What I am seeing, though, is that as the index gets larger
(around 1GB), indexing takes a lot longer. In fact it slows down to a
crawl.
There should be some slowdown in larger indices as occasionally large
segment merge operations must occur. However, this shouldn't really
affect overall speed too much.
You haven't really given us enough data to tell you anything useful.
I would recommend trying to do the indexing via a
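(For reference, merge frequency in this era of Solr is tuned in solrconfig.xml. The values below are only an illustrative sketch of the relevant settings, not a recommendation from this thread.)

```xml
<!-- solrconfig.xml, inside the indexDefaults section (illustrative values):
     a higher mergeFactor means merges happen less often, at the cost of
     carrying more segments between merges -->
<indexDefaults>
  <mergeFactor>10</mergeFactor>
  <maxBufferedDocs>1000</maxBufferedDocs>
</indexDefaults>
```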
Thanks so much for your suggestions. I am attempting to index 550K
docs at once, but have found I've had to break them up into smaller
batches. Indexing seems to stop at around 47K docs (the index reaches
264M in size at this point). The index itself eventually grows to
about 2Gb. I am
to faceted browse.
Jeryl Cook
To: solr-user@lucene.apache.org
From: [EMAIL PROTECTED]
Subject: Any tips for indexing large amounts of data?
Date: Wed, 31 Oct 2007 10:30:50 -0400
Hi,
I am creating an index of approx 500K documents. I wrote an indexing
program using embedded Solr: http://wiki.apache.org/solr/EmbeddedSolr
and am seeing probably a 10-fold increase in indexing speeds. My
problem, though, is that if I try to reindex say 20K docs at a time it
slows down.
Greetings Brendan,
In the solrconfig.xml file, under the updateHandler, is an autoCommit
statement. It looks like:
<autoCommit>
  <maxDocs>1000</maxDocs>
  <maxTime>1000</maxTime>
</autoCommit>
I would think you would see better performance by allowing auto commit to
handle the commit size instead of reopening the connection all the time.
: currently batch my updates in lots of 100 and between batches I close and
: reopen the connection to solr like so:
: private void closeConnection() {
: solrCore.close();
: solrCore = null;
: logger.debug("Closed solr connection");
: }
:
: Does anyone have any
: I would think you would see better performance by allowing auto commit
: to handle the commit size instead of reopening the connection all the
: time.
if your goal is fast indexing, don't use autoCommit at all ... just
index everything, and don't commit until you are completely done.
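A minimal sketch of that pattern, in plain Java (the `indexBatch` and `commit` bodies here are placeholders for whatever embedded-Solr or HTTP calls you actually use, not real Solr APIs): send the documents in batches, and issue exactly one commit when everything is done.

```java
import java.util.ArrayList;
import java.util.List;

/** Sketch: bulk indexing with a single commit at the very end.
    indexBatch() and commit() are placeholders, not Solr APIs. */
public class BulkIndexer {
    static int batchesSent = 0;
    static int commits = 0;

    static void indexBatch(List<String> docs) { batchesSent++; }
    static void commit() { commits++; }

    static void indexAll(List<String> docs, int batchSize) {
        for (int i = 0; i < docs.size(); i += batchSize) {
            // send one batch of up to batchSize documents
            indexBatch(docs.subList(i, Math.min(i + batchSize, docs.size())));
        }
        commit(); // one commit, only once indexing is completely done
    }

    public static void main(String[] args) {
        List<String> docs = new ArrayList<>();
        for (int i = 0; i < 550; i++) docs.add("doc" + i);
        indexAll(docs, 100);
        System.out.println(batchesSent + " batches, " + commits + " commit");
    }
}
```

The point of the structure is simply that `commit()` sits outside the loop; batching keeps memory bounded while the deferred commit avoids the per-commit overhead described above.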