Re: Indexing logs files of thousands of GBs

2013-10-30 Thread keshari.prerna
Hello, as suggested by Chris, I am now reading the files from a Java program and creating a SolrInputDocument for each, but I ran into this exception while calling server.add(document). When I tried to increase ramBufferSizeMB, it wouldn't let me set it above 2 GB.
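The 2 GB ceiling is a hard Lucene limit: IndexWriter rejects a RAM buffer of 2048 MB or more, so raising ramBufferSizeMB further is not an option. A minimal solrconfig.xml sketch (the values shown are illustrative, not from the original post):

```xml
<!-- solrconfig.xml (sketch): keep the RAM buffer under Lucene's 2048 MB cap
     and let document-count flushing bound memory during huge batch loads. -->
<indexConfig>
  <ramBufferSizeMB>1024</ramBufferSizeMB>
  <maxBufferedDocs>100000</maxBufferedDocs>
</indexConfig>
```

With bulk loads this large, committing in batches from the client is usually more effective than pushing the RAM buffer to its maximum.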

Re: Indexing logs files of thousands of GBs

2013-10-30 Thread keshari.prerna
I have set the multipartUploadLimitInKB parameter to 10240 (it was 2048 earlier): multipartUploadLimitInKB=10240. Now it gives the following error for the same files: http://localhost:8983/solr/logsIndexing returned non ok status:500, message:the request was rejected because its size
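A likely cause, assuming a stock solrconfig.xml: the limit is in kilobytes, so 10240 allows only 10 MB per request, and there is a separate formdataUploadLimitInKB that can also reject large posts. A hedged sketch of the relevant section (the 2 GB values are illustrative):

```xml
<!-- solrconfig.xml (sketch): both upload limits are in KB, so
     10240 = 10 MB. Raise both if whole log files are posted at once. -->
<requestDispatcher>
  <requestParsers enableRemoteStreaming="true"
                  multipartUploadLimitInKB="2097152"
                  formdataUploadLimitInKB="2097152"/>
</requestDispatcher>
```

For files in the GB range, streaming or splitting the input is usually preferable to raising these limits indefinitely.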

Indexing logs files of thousands of GBs

2013-10-22 Thread keshari.prerna
Hello, I tried to index log files (all plain-text data) stored on the file system. The data can be as large as 1000 GB or more. I am working on Windows. A sample file can be found at https://www.dropbox.com/s/mslwwnme6om38b5/batkid.glnxa64.66441 I tried using FileListEntityProcessor with
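A typical DataImportHandler setup for this crawls the directory with FileListEntityProcessor and reads each file with a nested entity. A sketch of a data-config.xml, assuming plain-text logs under C:\logs (paths and field names are illustrative, not from the original post):

```xml
<!-- data-config.xml (sketch): outer entity lists files, inner entity
     reads each file's text into a "content" field. -->
<dataConfig>
  <dataSource type="FileDataSource" name="fds" encoding="UTF-8"/>
  <document>
    <entity name="files" processor="FileListEntityProcessor"
            baseDir="C:\logs" fileName=".*" recursive="true"
            rootEntity="false">
      <entity name="file" processor="PlainTextEntityProcessor"
              dataSource="fds" url="${files.fileAbsolutePath}">
        <field column="plainText" name="content"/>
      </entity>
    </entity>
  </document>
</dataConfig>
```

Note that PlainTextEntityProcessor loads a whole file into one field; for multi-GB files, LineEntityProcessor (one document per line) or an external client program is usually the safer route.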

Re: Replace NULL with 0 while Indexing

2013-10-15 Thread keshari.prerna
Thank you everyone, I think COALESCE(duration, 0) and ISNULL will solve my problem. -- View this message in context: http://lucene.472066.n3.nabble.com/Replace-NULL-with-0-while-Indexing-tp4095059p4095678.html Sent from the Solr - User mailing list archive at Nabble.com.

Replace NULL with 0 while Indexing

2013-10-11 Thread keshari.prerna
Hello, one of my index fields has NULL values, and I want them replaced with 0 at indexing time, so that searches after indexing return 0 instead of NULL. This is my data-config.xml; duration is the field with the NULL values. dataConfig dataSource type=JdbcDataSource
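As the follow-up in this thread suggests, the substitution can be done in the DIH SQL query itself with COALESCE. A sketch of a data-config.xml, with assumed driver, connection URL, and table names:

```xml
<!-- data-config.xml (sketch): COALESCE(duration, 0) makes the database
     return 0 wherever duration is NULL, so Solr never sees the NULL. -->
<dataConfig>
  <dataSource type="JdbcDataSource" driver="com.mysql.jdbc.Driver"
              url="jdbc:mysql://localhost/mydb" user="user" password="pass"/>
  <document>
    <entity name="item"
            query="SELECT id, COALESCE(duration, 0) AS duration FROM items">
      <field column="id" name="id"/>
      <field column="duration" name="duration"/>
    </entity>
  </document>
</dataConfig>
```

ISNULL(duration, 0) is the SQL Server equivalent; COALESCE is the portable form.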