Retrieving indexed field data

2010-05-03 Thread Licinio Fernández Maurelo
Hi folks,

I'm wondering if there is a way to retrieve the indexed data. The reason is
that I'm working on a SolrJ-based tool that copies the data from one index into
another (allowing you to make changes to docs along the way). I know I can't
change an indexed field in place; I just want to copy the raw chunk of bytes.

Am I missing something? Is there really no way to retrieve the data generated
at indexing time?

Thanks in advance.

-- 
Lici
~Java Developer~
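For what it's worth, only *stored* fields can be read back out of a Solr index; data that is indexed but not stored (the analyzed tokens) is not retrievable through the search interfaces. A minimal sketch of dumping the stored fields of every document over plain HTTP, which a copy tool can then re-add to the target core (host, core name, and paging values are placeholders):

```java
// Sketch: reading back *stored* field data from a Solr core over HTTP.
// Indexed-but-not-stored fields cannot be recovered this way; only fields
// marked stored="true" in schema.xml come back in search results.
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.URL;

public class StoredFieldDump {

    // Pure helper: build a /select URL that asks for every stored field (fl=*).
    static String dumpUrl(String coreUrl, int start, int rows) {
        return coreUrl + "/select?q=*:*&fl=*&start=" + start + "&rows=" + rows;
    }

    public static void main(String[] args) throws Exception {
        // Placeholder host/core -- adjust for your installation.
        String url = dumpUrl("http://localhost:8983/solr/core0", 0, 100);
        System.out.println(url);
        // Uncomment to actually page through the index:
        // BufferedReader in = new BufferedReader(
        //         new InputStreamReader(new URL(url).openStream(), "UTF-8"));
        // for (String line; (line = in.readLine()) != null; ) System.out.println(line);
    }
}
```

SolrJ's query API does the equivalent in Java; either way, what comes back is limited to the stored representation.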


Some questions on solr replication backup feature

2010-02-04 Thread Licinio Fernández Maurelo
Hi folks,

As we're moving to Solr 1.4 replication, I want to know more about backups.

Questions
-

1. Which properties can be set to configure this feature? (I only know
backupAfter.)
2. Is it an incremental backup or a full index snapshot?

Thx

-- 
Lici
~Java Developer~


Re: Some questions on solr replication backup feature

2010-02-04 Thread Licinio Fernández Maurelo
I've made a backup request to my local Solr server. It works, but can I
set the snapshot directory path?

On 4 February 2010, at 16:54, Licinio Fernández Maurelo
licinio.fernan...@gmail.com wrote:

 Hi folks,

 As we're moving to Solr 1.4 replication, I want to know more about backups.

 Questions
 -

 1. Which properties can be set to configure this feature? (I only know
 backupAfter.)
 2. Is it an incremental backup or a full index snapshot?

 Thx

 --
 Lici
 ~Java Developer~




-- 
Lici
~Java Developer~
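For the record, a backup can also be triggered on demand through the replication handler. In 1.4 I believe the snapshot always lands in snapshot.&lt;timestamp&gt; under the core's data directory, and the target path is not configurable in that release (a configurable location appeared in later versions). A hedged sketch of firing the request (core URL is a placeholder):

```java
// Sketch: triggering a replication backup over HTTP (Solr 1.4 replication
// handler). The snapshot is written as snapshot.<timestamp> under the core's
// data dir; as far as I know the directory is not configurable in 1.4.
import java.net.HttpURLConnection;
import java.net.URL;

public class TriggerBackup {

    // Pure helper: build the backup command URL for a core.
    static String backupUrl(String coreUrl) {
        return coreUrl + "/replication?command=backup";
    }

    public static void main(String[] args) throws Exception {
        String url = backupUrl("http://localhost:8983/solr/core0"); // placeholder
        System.out.println(url);
        // Uncomment to actually queue the backup:
        // HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
        // System.out.println("HTTP " + conn.getResponseCode());
    }
}
```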


Mavenizing solr webapp

2010-01-28 Thread Licinio Fernández Maurelo
Hi everybody. I'm trying to build the apache-solr *webapp* (not the whole
project) using Maven. I also want to reuse the existing build.xml Ant file.

The directory structure is:

+build
+client
+contrib
.
+src
 +webapp/src --webapp code
+dist --generated artifacts by the ant script
  --must be copied to the webapp WEB-INF/lib
  --some of them are also needed for webapp code compilation

I've successfully called the ant target that compiles and populates the dist
dir.

What I need is to:

1) Put some of the jars from the dist directory on the compile classpath for
the webapp code.

2) Package some of those jars into the war artifact that Maven builds.

Any help would be much appreciated


-- 
Lici
~Java Developer~
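One possible approach, sketched under the assumption that each needed jar from dist/ is first pushed into the local repository with mvn install:install-file: once installed, a plain dependency both puts the jar on the compile classpath and gets packaged into WEB-INF/lib by the war plugin, covering both points above. The coordinates below are placeholders:

```xml
<!-- Hypothetical pom.xml fragment: after installing a dist/ jar into the
     local repository, depend on it like any other artifact so it is both
     compiled against and packaged into WEB-INF/lib. -->
<dependency>
  <groupId>org.apache.solr</groupId>
  <artifactId>solr-common</artifactId>
  <version>1.4.0</version>
</dependency>
```

Note that a system-scoped dependency with a systemPath into dist/ would also satisfy compilation, but system-scoped jars are not packaged into the war.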


Field collapsing patch error

2010-01-19 Thread Licinio Fernández Maurelo
Hi folks,

I've downloaded the Solr 1.4 release and tried to apply the latest field
collapsing patch I found:
https://issues.apache.org/jira/secure/attachment/12428902/SOLR-236.patch

It fails with these errors:

d...@backend05:~/workspace/solr-release-1.4.0$ patch -p0 -i SOLR-236.patch
patching file src/test/test-files/solr/conf/solrconfig-fieldcollapse.xml
patching file src/test/test-files/solr/conf/schema-fieldcollapse.xml
patching file src/test/test-files/solr/conf/solrconfig.xml
patching file src/test/test-files/fieldcollapse/testResponse.xml
patching file src/test/org/apache/solr/search/fieldcollapse/FieldCollapsingIntegrationTest.java
patching file src/test/org/apache/solr/search/fieldcollapse/DistributedFieldCollapsingIntegrationTest.java
patching file src/test/org/apache/solr/search/fieldcollapse/NonAdjacentDocumentCollapserTest.java
patching file src/test/org/apache/solr/search/fieldcollapse/AdjacentCollapserTest.java
patching file src/test/org/apache/solr/handler/component/CollapseComponentTest.java
patching file src/test/org/apache/solr/client/solrj/response/FieldCollapseResponseTest.java
patching file src/java/org/apache/solr/search/DocSetAwareCollector.java
patching file src/java/org/apache/solr/search/fieldcollapse/CollapseGroup.java
patching file src/java/org/apache/solr/search/fieldcollapse/DocumentCollapseResult.java
patching file src/java/org/apache/solr/search/fieldcollapse/DocumentCollapser.java
patching file src/java/org/apache/solr/search/fieldcollapse/collector/CollapseCollectorFactory.java
patching file src/java/org/apache/solr/search/fieldcollapse/collector/DocumentGroupCountCollapseCollectorFactory.java
patching file src/java/org/apache/solr/search/fieldcollapse/collector/aggregate/AverageFunction.java
patching file src/java/org/apache/solr/search/fieldcollapse/collector/aggregate/MinFunction.java
patching file src/java/org/apache/solr/search/fieldcollapse/collector/aggregate/SumFunction.java
patching file src/java/org/apache/solr/search/fieldcollapse/collector/aggregate/MaxFunction.java
patching file src/java/org/apache/solr/search/fieldcollapse/collector/aggregate/AggregateFunction.java
patching file src/java/org/apache/solr/search/fieldcollapse/collector/CollapseContext.java
patching file src/java/org/apache/solr/search/fieldcollapse/collector/DocumentFieldsCollapseCollectorFactory.java
patching file src/java/org/apache/solr/search/fieldcollapse/collector/AggregateCollapseCollectorFactory.java
patching file src/java/org/apache/solr/search/fieldcollapse/collector/CollapseCollector.java
patching file src/java/org/apache/solr/search/fieldcollapse/collector/FieldValueCountCollapseCollectorFactory.java
patching file src/java/org/apache/solr/search/fieldcollapse/collector/AbstractCollapseCollector.java
patching file src/java/org/apache/solr/search/fieldcollapse/AbstractDocumentCollapser.java
patching file src/java/org/apache/solr/search/fieldcollapse/NonAdjacentDocumentCollapser.java
patching file src/java/org/apache/solr/search/fieldcollapse/AdjacentDocumentCollapser.java
patching file src/java/org/apache/solr/search/fieldcollapse/util/Counter.java
patching file src/java/org/apache/solr/search/SolrIndexSearcher.java
patching file src/java/org/apache/solr/search/DocSetHitCollector.java
patching file src/java/org/apache/solr/handler/component/CollapseComponent.java
patching file src/java/org/apache/solr/handler/component/QueryComponent.java
Hunk #1 FAILED at 522.
1 out of 1 hunk FAILED -- saving rejects to file src/java/org/apache/solr/handler/component/QueryComponent.java.rej
patching file src/java/org/apache/solr/util/DocSetScoreCollector.java
patching file src/common/org/apache/solr/common/params/CollapseParams.java
patching file src/solrj/org/apache/solr/client/solrj/SolrQuery.java
Hunk #1 FAILED at 17.
Hunk #2 FAILED at 50.
Hunk #3 FAILED at 76.
Hunk #4 FAILED at 148.
Hunk #5 FAILED at 197.
Hunk #6 succeeded at 510 (offset -155 lines).
Hunk #7 succeeded at 566 (offset -155 lines).
5 out of 7 hunks FAILED -- saving rejects to file src/solrj/org/apache/solr/client/solrj/SolrQuery.java.rej
patching file src/solrj/org/apache/solr/client/solrj/response/QueryResponse.java
Hunk #1 succeeded at 17 with fuzz 1.
Hunk #2 FAILED at 42.
Hunk #3 FAILED at 58.
Hunk #4 succeeded at 117 with fuzz 2 (offset -8 lines).
Hunk #5 succeeded at 315 with fuzz 2 (offset 17 lines).
2 out of 5 hunks FAILED -- saving rejects to file src/solrj/org/apache/solr/client/solrj/response/QueryResponse.java.rej
patching file src/solrj/org/apache/solr/client/solrj/response/FieldCollapseResponse.java

Any ideas?

-- 
Lici
~Java Developer~


Re: solr perf

2009-12-21 Thread Licinio Fernández Maurelo
Not bad advice ;-)

2009/12/20 Walter Underwood wun...@wunderwood.org

 Here is an idea. Don't make one core per user.  Use a field with a user id.

 wunder

 On Dec 20, 2009, at 12:38 PM, Matthieu Labour wrote:

  Hi
  I have a Solr instance in which I created 700 cores, one core per user of my application.
  The total size of the data indexed on disk is 35GB, with Solr cores ranging from 100KB and a few documents to 1.2GB and 50,000 documents.
  Searching seems very slow, and indexing as well.
  This is running on an EC2 extra-large instance (6 CPUs, 15GB memory, RAID0 disk).
  I would appreciate it if anybody has some tips, articles, etc. on what to do to understand and improve performance.
  Thank you




-- 
Lici
~Java Developer~


Calculate term vector

2009-12-21 Thread Licinio Fernández Maurelo
Hi folks,

How can I get the term vectors for a custom Solr query via an HTTP request? Is
this possible?

-- 
Lici
~Java Developer~
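If the TermVectorComponent is what you're after, it is exposed over plain HTTP once registered on a request handler; the example solrconfig in 1.4 wires it to /tvrh. A sketch of the request, assuming that handler path and a field indexed with termVectors="true" (host, handler, and field name are assumptions about your setup):

```java
// Sketch: requesting term vectors over HTTP via the TermVectorComponent.
// Assumes the component is registered on a handler (e.g. /tvrh in the
// example solrconfig) and the field was indexed with termVectors="true".
public class TermVectorUrl {

    // Pure helper: build a term-vector request for the docs matching q.
    static String tvUrl(String coreUrl, String q, String field) {
        return coreUrl + "/tvrh?q=" + q
                + "&tv=true&tv.tf=true&tv.df=true&tv.fl=" + field;
    }

    public static void main(String[] args) {
        System.out.println(tvUrl("http://localhost:8983/solr/core0", "*:*", "text"));
    }
}
```

tv.positions=true and tv.offsets=true can be added the same way when that extra detail was stored.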


Re: Solr Configuration Management

2009-12-10 Thread Licinio Fernández Maurelo
Hi there,

As far as I know, there are two Solr configuration properties files,
dataimport.properties and solrcore.properties.

What I want is to make the Solr app use my own property definition
files. Is there any global bean (per core) which holds this data? Where in
the code are the properties set? Is there any open JIRA issue for allowing
global core configuration (dataimport and core)?

Thx

On 10 December 2009, at 16:35, Licinio Fernández Maurelo
licinio.fernan...@gmail.com wrote:



 -- Forwarded message --
 From: Licinio Fernández Maurelo licinio.fernan...@gmail.com
 Date: 27 October 2009, 09:50
 Subject: Re: Solr Configuration Management
 To: solr-user@lucene.apache.org, noble.p...@gmail.com



  are you referring to DIH?
 yes

 2009/10/27 Noble Paul നോബിള്‍ नोब्ळ् noble.p...@corp.aol.com

 2009/10/26 Licinio Fernández Maurelo licinio.fernan...@gmail.com:
  Hi there,
 
  i must enhance solr config deploys.
 
  I have a configuration file per environment and per role (Master-Slave)
 so i
  want to separate DataSource definitions from the solrconfig.xml . Where
 can
  i put them?
 are you referring to DIH?

 
  Same behaviour is desired for Master-Slave conf diffs.
 you can drop all your custom properties into a solrcore.properties
 file (placed in the conf dir) and have different properties files for
 master and slave. These properties can be referred to directly from
 solrconfig
 
  Any help would be much appreciated ...
 
 
  --
  Lici
 



 --
 -
 Noble Paul | Principal Engineer| AOL | http://aol.com




 --
 Lici



 --
 Lici
 ~Java Developer~




-- 
Lici
~Java Developer~
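For what it's worth, the solrcore.properties approach quoted above can look like this: one per-role properties file in each core's conf dir, with solrconfig.xml referring to the values as ${db.url} and so on. The property names below are made up for illustration:

```properties
# Hypothetical conf/solrcore.properties for a master core
enable.master=true
enable.slave=false
db.url=jdbc:mysql://dbhost/catalog
```

A slave deployment ships the same solrconfig.xml with its own properties file, so the environment/role differences stay out of the main config.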


Commit error

2009-11-11 Thread Licinio Fernández Maurelo
Hi folks,

I'm getting this error while committing after a dataimport of only 12 docs!

Exception while solr commit.
java.io.IOException: background merge hit exception: _3kta:C2329239
_3ktb:c11-_3ktb into _3ktc [optimize] [mergeDocStores]
at org.apache.lucene.index.IndexWriter.optimize(IndexWriter.java:2829)
at org.apache.lucene.index.IndexWriter.optimize(IndexWriter.java:2750)
at
org.apache.solr.update.DirectUpdateHandler2.commit(DirectUpdateHandler2.java:401)
at
org.apache.solr.update.processor.RunUpdateProcessor.processCommit(RunUpdateProcessorFactory.java:85)
at
org.apache.solr.update.processor.LogUpdateProcessor.processCommit(LogUpdateProcessorFactory.java:138)
at
org.apache.solr.update.processor.UpdateRequestProcessor.processCommit(UpdateRequestProcessor.java:66)
at org.apache.solr.handler.dataimport.SolrWriter.commit(SolrWriter.java:170)
at org.apache.solr.handler.dataimport.DocBuilder.finish(DocBuilder.java:208)
at
org.apache.solr.handler.dataimport.DocBuilder.execute(DocBuilder.java:185)
at
org.apache.solr.handler.dataimport.DataImporter.doFullImport(DataImporter.java:333)
at
org.apache.solr.handler.dataimport.DataImporter.runCmd(DataImporter.java:393)
at
org.apache.solr.handler.dataimport.DataImporter$1.run(DataImporter.java:372)
Caused by: java.io.IOException: No space left on device
at java.io.RandomAccessFile.writeBytes(Native Method)
at java.io.RandomAccessFile.write(RandomAccessFile.java:499)
at
org.apache.lucene.store.SimpleFSDirectory$SimpleFSIndexOutput.flushBuffer(SimpleFSDirectory.java:191)
at
org.apache.lucene.store.BufferedIndexOutput.flushBuffer(BufferedIndexOutput.java:96)
at
org.apache.lucene.store.BufferedIndexOutput.flush(BufferedIndexOutput.java:85)
at
org.apache.lucene.store.BufferedIndexOutput.writeBytes(BufferedIndexOutput.java:75)
at org.apache.lucene.store.IndexOutput.writeBytes(IndexOutput.java:45)
at
org.apache.lucene.index.CompoundFileWriter.copyFile(CompoundFileWriter.java:229)
at
org.apache.lucene.index.CompoundFileWriter.close(CompoundFileWriter.java:184)
at
org.apache.lucene.index.SegmentMerger.createCompoundFile(SegmentMerger.java:217)
at org.apache.lucene.index.IndexWriter.mergeMiddle(IndexWriter.java:5089)
at org.apache.lucene.index.IndexWriter.merge(IndexWriter.java:4589)
at
org.apache.lucene.index.ConcurrentMergeScheduler.doMerge(ConcurrentMergeScheduler.java:235)
at
org.apache.lucene.index.ConcurrentMergeScheduler$MergeThread.run(ConcurrentMergeScheduler.java:291)

Index info: 2,600,000 docs | 11GB size
System info: 15GB free disk space

When attempting to commit, the disk usage increases until Solr breaks ... it
looks like 15GB is not enough space to do the merge/optimize.

Any advice?

-- 
Lici


Re: Commit error

2009-11-11 Thread Licinio Fernández Maurelo
Thanks Israel, I've done a successful import using optimize=false

2009/11/11 Israel Ekpo israele...@gmail.com

 2009/11/11 Licinio Fernández Maurelo licinio.fernan...@gmail.com

  Hi folks,
 
  i'm getting this error while committing after a dataimport of only 12
 docs
  !!!
 
  Exception while solr commit.
  java.io.IOException: background merge hit exception: _3kta:C2329239
  _3ktb:c11-_3ktb into _3ktc [optimize] [mergeDocStores]
  at org.apache.lucene.index.IndexWriter.optimize(IndexWriter.java:2829)
  at org.apache.lucene.index.IndexWriter.optimize(IndexWriter.java:2750)
  at
 
 
 org.apache.solr.update.DirectUpdateHandler2.commit(DirectUpdateHandler2.java:401)
  at
 
 
 org.apache.solr.update.processor.RunUpdateProcessor.processCommit(RunUpdateProcessorFactory.java:85)
  at
 
 
 org.apache.solr.update.processor.LogUpdateProcessor.processCommit(LogUpdateProcessorFactory.java:138)
  at
 
 
 org.apache.solr.update.processor.UpdateRequestProcessor.processCommit(UpdateRequestProcessor.java:66)
  at
  org.apache.solr.handler.dataimport.SolrWriter.commit(SolrWriter.java:170)
  at
  org.apache.solr.handler.dataimport.DocBuilder.finish(DocBuilder.java:208)
  at
 
 org.apache.solr.handler.dataimport.DocBuilder.execute(DocBuilder.java:185)
  at
 
 
 org.apache.solr.handler.dataimport.DataImporter.doFullImport(DataImporter.java:333)
  at
 
 
 org.apache.solr.handler.dataimport.DataImporter.runCmd(DataImporter.java:393)
  at
 
 
 org.apache.solr.handler.dataimport.DataImporter$1.run(DataImporter.java:372)
  Caused by: java.io.IOException: No space left on device
  at java.io.RandomAccessFile.writeBytes(Native Method)
  at java.io.RandomAccessFile.write(RandomAccessFile.java:499)
  at
 
 
 org.apache.lucene.store.SimpleFSDirectory$SimpleFSIndexOutput.flushBuffer(SimpleFSDirectory.java:191)
  at
 
 
 org.apache.lucene.store.BufferedIndexOutput.flushBuffer(BufferedIndexOutput.java:96)
  at
 
 
 org.apache.lucene.store.BufferedIndexOutput.flush(BufferedIndexOutput.java:85)
  at
 
 
 org.apache.lucene.store.BufferedIndexOutput.writeBytes(BufferedIndexOutput.java:75)
  at org.apache.lucene.store.IndexOutput.writeBytes(IndexOutput.java:45)
  at
 
 
 org.apache.lucene.index.CompoundFileWriter.copyFile(CompoundFileWriter.java:229)
  at
 
 
 org.apache.lucene.index.CompoundFileWriter.close(CompoundFileWriter.java:184)
  at
 
 
 org.apache.lucene.index.SegmentMerger.createCompoundFile(SegmentMerger.java:217)
  at org.apache.lucene.index.IndexWriter.mergeMiddle(IndexWriter.java:5089)
  at org.apache.lucene.index.IndexWriter.merge(IndexWriter.java:4589)
  at
 
 
 org.apache.lucene.index.ConcurrentMergeScheduler.doMerge(ConcurrentMergeScheduler.java:235)
  at
 
 
 org.apache.lucene.index.ConcurrentMergeScheduler$MergeThread.run(ConcurrentMergeScheduler.java:291)
 
  Index info: 2,600,000 docs | 11GB size
  System info: 15GB free disk space
 
  When attempting to commit, the disk usage increases until Solr breaks ... it
  looks like 15GB is not enough space to do the merge/optimize.
 
  Any advice?
 
  --
  Lici
 


 Hi Licinio,

 During the optimization process, the index size can be approximately
 double what it was originally, and the remaining space on disk may not be
 enough for the task.

 You are describing exactly what could be going on.
 --
 Good Enough is not good enough.
 To give anything less than your best is to sacrifice the gift.
 Quality First. Measure Twice. Cut Once.




-- 
Lici


Re: How to integrate Solr into my project

2009-11-03 Thread Licinio Fernández Maurelo
Hi Caroline,

I think you should start with an overview tour ;-). SolrJ is just a Solr Java
client ...

Some clues:


   - Define your own index schema (http://wiki.apache.org/solr/SchemaXml);
   it's just like a SQL DDL.
   - There are different ways to put docs into your index:
  - SolrJ (the Solr client for Java environments)
  - DIH, the Data Import Handler
  (http://wiki.apache.org/solr/DataImportHandler); this one is preferred
  when doing a huge data import from DBs, and many source formats are
  supported.
   - Try to perform queries over your fancy new index ;-). Learn about the
   search syntax and faceting
   (http://wiki.apache.org/solr/SolrFacetingOverview).
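The "put docs in your index" step above can be sketched without any client library at all: Solr accepts XML documents POSTed to /update, and SolrJ wraps exactly this (plus a binary format) behind a Java API. Field names and host are placeholders:

```java
// Sketch: the raw XML that indexing a document boils down to. SolrJ builds
// and sends the equivalent for you via SolrInputDocument + server.add().
public class AddDocXml {

    // Build a minimal <add> payload for one document. Values must be
    // XML-escaped in real use; omitted here for brevity.
    static String addXml(String id, String title) {
        return "<add><doc>"
                + "<field name=\"id\">" + id + "</field>"
                + "<field name=\"title\">" + title + "</field>"
                + "</doc></add>";
    }

    public static void main(String[] args) {
        System.out.println(addXml("1", "hello"));
        // POST this payload to http://localhost:8983/solr/update
        // (Content-type: text/xml), then POST <commit/> to make it searchable.
    }
}
```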






2009/11/3 Caroline Tan caroline@gmail.com

 Ya, it's a Java project. I just browsed the site you suggested:
 http://wiki.apache.org/solr/Solrj

 Which means, if I declare the dependency on the solr-solrj and solr-core jars,
 have those jars added to my project lib, and follow the SolrJ tutorial,
 I will be able to index even a DB table into Solr as well? Thanks

 ~caroLine


 2009/11/3 Noble Paul നോബിള്‍ नोब्ळ् noble.p...@corp.aol.com

  is it a java project ?
  did you see this page http://wiki.apache.org/solr/Solrj ?
 
  On Tue, Nov 3, 2009 at 2:25 PM, Caroline Tan caroline@gmail.com
  wrote:
   Hi,
   I wish to integrate Solr into my current working project. I've played
   around with the Solr example and got it started in my Tomcat. But the next
   step is HOW do I integrate that into my working project? You see, Lucene
   provides an API and tutorials on what classes I need to instantiate in
   order to index and search. But Solr seems to be pretty vague on this, as
   it is a working Solr search server. Can anybody help me by stating step by
   step what classes I should look into in order to assimilate Solr
   into my project?
   Thanks.
  
   regards
   ~caroLine
  
 
 
 
  --
  -
  Noble Paul | Principal Engineer| AOL | http://aol.com
 




-- 
Lici


Problems downloading lucene 2.9.1

2009-11-02 Thread Licinio Fernández Maurelo
Hi folks,

As we are using a snapshot dependency on Solr 1.4, today we are getting
problems when Maven tries to download Lucene 2.9.1 (there isn't any 2.9.1
there).

Which repository can I use to download it?

Thx

-- 
Lici


Re: Problems downloading lucene 2.9.1

2009-11-02 Thread Licinio Fernández Maurelo
Thanks guys !!!

2009/11/2 Ryan McKinley ryan...@gmail.com


 On Nov 2, 2009, at 8:29 AM, Grant Ingersoll wrote:


 On Nov 2, 2009, at 12:12 AM, Licinio Fernández Maurelo wrote:

  Hi folks,

 as we are using a snapshot dependency on Solr 1.4, today we are getting
 problems when Maven tries to download Lucene 2.9.1 (there isn't any 2.9.1
 there).

 Which repository can i use to download it?


 They won't be there until 2.9.1 is officially released.  We are trying to
 speed up the Solr release by piggybacking on the Lucene release, but this
 little bit is the one downside.


 Until then, you can add a repo to:

 http://people.apache.org/~mikemccand/staging-area/rc3_lucene2.9.1/maven/





-- 
Lici


Re: Problems downloading lucene 2.9.1

2009-11-02 Thread Licinio Fernández Maurelo
Well, I've solved this problem by executing

mvn install:install-file -DgroupId=org.apache.lucene -DartifactId=lucene-analyzers -Dversion=2.9.1 -Dpackaging=jar -Dfile=path_to_jar

for each lucene-* artifact.

I think there must be an easier way to do this; am I wrong?

Hope it helps

Thx

On 3 November 2009, at 08:03, Licinio Fernández Maurelo
licinio.fernan...@gmail.com wrote:

 Thanks guys !!!

 2009/11/2 Ryan McKinley ryan...@gmail.com


 On Nov 2, 2009, at 8:29 AM, Grant Ingersoll wrote:


 On Nov 2, 2009, at 12:12 AM, Licinio Fernández Maurelo wrote:

  Hi folks,

 as we are using a snapshot dependency on Solr 1.4, today we are getting
 problems when Maven tries to download Lucene 2.9.1 (there isn't any
 2.9.1
 there).

 Which repository can i use to download it?


 They won't be there until 2.9.1 is officially released.  We are trying to
 speed up the Solr release by piggybacking on the Lucene release, but this
 little bit is the one downside.


 Until then, you can add a repo to:

 http://people.apache.org/~mikemccand/staging-area/rc3_lucene2.9.1/maven/





 --
 Lici




-- 
Lici


Re: Solr Configuration Management

2009-10-27 Thread Licinio Fernández Maurelo
 are you referring to DIH?
yes

2009/10/27 Noble Paul നോബിള്‍ नोब्ळ् noble.p...@corp.aol.com

 2009/10/26 Licinio Fernández Maurelo licinio.fernan...@gmail.com:
  Hi there,
 
  i must enhance solr config deploys.
 
  I have a configuration file per environment and per role (Master-Slave)
 so i
  want to separate DataSource definitions from the solrconfig.xml . Where
 can
  i put them?
 are you referring to DIH?

 
  Same behaviour is desired for Master-Slave conf diffs.
 you can drop all your custom properties into a solrcore.properties
 file (placed in the conf dir) and have different properties files for
 master and slave. These properties can be referred to directly from
 solrconfig
 
  Any help would be much appreciated ...
 
 
  --
  Lici
 



 --
 -
 Noble Paul | Principal Engineer| AOL | http://aol.com




-- 
Lici


Solr Configuration Management

2009-10-26 Thread Licinio Fernández Maurelo
Hi there,

I need to improve our Solr config deployments.

I have a configuration file per environment and per role (master/slave), so I
want to separate the DataSource definitions from solrconfig.xml. Where can
I put them?

The same behaviour is desired for the master/slave config diffs.

Any help would be much appreciated ...


-- 
Lici


Re: multicore query via solrJ

2009-10-23 Thread Licinio Fernández Maurelo
As no answer was given, I assume it's not possible. It would be great to have
a method like this:

query(SolrServer, List<SolrServer>)



On 20 October 2009, at 11:21, Licinio Fernández Maurelo
licinio.fernan...@gmail.com wrote:

 Hi there,
 is there any way to perform a multi-core query using solrj?

 P.S.:

 I know about this syntax:
 http://localhost:8983/solr/core0/select?shards=localhost:8983/solr/core0,localhost:8983/solr/core1q=
 but i'm looking for a more fancy way to do this using solrj (something like
 shards(query) )

 thx



 --
 Lici




-- 
Lici
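A multi-core (distributed) query is still a single request: you point it at any one core and pass the others via the shards parameter, and with SolrJ that is simply query.set("shards", ...) on the SolrQuery before executing it. A sketch of building the parameter value (host/core names are placeholders):

```java
// Sketch: building the shards parameter for a distributed query. Shard
// addresses use host:port/path form, without the http:// prefix. With SolrJ:
//   query.set("shards", shardsParam(cores));
import java.util.Arrays;
import java.util.List;

public class ShardsParam {

    // Join shard addresses with commas, as the shards parameter expects.
    static String shardsParam(List<String> cores) {
        StringBuilder sb = new StringBuilder();
        for (String core : cores) {
            if (sb.length() > 0) sb.append(',');
            sb.append(core);
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        List<String> cores = Arrays.asList("localhost:8983/solr/core0",
                                           "localhost:8983/solr/core1");
        System.out.println(shardsParam(cores));
    }
}
```

A convenience wrapper like the query(SolrServer, List&lt;SolrServer&gt;) method suggested above could be built on exactly this.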


Creating cores using SolrJ

2009-10-06 Thread Licinio Fernández Maurelo
Hi there,

I want to create cores using SolrJ, but I also want to create them in a
given dataDir. How can I do this? Looking at the CoreAdminRequest methods, I
only found:


   - createCore(name, instanceDir, server)
   - createCore(name, instanceDir, server, configFile, schemaFile)

Neither of the above methods allows a dataDir param.

Thx

-- 
Lici
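If the SolrJ helpers don't expose it, one fallback is the raw CoreAdmin HTTP call, which I believe accepts a dataDir parameter on CREATE (worth verifying against your Solr version, as it may not be honoured in older releases). A sketch of the request URL (paths and names are placeholders):

```java
// Sketch: a CoreAdmin CREATE call with an explicit dataDir, issued over
// plain HTTP as a workaround when the SolrJ CoreAdminRequest helpers don't
// expose a dataDir setter. Verify the parameter against your Solr version.
public class CreateCoreUrl {

    // Pure helper: build the CREATE command URL (values should be
    // URL-encoded in real use).
    static String createCoreUrl(String solrUrl, String name,
                                String instanceDir, String dataDir) {
        return solrUrl + "/admin/cores?action=CREATE&name=" + name
                + "&instanceDir=" + instanceDir + "&dataDir=" + dataDir;
    }

    public static void main(String[] args) {
        System.out.println(createCoreUrl("http://localhost:8983/solr",
                "core1", "/opt/solr/core1", "/data/solr/core1"));
    }
}
```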


Mapping SolrDoc to SolrInputDoc

2009-09-16 Thread Licinio Fernández Maurelo
Hi there,

Currently I'm working on a small app which creates an embedded Solr server,
reads all documents from one core, and puts those docs into another one.

The purpose of this app is to apply (small) schema.xml changes to indexed
data (offline), resulting in a new index whose documents reflect the
schema.xml changes.

What I want to know is whether there is an easy way to map a SolrDocument to
a SolrInputDocument.

Any help would be much appreciated

-- 
Lici


Re: Mapping SolrDoc to SolrInputDoc

2009-09-16 Thread Licinio Fernández Maurelo
I'll try, thanks Martijn

2009/9/16 Martijn v Groningen martijn.is.h...@gmail.com

 Hi Licinio,

 You can use ClientUtils.toSolrInputDocument(...), that converts a
 SolrDocument to a SolrInputDocument.

 Martijn

 2009/9/16 Licinio Fernández Maurelo licinio.fernan...@gmail.com:
  Hi there,
 
  currently i'm working on a small app which creates an Embedded Solr
 Server,
  reads all documents from one core and puts these docs into another one.
 
  The purpose of this app is to apply (small) changes on schema.xml to
 indexed
  data (offline) resulting a new index with documents updated to schema.xml
  changes.
 
  What i want to know is if there is an easy way to map SolrDoc  to
  SolrInputDoc.
 
  Any help would be much appreciated
 
  --
  Lici
 



 --
 Met vriendelijke groet,

 Martijn van Groningen




-- 
Lici


Dealing with term vectors

2009-09-15 Thread Licinio Fernández Maurelo
Hi there,

I want to retrieve the term vectors from the index, not calculate them, but
just read them back instead.

Some questions about this topic:


   1. When I set termVectors="true" ... what happens behind the scenes?
  1. Is Lucene storing the term vectors in the index?
  2. Is Lucene storing additional info to allow term vector calculation?
   2. Reading the Solr 1.4 Enterprise Search book (amazing book!) I found
   this: "In Solr 1.4, it is now possible to tell Lucene that a field should
   store these for efficient retrieval. Without them, the same information
   can be derived at runtime but that's slower" (p. 286). Does this mean that
   older Solr versions don't come with this functionality?
   3. Can the term vector component expose raw term vectors for fields not
   marked with termVectors="true"?


Thx

-- 
Lici


Logging solr requests

2009-09-02 Thread Licinio Fernández Maurelo
Hi there,

I need to log Solr requests on the fly, filter and transform them, and
finally put them into an index.

Any advice on the best way to implement this behaviour?

Key points:

- I think the use of log files is discouraged, but I don't know if I
can modify Solr's settings to log to a server (via RMI or HTTP)
- I don't want to degrade Solr's response performance

thx
-- 
Lici


JDWP Error

2009-08-26 Thread Licinio Fernández Maurelo
The servlet container (Resin) where I deploy Solr shows:

ERROR: transport error 202: bind failed: Address already in
use

ERROR: JDWP Transport dt_socket failed to initialize,
TRANSPORT_INIT(510)

JDWP exit error AGENT_ERROR_TRANSPORT_INIT(197): No transports
initialized
[../../../src/share/back/debugInit.c:690]

FATAL ERROR in native method: JDWP No transports initialized,
jvmtiError=AGENT_ERROR_TRANSPORT_INIT(197)

ERROR: transport error 202: bind failed: Address already in
use

ERROR: JDWP Transport dt_socket failed to initialize,
TRANSPORT_INIT(510)

JDWP exit error AGENT_ERROR_TRANSPORT_INIT(197): No transports
initialized
[../../../src/share/back/debugInit.c:690]

FATAL ERROR in native method: JDWP No transports initialized,
jvmtiError=AGENT_ERROR_TRANSPORT_INIT(197)


Then, when we want to stop Resin, it doesn't work. Any advice?

thx

-- 
Lici


Re: Replication over multi-core solr

2009-08-19 Thread Licinio Fernández Maurelo
Hi Vivek,
currently we want to add cores dynamically when the active one reaches
some capacity.
Can you give me some hints on how to achieve this functionality? (Just
wondering whether you used shell scripting or coded a 100%
Java-based solution.)

Thx


2009/8/19 Noble Paul നോബിള്‍  नोब्ळ् noble.p...@corp.aol.com:
 On Wed, Aug 19, 2009 at 2:27 AM, vivek sarvivex...@gmail.com wrote:
 Hi,

  We use multi-core setup for Solr, where new cores are added
 dynamically to solr.xml. Only one core is active at a time. My
 question is how can the replication be done for multi-core - so every
 core is replicated on the slave?

 replication does not handle new core creation. You will have to issue
 the core creation command to each slave separately.

 I went over the wiki, http://wiki.apache.org/solr/SolrReplication,
 and few questions related to that,

 1) How do we replicate solr.xml where we have list of cores? Wiki
 says, Only files in the 'conf' dir of solr instance is replicated. 
 - since, solr.xml is in the home directory how do we replicate that?
 solr.xml cannot be replicated. Even if you did, it is not reloaded.

 2) Solrconfig.xml in slave takes a static core url,

     <str name="masterUrl">http://localhost:port/solr/corename/replication</str>

 put a placeholder like
 <str name="masterUrl">http://localhost:port/solr/${solr.core.name}/replication</str>
 so the corename is automatically replaced


 As in our case cores are created dynamically (new core created after
 the active one reaches some capacity), how can we define master core
 dynamically for replication? The only I see it is using fetchIndex
 command and passing new core info there - is it right? If so, does the
 slave application have write code to poll Master periodically and fire
 fetchIndex command, but how would Slave know the Master corename -
 as they are created dynamically on the Master?

 Thanks,
 -vivek




 --
 -
 Noble Paul | Principal Engineer| AOL | http://aol.com




-- 
Lici


Re: CorruptIndexException: Unknown format version

2009-08-19 Thread Licinio Fernández Maurelo
It looks like your Solr lucene-core version doesn't match the
Lucene version used to generate the index; as Yonik said, it looks like
there is a Lucene library conflict.

2009/8/19 Chris Hostetter hossman_luc...@fucit.org:

 : how can that happen, it is a new index, and it is already corrupt?
 :
 : Did anybody else something like this?

 Unknown format version doesn't mean your index is corrupt .. it means
 the version of Lucene parsing the index doesn't recognize the index format
 version ... typically it means you are trying to open an index generated
 by a newer version of Lucene than the one you are using.




 -Hoss





-- 
Lici


Adding cores dynamically

2009-08-19 Thread Licinio Fernández Maurelo
Hi there,

currently we want to add cores dynamically when the active one reaches
some capacity.
Can anyone give me some hints on how to achieve this functionality? (Just
wondering whether you used shell scripting or coded a 100%
Java-based solution.)

Thx


-- 
Lici


Re: Replication over multi-core solr

2009-08-19 Thread Licinio Fernández Maurelo
Ok

2009/8/19 vivek sar vivex...@gmail.com:
 Licinio,

  Please open a separate thread - as it's a different issue - and I can
 respond there.

 -vivek

 2009/8/19 Licinio Fernández Maurelo licinio.fernan...@gmail.com:
 Hi Vivek,
 currently we want to add cores dynamically when the active one reaches
 some capacity,
 can you give me some hints to achieve such this functionality? (Just
 wondering if you have used shell-scripting or you have code some 100%
 Java based solution)

 Thx


 2009/8/19 Noble Paul നോബിള്‍  नोब्ळ् noble.p...@corp.aol.com:
 On Wed, Aug 19, 2009 at 2:27 AM, vivek sarvivex...@gmail.com wrote:
 Hi,

  We use multi-core setup for Solr, where new cores are added
 dynamically to solr.xml. Only one core is active at a time. My
 question is how can the replication be done for multi-core - so every
 core is replicated on the slave?

 replication does not handle new core creation. You will have to issue
 the core creation command to each slave separately.

 I went over the wiki, http://wiki.apache.org/solr/SolrReplication,
 and few questions related to that,

 1) How do we replicate solr.xml where we have list of cores? Wiki
 says, Only files in the 'conf' dir of solr instance is replicated. 
 - since, solr.xml is in the home directory how do we replicate that?
  solr.xml cannot be replicated. Even if you did, it is not reloaded.

 2) Solrconfig.xml in slave takes a static core url,

     <str name="masterUrl">http://localhost:port/solr/corename/replication</str>

  put a placeholder like
  <str name="masterUrl">http://localhost:port/solr/${solr.core.name}/replication</str>
  so the corename is automatically replaced


 As in our case cores are created dynamically (new core created after
 the active one reaches some capacity), how can we define master core
 dynamically for replication? The only I see it is using fetchIndex
 command and passing new core info there - is it right? If so, does the
 slave application have write code to poll Master periodically and fire
 fetchIndex command, but how would Slave know the Master corename -
 as they are created dynamically on the Master?

 Thanks,
 -vivek




 --
 -
 Noble Paul | Principal Engineer| AOL | http://aol.com




 --
 Lici





-- 
Lici


Re: Spanish Stemmer

2009-08-19 Thread Licinio Fernández Maurelo
Hi, take a look at this:

<!-- Field type for text (with Spanish stemming) -->
<fieldtype name="textTypeWithStemming" class="solr.TextField">
  <analyzer type="index">
    <tokenizer class="solr.WhitespaceTokenizerFactory"/>
    <filter class="solr.WordDelimiterFilterFactory"
        generateWordParts="1" generateNumberParts="1" catenateWords="1"
        catenateNumbers="1" catenateAll="0" splitOnCaseChange="1"/>
    <filter class="solr.LowerCaseFilterFactory"/>
    <filter class="solr.StopFilterFactory" ignoreCase="true"
        words="stopwords.txt" enablePositionIncrements="true"/>
    <filter class="solr.SnowballPorterFilterFactory" language="Spanish"/>
  </analyzer>
  <analyzer type="query">
    <tokenizer class="solr.WhitespaceTokenizerFactory"/>
    <filter class="solr.WordDelimiterFilterFactory"
        generateWordParts="1" generateNumberParts="1" catenateWords="1"
        catenateNumbers="1" catenateAll="0" splitOnCaseChange="1"/>
    <filter class="solr.LowerCaseFilterFactory"/>
    <filter class="solr.StopFilterFactory" ignoreCase="true"
        words="stopwords.txt" enablePositionIncrements="true"/>
    <filter class="solr.SnowballPorterFilterFactory" language="Spanish"/>
  </analyzer>
</fieldtype>

Regards

2009/8/19 Robert Muir rcm...@gmail.com:
 hi, it looks like you might just have a simple typo:

  <filter class="solr.SnowballPorterFilterFactory" languange="Spanish"/>

 if you change it to language="Spanish" it should work.


 --
 Robert Muir
 rcm...@gmail.com




-- 
Lici


Aliases for fields

2009-08-18 Thread Licinio Fernández Maurelo
Hello everybody,

can I set an alias for a field? Something like:

<field name="sourceDate" type="uniqueIdType" indexed="true"
    stored="true" multiValued="false" termVectors="false"
    alias="source.date"/>

is there any related JIRA issue?

Thx

-- 
Lici


Re: Aliases for fields

2009-08-18 Thread Licinio Fernández Maurelo
Currently we are trying to unmarshal objects from the index (SolrJ bean
tags didn't fully accomplish this in our project due to model
complexity).
It would be nice to set an alias for some fields to match the pojo.property name.
I don't know if there is an alternative (maybe copyField?) to implement
this behaviour

thanks

2009/8/18 Avlesh Singh avl...@gmail.com:
 What could possibly be a use case for such a need?

 Cheers
 Avlesh

 2009/8/18 Licinio Fernández Maurelo licinio.fernan...@gmail.com

 Hello everybody,

 can i set an alias for a field? Something like :

  <field name="sourceDate" type="uniqueIdType" indexed="true"
  stored="true" multiValued="false" termVectors="false"
  alias="source.date"/>

 is there any jira issue related?

 Thx

 --
 Lici





-- 
Lici


Re: How can i get lucene index format version information?

2009-08-18 Thread Licinio Fernández Maurelo
Does nobody know how I can get exactly this info: index format: -9 (UNKNOWN)?

While knowing <str name="lucene-impl-version">2.9-dev 794238 -
2009-07-15 18:05:08</str> helps, I assume it doesn't imply an
index format change.

Am I wrong?

On 11 August 2009 at 11:53, Licinio Fernández
Maurelo licinio.fernan...@gmail.com wrote:
 Thanks all for your responses,

 what I expect to get is the index format version as it appears in
 Luke's overview tab (index format: -9 (UNKNOWN))

 2009/7/31 Jay Hill jayallenh...@gmail.com:
 Check the system request handler: http://localhost:8983/solr/admin/system

 Should look something like this:
 <lst name="lucene">
 <str name="solr-spec-version">1.3.0.2009.07.28.10.39.42</str>
 <str name="solr-impl-version">1.4-dev 797693M - jayhill - 2009-07-28 10:39:42</str>
 <str name="lucene-spec-version">2.9-dev</str>
 <str name="lucene-impl-version">2.9-dev 794238 - 2009-07-15 18:05:08</str>
 </lst>

 -Jay


 On Thu, Jul 30, 2009 at 10:32 AM, Walter Underwood 
 wun...@wunderwood.orgwrote:

 I think the properties page in the admin UI lists the Lucene version, but I
 don't have a live server to check that on at this instant.

 wunder


 On Jul 30, 2009, at 10:26 AM, Chris Hostetter wrote:


 :  i want to get the lucene index format version from solr web app (as

 : the Luke request handler writes it out:
 :
 :    indexInfo.add("version", reader.getVersion());

 that's the index version (as in i have added docs to the index, so the
 version number has changed) the question is about the format version (as
 in: i have upgraded Lucene from 2.1 to 2.3, so the index format has
 changed)

 I'm not sure how Luke gets that ... it's not exposed via a public API on
 an IndexReader.

 Hmm...  SegmentInfos.readCurrentVersion(Directory) seems like it would do
 the trick; but I'm not sure how that would interact with customized
 IndexReader implementations.  I suppose we could always make it non-fatal
 if it throws an exception (just print the exception message in place of the
 number)

 anybody want to submit a patch to add this to the LukeRequestHandler?


 -Hoss







 --
 Lici




-- 
Lici


Re: Aliases for fields

2009-08-18 Thread Licinio Fernández Maurelo
Our purpose is to reuse the data stored in our indexes, serving it to
clients in multiple formats (xml, php, json) directly (no mapper tier
wanted).

As the client model entity names don't match the index field names, we
want to use aliases in some way to adapt the response for the client.

Taking a look at the Solr wiki, I found this:

"This (copyField) is provided as a convenient way to ensure that data
is put into several fields, without needing to include the data in the
update command multiple times"

I want this behaviour in read-only mode (I don't want to
duplicate data)

Thx
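For reference, the copyField mechanism discussed above is declared in
schema.xml roughly like this (the field names are hypothetical); note that
with stored="true" it does duplicate the stored data, which is exactly the
drawback in question:

```xml
<!-- Hypothetical alias-like field populated at index time -->
<field name="source.date" type="string" indexed="true" stored="true"/>
<copyField source="sourceDate" dest="source.date"/>
```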


2009/8/18 Avlesh Singh avl...@gmail.com:

 solr bean tags didn't fully acomplish this issue in our project due to
 model complexity

 Did you try annotating your pojo in this manner?
  @Field("index_field_name")
 pojoPropertyName;
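As a sketch of the suggestion above, assuming SolrJ's
org.apache.solr.client.solrj.beans.Field annotation, the POJO property name
can differ from the index field name (the names here are illustrative):

```java
import org.apache.solr.client.solrj.beans.Field;

public class Source {
    // The index field "sourceDate" is bound to the differently named
    // POJO property "date" via the annotation's value:
    @Field("sourceDate")
    private String date;

    public String getDate() { return date; }
}
```

A queryResponse.getBeans(Source.class) call would then populate date from the
sourceDate field.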

 It will be nice to set an alias for some fields to match the pojo.property
 name. Don't know if there is an alternative (maybe copyfield?) to implement
 this beahaviour

 Though I am not sure what you want to achieve, yet a copyField is very
 similar to what you are asking for.

 Cheers
 Avlesh

 2009/8/18 Licinio Fernández Maurelo licinio.fernan...@gmail.com

 Currently we are trying to unmarshall objets from the index (solr bean
 tags didn't fully acomplish this issue in our project due to model
 complexity).
 It will be nice to set an alias for some fields to match the pojo.property
 name.
 Don't know if there is an alternative (maybe copyfield?)  to implement
 this beahaviour

 thanks

 2009/8/18 Avlesh Singh avl...@gmail.com:
  What could possibly be a use case for such a need?
 
  Cheers
  Avlesh
 
  2009/8/18 Licinio Fernández Maurelo licinio.fernan...@gmail.com
 
  Hello everybody,
 
  can i set an alias for a field? Something like :
 
  field name=sourceDate type=uniqueIdType indexed=true
  stored=true multiValued=false termVectors=false
  alias=source.date/
 
  is there any jira issue related?
 
  Thx
 
  --
  Lici
 
 



 --
 Lici





-- 
Lici


Index health checking

2009-08-18 Thread Licinio Fernández Maurelo
As you might guess, I'm asking whether Solr currently implements this
functionality, or whether there is any related JIRA issue.

A few days ago, our Solr server suffered an unsafe power shutdown.
After restoring it, we found wrong behaviour (we got a NullPointerException
when applying sort criteria in some queries) due to index corruption.

I think it would be nice if Solr could check whether the index is corrupted
(and fix corrupted indexes too)

Found this functionality at :

 http://lucene.apache.org/java/2_4_0/api/org/apache/lucene/index/CheckIndex.html
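For reference, CheckIndex is a command-line tool run against the index
directory while Solr is stopped (the jar name and path below are
illustrative, and -fix can drop corrupt segments, i.e. lose documents):

```
# back up the index first; run only with Solr stopped
java -cp lucene-core-2.4.0.jar org.apache.lucene.index.CheckIndex /path/to/solr/data/index -fix
```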

Thx
-- 
Lici


Re: How can i get lucene index format version information?

2009-08-11 Thread Licinio Fernández Maurelo
Thanks all for your responses,

what I expect to get is the index format version as it appears in
Luke's overview tab (index format: -9 (UNKNOWN))

2009/7/31 Jay Hill jayallenh...@gmail.com:
 Check the system request handler: http://localhost:8983/solr/admin/system

 Should look something like this:
 <lst name="lucene">
 <str name="solr-spec-version">1.3.0.2009.07.28.10.39.42</str>
 <str name="solr-impl-version">1.4-dev 797693M - jayhill - 2009-07-28 10:39:42</str>
 <str name="lucene-spec-version">2.9-dev</str>
 <str name="lucene-impl-version">2.9-dev 794238 - 2009-07-15 18:05:08</str>
 </lst>

 -Jay


 On Thu, Jul 30, 2009 at 10:32 AM, Walter Underwood 
 wun...@wunderwood.orgwrote:

 I think the properties page in the admin UI lists the Lucene version, but I
 don't have a live server to check that on at this instant.

 wunder


 On Jul 30, 2009, at 10:26 AM, Chris Hostetter wrote:


 :  i want to get the lucene index format version from solr web app (as

 : the Luke request handler writes it out:
 :
 :    indexInfo.add("version", reader.getVersion());

 that's the index version (as in i have added docs to the index, so the
 version number has changed) the question is about the format version (as
 in: i have upgraded Lucene from 2.1 to 2.3, so the index format has
 changed)

 I'm not sure how Luke gets that ... it's not exposed via a public API on
 an IndexReader.

 Hmm...  SegmentInfos.readCurrentVersion(Directory) seems like it would do
 the trick; but I'm not sure how that would interact with customized
 IndexReader implementations.  I suppose we could always make it non-fatal
 if it throws an exception (just print the exception message in place of the
 number)

 anybody want to submit a patch to add this to the LukeRequestHandler?


 -Hoss







-- 
Lici


How can i get lucene index format version information?

2009-07-30 Thread Licinio Fernández Maurelo
 I want to get the Lucene index format version from the Solr web app (as
Luke does). I've tried looking for it in the Luke handler response,
but I haven't found this info.

-- 
Lici


Re: FieldCollapsing: Two response elements returned?

2009-07-29 Thread Licinio Fernández Maurelo
I've applied the latest collapse-field-related patch (patch-3) and it doesn't work.
Does anyone know how I can get only the collapsed response?


29-jul-2009 11:05:21 org.apache.solr.common.SolrException log
GRAVE: java.lang.ClassCastException:
org.apache.solr.handler.component.CollapseComponent cannot be cast to
org.apache.solr.request.SolrRequestHandler
at 
org.apache.solr.core.RequestHandlers.initHandlersFromConfig(RequestHandlers.java:150)
at org.apache.solr.core.SolrCore.init(SolrCore.java:539)
at org.apache.solr.core.CoreContainer.create(CoreContainer.java:381)
at org.apache.solr.core.CoreContainer.load(CoreContainer.java:241)
at 
org.apache.solr.core.CoreContainer$Initializer.initialize(CoreContainer.java:115)
at 
org.apache.solr.servlet.SolrDispatchFilter.init(SolrDispatchFilter.java:83)
at 
org.apache.catalina.core.ApplicationFilterConfig.getFilter(ApplicationFilterConfig.java:275)
at 
org.apache.catalina.core.ApplicationFilterConfig.setFilterDef(ApplicationFilterConfig.java:397)
at 
org.apache.catalina.core.ApplicationFilterConfig.init(ApplicationFilterConfig.java:108)
at 
org.apache.catalina.core.StandardContext.filterStart(StandardContext.java:3800)
at 
org.apache.catalina.core.StandardContext.start(StandardContext.java:4450)
at 
org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:791)
at 
org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:771)
at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:526)
at 
org.apache.catalina.startup.HostConfig.deployDirectory(HostConfig.java:987)
at 
org.apache.catalina.startup.HostConfig.deployDirectories(HostConfig.java:909)
at 
org.apache.catalina.startup.HostConfig.deployApps(HostConfig.java:495)
at org.apache.catalina.startup.HostConfig.start(HostConfig.java:1206)
at 
org.apache.catalina.startup.HostConfig.lifecycleEvent(HostConfig.java:314)
at 
org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:119)
at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1053)
at org.apache.catalina.core.StandardHost.start(StandardHost.java:722)
at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1045)
at 
org.apache.catalina.core.StandardEngine.start(StandardEngine.java:443)
at 
org.apache.catalina.core.StandardService.start(StandardService.java:516)
at 
org.apache.catalina.core.StandardServer.start(StandardServer.java:710)
at org.apache.catalina.startup.Catalina.start(Catalina.java:583)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.catalina.startup.Bootstrap.start(Bootstrap.java:288)
at org.apache.catalina.startup.Bootstrap.main(Bootstrap.java:413)

2009/7/28 Marc Sturlese marc.sturl...@gmail.com:

  That's probably because you are using both the CollapseComponent and the
  QueryComponent. I think the 2 or 3 latest patches allow full replacement of
  the QueryComponent. You should just replace:

  <searchComponent name="query"
  class="org.apache.solr.handler.component.QueryComponent" />
  with:
  <searchComponent name="query"
  class="org.apache.solr.handler.component.CollapseComponent" />

 This will sort your problem and make response times faster.



 Jay Hill wrote:

 I'm doing some testing with field collapsing, and early results look good.
 One thing seems odd to me however. I would expect to get back one block of
 results, but I get two - the first one contains the collapsed results, the
 second one contains the full non-collapsed results:

  <result name="response" numFound="11" start="0"> ... </result>
  <result name="response" numFound="62" start="0"> ... </result>

 This seems somewhat confusing. Is this intended or is this a bug?

 Thanks,
 -Jay



 --
 View this message in context: 
 http://www.nabble.com/FieldCollapsing%3A-Two-response-elements-returned--tp24690426p24693960.html
 Sent from the Solr - User mailing list archive at Nabble.com.





-- 
Lici


Re: FieldCollapsing: Two response elements returned?

2009-07-29 Thread Licinio Fernández Maurelo
My last mail was wrong. Sorry.

On 29 July 2009 at 11:10, Licinio Fernández
Maurelo licinio.fernan...@gmail.com wrote:
 I've applied the latest collapse-field-related patch (patch-3) and it doesn't
 work.
 Does anyone know how I can get only the collapsed response?


 29-jul-2009 11:05:21 org.apache.solr.common.SolrException log
 GRAVE: java.lang.ClassCastException:
 org.apache.solr.handler.component.CollapseComponent cannot be cast to
 org.apache.solr.request.SolrRequestHandler
        at 
 org.apache.solr.core.RequestHandlers.initHandlersFromConfig(RequestHandlers.java:150)
        at org.apache.solr.core.SolrCore.init(SolrCore.java:539)
        at org.apache.solr.core.CoreContainer.create(CoreContainer.java:381)
        at org.apache.solr.core.CoreContainer.load(CoreContainer.java:241)
        at 
 org.apache.solr.core.CoreContainer$Initializer.initialize(CoreContainer.java:115)
        at 
 org.apache.solr.servlet.SolrDispatchFilter.init(SolrDispatchFilter.java:83)
        at 
 org.apache.catalina.core.ApplicationFilterConfig.getFilter(ApplicationFilterConfig.java:275)
        at 
 org.apache.catalina.core.ApplicationFilterConfig.setFilterDef(ApplicationFilterConfig.java:397)
        at 
 org.apache.catalina.core.ApplicationFilterConfig.init(ApplicationFilterConfig.java:108)
        at 
 org.apache.catalina.core.StandardContext.filterStart(StandardContext.java:3800)
        at 
 org.apache.catalina.core.StandardContext.start(StandardContext.java:4450)
        at 
 org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:791)
        at 
 org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:771)
        at 
 org.apache.catalina.core.StandardHost.addChild(StandardHost.java:526)
        at 
 org.apache.catalina.startup.HostConfig.deployDirectory(HostConfig.java:987)
        at 
 org.apache.catalina.startup.HostConfig.deployDirectories(HostConfig.java:909)
        at 
 org.apache.catalina.startup.HostConfig.deployApps(HostConfig.java:495)
        at org.apache.catalina.startup.HostConfig.start(HostConfig.java:1206)
        at 
 org.apache.catalina.startup.HostConfig.lifecycleEvent(HostConfig.java:314)
        at 
 org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:119)
        at 
 org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1053)
        at org.apache.catalina.core.StandardHost.start(StandardHost.java:722)
        at 
 org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1045)
        at 
 org.apache.catalina.core.StandardEngine.start(StandardEngine.java:443)
        at 
 org.apache.catalina.core.StandardService.start(StandardService.java:516)
        at 
 org.apache.catalina.core.StandardServer.start(StandardServer.java:710)
        at org.apache.catalina.startup.Catalina.start(Catalina.java:583)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at 
 sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at 
 sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.catalina.startup.Bootstrap.start(Bootstrap.java:288)
        at org.apache.catalina.startup.Bootstrap.main(Bootstrap.java:413)

 2009/7/28 Marc Sturlese marc.sturl...@gmail.com:

  That's probably because you are using both the CollapseComponent and the
  QueryComponent. I think the 2 or 3 latest patches allow full replacement of
  the QueryComponent. You should just replace:

  <searchComponent name="query"
  class="org.apache.solr.handler.component.QueryComponent" />
  with:
  <searchComponent name="query"
  class="org.apache.solr.handler.component.CollapseComponent" />

 This will sort your problem and make response times faster.



 Jay Hill wrote:

 I'm doing some testing with field collapsing, and early results look good.
 One thing seems odd to me however. I would expect to get back one block of
 results, but I get two - the first one contains the collapsed results, the
 second one contains the full non-collapsed results:

  <result name="response" numFound="11" start="0"> ... </result>
  <result name="response" numFound="62" start="0"> ... </result>

 This seems somewhat confusing. Is this intended or is this a bug?

 Thanks,
 -Jay



 --
 View this message in context: 
 http://www.nabble.com/FieldCollapsing%3A-Two-response-elements-returned--tp24690426p24693960.html
 Sent from the Solr - User mailing list archive at Nabble.com.





 --
 Lici




-- 
Lici


facet.prefix question

2009-07-23 Thread Licinio Fernández Maurelo
I'm trying to do some filtering on the count list retrieved by Solr when
doing a faceting query.

I'm wondering how I can use facet.prefix to get something like this:

Query

facet.field=foo&facet.prefix=A OR B

Response

<lst name="facet_fields">
  <lst name="foo">
    <int name="A">12560</int>
    <int name="A*">5440</int>
    <int name="B**">2357</int>
    .
    .
    .
  </lst>
</lst>



How can I achieve this behaviour?
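As far as I know, facet.prefix accepts a single literal prefix, not an
expression, so one workaround (a sketch, not a confirmed solution) is
facet.query, which returns a count per arbitrary query rather than per term:

```
facet=true&facet.query=foo:A*&facet.query=foo:B*
```

This gives aggregate counts for A* and B*; per-term counts for both prefixes
would still need two requests with different facet.prefix values.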

Best Regards

-- 
Lici


spell checker's collate values

2009-07-07 Thread Licinio Fernández Maurelo
Hi all,
I'm still trying to tune my spellchecker to get the results I expect.
I've created a dictionary, and currently I want a special behaviour
from the spellchecker.
The fact is that when I enter the query 'Fernandox Alonso' I get what
I expect:

<bool name="correctlySpelled">false</bool>
<str name="collation">Fernando Alonso</str>

but when I try 'Fernanda Alonso' it returns

<lst name="spellcheck">
  <lst name="suggestions">
    <bool name="correctlySpelled">true</bool>
  </lst>
</lst>

OK, Fernanda is a correct name, but I want to boost certain values
('Fernando Alonso', 'Michael Jackson')
to be returned as suggestions (as Google does).
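One parameter that may be worth trying (an assumption on my part, not a
confirmed fix) is spellcheck.onlyMorePopular, which asks the component to
return higher-frequency suggestions even when the query terms exist in the
dictionary:

```
q=Fernanda+Alonso&spellcheck=true&spellcheck.onlyMorePopular=true
```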

Any help?

regards
-- 
Lici


Creating spellchecker dictionary from multiple sources

2009-07-02 Thread Licinio Fernández Maurelo
Hello everybody. Dealing with the spellchecker component, I'm wondering if
it's possible to generate my dictionary index based on fields from multiple
indexes, and I also want to know how anyone has solved this problem.

Thx

-- 
Lici


Re: Creating spellchecker dictionary from multiple sources

2009-07-02 Thread Licinio Fernández Maurelo
Thanks for your responses guys,

my problem is that we currently have 11 cores/indexes, some of which contain
fields I want to use for spell checking, and I'm thinking of building an
extra core containing the dictionary index and importing the information
I need from those indexes via DIH.

It should work, I hope

2009/7/2 Erik Hatcher e...@ehatchersolutions.com

 You could configure multiple spellcheckers on different fields, or if you
 want to aggregate several fields into the suggestions, use copyField to pool
 all text to be suggested together into a single field.

Erik


 On Jul 2, 2009, at 7:46 AM, Otis Gospodnetic wrote:


 Hi Lici,

 I don't think the current spellchecker can look at more than one field,
 let alone multiple indices, but you could certainly modify the code and make
 it do that.  Looking at multiple fields of the same index may make more
 sense than looking at multiple indices.

 Otis
 --
 Sematext -- http://sematext.com/ -- Lucene - Solr - Nutch



 - Original Message 

 From: Licinio Fernández Maurelo licinio.fernan...@gmail.com
 To: solr-user@lucene.apache.org
 Sent: Thursday, July 2, 2009 5:36:34 AM
 Subject: Creating spellchecker dictionary from multiple sources

 Hello everybody, dealing with the spell checker component i'm wondering
 if
 it's possible to generate my dictionary index based on multiple indexes
 fields  and also want to  know how anyone has solve this problem.

 Thx

 --
 Lici






-- 
Lici