Re: What are the basic requirements for on-the-fly registration/creation of new Core?

2009-05-20 Thread KK
I downloaded the nightly build and deployed the dist/solr-nightly-build.war
in Tomcat with all other config the same under solr.home. Trying to create a new core
gave me the same error. Do I still need to apply the patch you mentioned
earlier? I'm stuck, please help me out.
I used this to create a new core:
http://localhost:8080/solr/admin/cores?action=CREATE&name=core2&instanceDir=/opt/solr/

My solr home is /opt/solr, which has a conf directory containing the solrconfig
and schema files.

Thanks,
KK

message No system property or default value specified for solr.core.name
org.apache.solr.common.SolrException: No system property or default value specified for solr.core.name
    at org.apache.solr.common.util.DOMUtil.substituteProperty(DOMUtil.java:311)
    at org.apache.solr.common.util.DOMUtil.substituteProperties(DOMUtil.java:264)
    at org.apache.solr.common.util.DOMUtil.substituteProperties(DOMUtil.java:272)
    at org.apache.solr.common.util.DOMUtil.substituteProperties(DOMUtil.java:272)
    at org.apache.solr.core.Config.<init>(Config.java:105)
    at org.apache.solr.core.SolrConfig.<init>(SolrConfig.java:113)
    at org.apache.solr.core.CoreContainer.create(CoreContainer.java:321)
    at org.apache.solr.handler.admin.CoreAdminHandler.handleRequestBody(CoreAdminHandler.java:107)
    at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:131)
    at org.apache.solr.core.SolrCore.execute(SolrCore.java:1204)
    at org.apache.solr.servlet.SolrDispatchFilter.execute(SolrDispatchFilter.java:303)
    at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:232)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:215)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:188)
    at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:213)
    at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:172)
    at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:127)
    at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:117)
    at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:108)
    at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:174)
    at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:875)
    at org.apache.coyote.http11.Http11BaseProtocol$Http11ConnectionHandler.processConnection(Http11BaseProtocol.java:665)
    at org.apache.tomcat.util.net.PoolTcpEndpoint.processSocket(PoolTcpEndpoint.java:528)
    at org.apache.tomcat.util.net.LeaderFollowerWorkerThread.runIt(LeaderFollowerWorkerThread.java:81)
    at org.apache.tomcat.util.threads.ThreadPool$ControlRunnable.run(ThreadPool.java:689)
    at java.lang.Thread.run(Thread.java:619)

description The server encountered an internal error (No system property or
default value specified for solr.core.name), with the same SolrException and
stack trace as above.

Help needed on DataImportHandler to index xml files

2009-05-20 Thread Jianbin Dai

Hi All,
I am new here. Thanks for reading my question.
I want to use DataImportHandler to index my tons of XML files (7 GB total)
stored on my local disk. My data-config.xml is attached below. It works fine
with one file (abc.xml), but how can I index all the XML files at once? Thanks!


<dataConfig>
    <dataSource type="FileDataSource" />
    <document>
        <entity name="example"
                url="/root/abc.xml"
                processor="XPathEntityProcessor"
                forEach="/ShopzillaQueryResponse/product"
                transformer="DateFormatTransformer">

            <field column="id"            xpath="/ShopzillaQueryResponse/product/id" />
            <field column="name"          xpath="/ShopzillaQueryResponse/product/name" />
            <field column="sku"           xpath="/ShopzillaQueryResponse/product/sku" />
            <field column="mydescription" xpath="/ShopzillaQueryResponse/product/desc_short" />
            <field column="price"         xpath="/ShopzillaQueryResponse/product/merchantListing/merchantProduct/price" />

        </entity>
    </document>
</dataConfig>



  


Re: What are the basic requirements for on-the-fly registration/creation of new Core?

2009-05-20 Thread KK
Please do so ASAP. I'm stuck just because of that. If not, I might have
to move to Lucene. I guess Lucene has support for on-the-fly creation of new
indexes. Any idea on this?

Thanks,
KK.

2009/5/20 Noble Paul നോബിള്‍ नोब्ळ् noble.p...@corp.aol.com

 hi KK,
 the fix got removed in a subsequent refactoring I guess. I shall check
 in soon and you can get it from the trunk.

 On Wed, May 20, 2009 at 11:29 AM, KK dioxide.softw...@gmail.com wrote:
  I downloaded the nightly build and deployed the
 dist/solr-nightly-build.war
  in tomcat with all other config same @solr.home. Trying to create a new
 core
  gave me the same error. Do I still need to apply the patch you mentioned
  earlier. I'm stuck, please help me out.
  I used this to create a new core,
 
 http://localhost:8080/solr/admin/cores?action=CREATEname=core2instanceDir=/opt/solr/
 
  my solr home is /opt/solr which has a conf directory containing
 solrconfig
  and schema files.
 
  Thanks,
  KK
 
  message No system property or default value specified for solr.core.name
  org.apache.solr.common.SolrException: No system property or default value
  specified for solr.core.name

Re: What are the basic requirements for on-the-fly registration/creation of new Core?

2009-05-20 Thread KK
I downloaded the nightly build. How do I get the trunk, and where do I get it
from? I've never used a trunk checkout before. Please help me out. Tell me the
detailed steps for all this.

Thanks,
KK.

2009/5/20 Noble Paul നോബിള്‍ नोब्ळ् noble.p...@corp.aol.com

 if you are using a trunk version you can still use the dataDir
 attribute while creating the core. so you are not really stuck

 I have committed the fix

 On Wed, May 20, 2009 at 11:45 AM, KK dioxide.softw...@gmail.com wrote:
  Please do so ASAP. I'm stuck just becaue of that. If not then I might
 have
  to move to lucene. I guess lucene has support for on the fly creation of
 new
  indexes. Any idea on this?
 
  Thanks,
  KK.
 
  2009/5/20 Noble Paul നോബിള്‍ नोब्ळ् noble.p...@corp.aol.com
 
  hi KK,
  the fix got removed in a subsequent refactoring I guess. I shall check
  in soon and you can get it from the trunk.
 
  On Wed, May 20, 2009 at 11:29 AM, KK dioxide.softw...@gmail.com
 wrote:
   I downloaded the nightly build and deployed the
   dist/solr-nightly-build.war
   in tomcat with all other config same @solr.home. Trying to create a
 new
   core
   gave me the same error. Do I still need to apply the patch you
 mentioned
   earlier. I'm stuck, please help me out.
   I used this to create a new core,
  
  
 http://localhost:8080/solr/admin/cores?action=CREATEname=core2instanceDir=/opt/solr/
  
   my solr home is /opt/solr which has a conf directory containing
   solrconfig
   and schema files.
  
   Thanks,
   KK
  
   message No system property or default value specified for solr.core.name
   org.apache.solr.common.SolrException: No system property or default value
   specified for solr.core.name

query clause and filter query

2009-05-20 Thread Ashish P

what is the difference between query clause and filter query??
Thanks,
Ashish
-- 
View this message in context: 
http://www.nabble.com/query-clause-and-filter-query-tp23629715p23629715.html
Sent from the Solr - User mailing list archive at Nabble.com.



Re: Help needed on DataImportHandler to index xml files

2009-05-20 Thread Noble Paul നോബിള്‍ नोब्ळ्
there is an example given for the same

http://wiki.apache.org/solr/DataImportHandler#head-8edbc7e588d97068aa0a61ed2e3e8e61b43debce
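
Roughly, you nest your XPathEntityProcessor entity inside a FileListEntityProcessor
entity, along these lines (baseDir, fileName and the field list are placeholders
for your own setup, so double-check against the wiki page):

<dataConfig>
  <dataSource type="FileDataSource" />
  <document>
    <entity name="f" processor="FileListEntityProcessor"
            baseDir="/root/xmlfiles" fileName=".*\.xml"
            rootEntity="false" dataSource="null">
      <entity name="example" processor="XPathEntityProcessor"
              url="${f.fileAbsolutePath}"
              forEach="/ShopzillaQueryResponse/product"
              transformer="DateFormatTransformer">
        <field column="id" xpath="/ShopzillaQueryResponse/product/id" />
        <!-- ... the rest of your field mappings as before ... -->
      </entity>
    </entity>
  </document>
</dataConfig>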



On Wed, May 20, 2009 at 11:35 AM, Jianbin Dai djian...@yahoo.com wrote:

 Hi All,
 I am new here. Thanks for reading my question.
 I want to use DataImportHandler to index my tons of xml files (7GB total) 
 stored in my local disk. My data-config.xml is attached below. It works fine 
 with one file (abc.xml), but how can I index all xml files at one time? 
 Thanks!


 <dataConfig>
        <dataSource type="FileDataSource" />
        <document>
                <entity name="example"
                        url="/root/abc.xml"
                        processor="XPathEntityProcessor"
                        forEach="/ShopzillaQueryResponse/product"
                        transformer="DateFormatTransformer">

                        <field column="id"            xpath="/ShopzillaQueryResponse/product/id" />
                        <field column="name"          xpath="/ShopzillaQueryResponse/product/name" />
                        <field column="sku"           xpath="/ShopzillaQueryResponse/product/sku" />
                        <field column="mydescription" xpath="/ShopzillaQueryResponse/product/desc_short" />
                        <field column="price"         xpath="/ShopzillaQueryResponse/product/merchantListing/merchantProduct/price" />

                </entity>
        </document>
 </dataConfig>








-- 
-
Noble Paul | Principal Engineer| AOL | http://aol.com


Re: Solr statistics of top searches and results returned

2009-05-20 Thread Shalin Shekhar Mangar
On Tue, May 19, 2009 at 11:50 PM, solrpowr solrp...@hotmail.com wrote:


 Besides my own offline processing via logs, does solr have the
 functionality
 to give me statistics such as top searches, how many results were returned
 on these searches, and/or how long it took to get these results on average.


You can see the statistics page (see the /select section) which will tell
you average queries per second and average time per query.

There's no option to show top searches as of now.

-- 
Regards,
Shalin Shekhar Mangar.


Re: What are the basic requirements for on-the-fly registration/creation of new Core?

2009-05-20 Thread KK
Do I have to check out this:
http://svn.apache.org/repos/asf/lucene/solr/trunk/
using this command?
svn checkout http://svn.apache.org/repos/asf/lucene/solr/trunk/
I don't have much idea about svn checkout from a public repo; please give me some
specific pointers on this.

And then fire ant to build a fresh Solr? Will that create the war to be
deployed, or do I have to create the war file for deployment? Please give me
proper directions.

Thanks,
KK.

2009/5/20 Noble Paul നോബിള്‍ नोब्ळ् noble.p...@corp.aol.com

 you will have to wait another 3 hours for the next nightly build to happen.

 you can of course check out the trunk and build one your own

 see this

 http://lucene.apache.org/solr/version_control.html#Anonymous+Access+(read-only)

 now you may need ant to build this.
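  for example (from memory, so double-check the target name): "svn checkout
  http://svn.apache.org/repos/asf/lucene/solr/trunk solr-trunk", then run
  "ant dist" inside the checkout; the war should end up under dist/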

 On Wed, May 20, 2009 at 11:53 AM, KK dioxide.softw...@gmail.com wrote:
  I downloaded the nightly build . How do I get the trunk and where to get
 it
  from? I've never used trunks earlier . Please help me out. Tell me the
  detailed steps for all this.
 
  Thanks,
  KK.
 
  2009/5/20 Noble Paul നോബിള്‍ नोब्ळ् noble.p...@corp.aol.com
 
  if you are using a trunk version you can still use the dataDir
  attribute while creating the core. so you are not really stuck
 
  I have committed the fix
 
  On Wed, May 20, 2009 at 11:45 AM, KK dioxide.softw...@gmail.com
 wrote:
   Please do so ASAP. I'm stuck just becaue of that. If not then I might
   have
   to move to lucene. I guess lucene has support for on the fly creation
 of
   new
   indexes. Any idea on this?
  
   Thanks,
   KK.
  
   2009/5/20 Noble Paul നോബിള്‍ नोब्ळ् noble.p...@corp.aol.com
  
   hi KK,
   the fix got removed in a subsequent refactoring I guess. I shall
 check
   in soon and you can get it from the trunk.
  
   On Wed, May 20, 2009 at 11:29 AM, KK dioxide.softw...@gmail.com
   wrote:
I downloaded the nightly build and deployed the
dist/solr-nightly-build.war
in tomcat with all other config same @solr.home. Trying to create a
new
core
gave me the same error. Do I still need to apply the patch you
mentioned
earlier. I'm stuck, please help me out.
I used this to create a new core,
   
   
   
 http://localhost:8080/solr/admin/cores?action=CREATEname=core2instanceDir=/opt/solr/
   
my solr home is /opt/solr which has a conf directory containing
solrconfig
and schema files.
   
Thanks,
KK
   
 message No system property or default value specified for solr.core.name
 org.apache.solr.common.SolrException: No system property or default value
 specified for solr.core.name

Re: What are the basic requirements for on-the-fly registration/creation of new Core?

2009-05-20 Thread KK
OK then, I assume that the nightly build will solve my basic problem of on-the-fly
creation of new cores using dataDir as a request parameter, so I can wait
for two more hours.
One more thing: the new nightly build will be called solr-2009-05-20.tgz,
right, as the current one is solr-2009-05-19.tgz?

Thanks,
KK.

2009/5/20 Noble Paul നോബിള്‍ नोब्ळ् noble.p...@corp.aol.com

 if you are not familiar w/ svn and ant I suggest wait till the nightly
 build happens it is just 2 hrs away


 On Wed, May 20, 2009 at 12:13 PM, KK dioxide.softw...@gmail.com wrote:
  Do I've to check out this
  http://svn.apache.org/repos/asf/lucene/solr/trunk/
  using this command
  svn checkout http://svn.apache.org/repos/asf/lucene/solr/trunk/
  ??? not much idea about svn chekout from public repo. give me some
 specific
  pointers on this.
 
  and then fire ant to build a fresh solr? Will that create the war to be
  deployed or I've to create the war file for deployment? please give me
  proper directions.
 
  Thanks,
  KK.
 
  2009/5/20 Noble Paul നോബിള്‍ नोब्ळ् noble.p...@corp.aol.com
 
  you will have to wait another 3 hours for the next nightly build to
  happen.
 
  you can of course check out the trunk and build one your own
 
  see this
 
 
  http://lucene.apache.org/solr/version_control.html#Anonymous+Access+(read-only)
 
  now you may need ant to build this.
 
  On Wed, May 20, 2009 at 11:53 AM, KK dioxide.softw...@gmail.com
 wrote:
   I downloaded the nightly build . How do I get the trunk and where to
 get
   it
   from? I've never used trunks earlier . Please help me out. Tell me the
   detailed steps for all this.
  
   Thanks,
   KK.
  
   2009/5/20 Noble Paul നോബിള്‍ नोब्ळ् noble.p...@corp.aol.com
  
   if you are using a trunk version you can still use the dataDir
   attribute while creating the core. so you are not really stuck
  
   I have committed the fix
  
   On Wed, May 20, 2009 at 11:45 AM, KK dioxide.softw...@gmail.com
   wrote:
Please do so ASAP. I'm stuck just becaue of that. If not then I
 might
have
to move to lucene. I guess lucene has support for on the fly
 creation
of
new
indexes. Any idea on this?
   
Thanks,
KK.
   
2009/5/20 Noble Paul നോബിള്‍ नोब्ळ् noble.p...@corp.aol.com
   
hi KK,
the fix got removed in a subsequent refactoring I guess. I shall
check
in soon and you can get it from the trunk.
   
On Wed, May 20, 2009 at 11:29 AM, KK dioxide.softw...@gmail.com
wrote:
 I downloaded the nightly build and deployed the
 dist/solr-nightly-build.war
 in tomcat with all other config same @solr.home. Trying to
 create
 a
 new
 core
 gave me the same error. Do I still need to apply the patch you
 mentioned
 earlier. I'm stuck, please help me out.
 I used this to create a new core,




 http://localhost:8080/solr/admin/cores?action=CREATEname=core2instanceDir=/opt/solr/

 my solr home is /opt/solr which has a conf directory containing
 solrconfig
 and schema files.

 Thanks,
 KK

  message No system property or default value specified for solr.core.name
  org.apache.solr.common.SolrException: No system property or default value
  specified for solr.core.name

best way to cache base queries (before application of filters)

2009-05-20 Thread Kent Fitch
Hi,  I'm looking for some advice on how to add base query caching to SOLR.

Our use-case for SOLR is:

- a large Lucene index (32M docs, doubling in 6 months, 110GB increasing x 8
in 6 months)
- a frontend which presents views of this data in 5 categories by firing
off 5 queries with the same search term but 5 different fq values

For example, an originating query for sydney harbour generates 5 SOLR
queries:

- ../search?q=complicated expansion of sydney harbour&fq=category:books
- ../search?q=complicated expansion of sydney harbour&fq=category:maps
- ../search?q=complicated expansion of sydney harbour&fq=category:music
etc

The complicated expansion requires sloppy phrase matches, and the large
database with lots of very large documents means that some queries take
quite some time (tens to several hundreds of ms), so we'd like to cache the
results of the base query for a short time (long enough for all related
queries to be issued).

It looks like this isn't the use case for queryResultCache, because its key
is calculated in SolrIndexSearcher like this:

key = new QueryResultKey(cmd.getQuery(), cmd.getFilterList(), cmd.getSort(),
cmd.getFlags());

That is, the filters are part of the key, and the cached result reflects
the application of the filters; this works great for what it is probably
designed for - supporting paging through results.
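
For comparison, what I'm imagining is a key built from the query (and sort)
only, plus a very short-lived map of base results; roughly this sketch (class
and method names are purely illustrative, not existing Solr API):

import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Illustrative only: a tiny time-limited cache of base (unfiltered) results,
// keyed on query + sort and ignoring any fq filters.
class BaseQueryCache<V> {
    private static final long TTL_MS = 5000; // long enough for the 5 related requests

    private static class Entry<V> {
        final V value;
        final long created = System.currentTimeMillis();
        Entry(V value) { this.value = value; }
    }

    private final Map<String, Entry<V>> map = new ConcurrentHashMap<String, Entry<V>>();

    V get(String queryAndSort) {
        Entry<V> e = map.get(queryAndSort);
        if (e == null || System.currentTimeMillis() - e.created > TTL_MS) {
            map.remove(queryAndSort);  // expire stale entries lazily
            return null;
        }
        return e.value;
    }

    void put(String queryAndSort, V value) {
        map.put(queryAndSort, new Entry<V>(value));
    }
}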

So, I think our options are:

- create a new queryComponent that invokes SolrIndexSearcher differently,
and which has its own (short lived but long entry length) cache of the base
query results

- subclass or change SolrIndexSearcher, perhaps making it pluggable,
perhaps defining an optional new cache of base query results

- create a subclass of the Lucene IndexSearcher which manages a cache of
query results hidden from SolrIndexSearcher (and organise somehow for
SolrIndexSearcher to use that subclass)

Or perhaps I'm taking the wrong approach to this problem entirely!  Any
advice is greatly appreciated.

Kent Fitch


RE: Solr statistics of top searches and results returned

2009-05-20 Thread Plaatje, Patrick
Hi,

At the moment Solr does not have such functionality. I have written a plugin 
for Solr though which uses a second Solr core to store/index the searches. If 
you're interested, send me an email and I'll get you the source for the plugin.

Regards,

Patrick

-Original Message-
From: solrpowr [mailto:solrp...@hotmail.com] 
Sent: dinsdag 19 mei 2009 20:21
To: solr-user@lucene.apache.org
Subject: Solr statistics of top searches and results returned


Hi,

Besides my own offline processing via logs, does solr have the functionality to 
give me statistics such as top searches, how many results were returned on 
these searches, and/or how long it took to get these results on average.


Thanks,
Bob
--
View this message in context: 
http://www.nabble.com/Solr-statistics-of-top-searches-and-results-returned-tp23621779p23621779.html
Sent from the Solr - User mailing list archive at Nabble.com.



Re: How to change the weight of the fields ?

2009-05-20 Thread Vincent Pérès

Hello,

I'm sorry, I made a mistake; I meant:
http://localhost:8983/solr/listings/select/?q=novel&qf=title_s^5.0&fl=title_s+isbn_s&version=2.2&start=0&rows=5&indent=on&debugQuery=on
(using qf (Query Fields))

But it seems I need to use the dismax handler as well, and configure it by
default in the Solr config?
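
Something like this in solrconfig.xml is what I have in mind (written from
memory of the example config, so treat the details as a guess), and then
qt=dismax on the request should pick it up:

<requestHandler name="dismax" class="solr.SearchHandler">
  <lst name="defaults">
    <str name="defType">dismax</str>
    <str name="qf">title_s^5.0 text</str>
  </lst>
</requestHandler>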

Thanks for your answer !

Vincent
-- 
View this message in context: 
http://www.nabble.com/How-to-change-the-weight-of-the-fields---tp23619971p23631022.html
Sent from the Solr - User mailing list archive at Nabble.com.



Re: query clause and filter query

2009-05-20 Thread Andrey Klochkov
Read the "Consider using filters" section here:

http://wiki.apache.org/lucene-java/ImproveSearchingSpeed
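
In Solr terms, roughly (field name and values are just for illustration):

q=ipod&fq=category:music     - category:music restricts the results but does not
                               affect scoring, and the filter is cached on its own
                               in the filterCache so it can be reused across queries
q=ipod AND category:music    - category:music is a scored clause of the main query,
                               cached only as part of that whole query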

On Wed, May 20, 2009 at 10:24 AM, Ashish P ashish.ping...@gmail.com wrote:


 what is the difference between query clause and filter query??
 Thanks,
 Ashish
 --
 View this message in context:
 http://www.nabble.com/query-clause-and-filter-query-tp23629715p23629715.html
 Sent from the Solr - User mailing list archive at Nabble.com.




-- 
Andrew Klochkov


Re: How to retrieve all available Cores in a static way ?

2009-05-20 Thread Andrey Klochkov
AFAIK there's no way of getting it in a static way. If you look into
SolrDispatchFilter.java, you'll see these lines:

// put the core container in request attribute
req.setAttribute("org.apache.solr.CoreContainer", cores);

So later in your servlet you can get this request attribute; I do it this
way.
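
A minimal sketch of that, assuming your servlet runs in the same webapp and the
request has already passed through SolrDispatchFilter (the class name here is
made up):

import java.io.IOException;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import org.apache.solr.core.CoreContainer;
import org.apache.solr.core.SolrCore;

public class ListCoresServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        // SolrDispatchFilter stores the container under this attribute name
        CoreContainer cores =
                (CoreContainer) req.getAttribute("org.apache.solr.CoreContainer");
        if (cores == null) {
            resp.sendError(500, "CoreContainer not found in request");
            return;
        }
        // list the name of every registered core
        for (SolrCore core : cores.getCores()) {
            resp.getWriter().println(core.getName());
        }
    }
}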

On Tue, May 19, 2009 at 7:21 PM, Giovanni De Stefano 
giovanni.destef...@gmail.com wrote:

 Hello all,

 I have a quick question but I cannot find a quick answer :-)

 I have a Java client running on the same JVM where Solr is running.

 The Solr I have is a multicore.

 How can I retrieve from the Java client the different cores available?

 I tried with:

 ...
 CoreContainer container = new CoreContainer();
 Collection<SolrCore> cores = container.getCores();
 ...

 but I get nothing useful... :-(

 Is there any static method that lets me get this collection?

 Thanks a lot!

 Giovanni




-- 
Andrew Klochkov


Re: Solr statistics of top searches and results returned

2009-05-20 Thread Shalin Shekhar Mangar
On Wed, May 20, 2009 at 1:31 PM, Plaatje, Patrick 
patrick.plaa...@getronics.com wrote:


 At the moment Solr does not have such functionality. I have written a
 plugin for Solr though which uses a second Solr core to store/index the
 searches. If you're interested, send me an email and I'll get you the source
 for the plugin.


Patrick, this will be a useful addition. However instead of doing this with
another core, we can keep running statistics which can be shown on the
statistics page itself. What do you think?

A related approach for showing slow queries was discussed recently. There's
an issue open which has more details:

https://issues.apache.org/jira/browse/SOLR-1101

-- 
Regards,
Shalin Shekhar Mangar.


Re: How to change the weight of the fields ?

2009-05-20 Thread Vincent Pérès

I tried the following request after changing to dismax:
http://localhost:8983/solr/listings/select/?q=novel&qt=dismax&qf=title_s^2.0&fl=title_s+isbn_s&version=2.2&start=0&rows=5&indent=on&debugQuery=on

But I don't get any results:

<lst name="debug">
<str name="rawquerystring">novel</str>
<str name="querystring">novel</str>
<str name="parsedquery">+DisjunctionMaxQuery((title_s:novel^2.0)~0.01) ()</str>
<str name="parsedquery_toString">+(title_s:novel^2.0)~0.01 ()</str>
<lst name="explain"/>
<str name="QParser">DismaxQParser</str>
<null name="altquerystring"/>
<arr name="boostfuncs">
<str>
 </str>
</arr>
<lst name="timing">
<double name="time">3.0</double>
<lst name="prepare">
<double name="time">1.0</double>
<lst name="org.apache.solr.handler.component.QueryComponent">
<double name="time">1.0</double>
</lst>
etc.


Do I need to add some specific attributes to my string fields?
It looks like this now:
<dynamicField name="*_s" type="string" indexed="true" stored="true"
omitNorms="false"/>
and below:
<copyField source="*_s" dest="text"/>

Thanks !!
Vincent
-- 
View this message in context: 
http://www.nabble.com/How-to-change-the-weight-of-the-fields---tp23619971p23631247.html
Sent from the Solr - User mailing list archive at Nabble.com.



Re: java.lang.RuntimeException: after flush: fdx size mismatch

2009-05-20 Thread Michael McCandless
Hmm... somehow Lucene is flushing a new segment on closing the
IndexWriter, and thinks 1 doc had been added to the stored fields
file, yet the fdx file is the wrong size (0 bytes). This check (and
exception) is designed to prevent corruption from entering the index,
so it's at least good to see that CheckIndex passes after this.

I don't think you're hitting LUCENE-1521: that issue only happens if a
single segment has more than ~268 million docs.

Which exact JRE version are you using?

When you hit this exception, is it always 1 docs vs 0 length in bytes?

Mike

On Wed, May 20, 2009 at 3:19 AM, James X
hello.nigerian.spamm...@gmail.com wrote:
 Hello all, I'm running Solr 1.3 in a multi-core environment. There are up to
 2000 active cores in each Solr webapp instance at any given time.

 I've noticed occasional errors such as:
 SEVERE: java.lang.RuntimeException: after flush: fdx size mismatch: 1 docs
 vs 0 length in bytes of _h.fdx
        at
 org.apache.lucene.index.StoredFieldsWriter.closeDocStore(StoredFieldsWriter.java:94)
        at
 org.apache.lucene.index.DocFieldConsumers.closeDocStore(DocFieldConsumers.java:83)
        at
 org.apache.lucene.index.DocFieldProcessor.closeDocStore(DocFieldProcessor.java:47)
        at
 org.apache.lucene.index.DocumentsWriter.closeDocStore(DocumentsWriter.java:367)
        at
 org.apache.lucene.index.DocumentsWriter.flush(DocumentsWriter.java:567)
        at
 org.apache.lucene.index.IndexWriter.doFlush(IndexWriter.java:3540)
        at org.apache.lucene.index.IndexWriter.flush(IndexWriter.java:3450)
        at
 org.apache.lucene.index.IndexWriter.closeInternal(IndexWriter.java:1638)
        at org.apache.lucene.index.IndexWriter.close(IndexWriter.java:1602)
        at org.apache.lucene.index.IndexWriter.close(IndexWriter.java:1578)
        at
 org.apache.solr.update.SolrIndexWriter.close(SolrIndexWriter.java:153)

 during commit / optimise operations.

 These errors then cause cascading errors during updates on the offending
 cores:
 SEVERE: org.apache.lucene.store.LockObtainFailedException: Lock obtain timed
 out: SingleInstanceLock: write.lock
        at org.apache.lucene.store.Lock.obtain(Lock.java:85)
        at org.apache.lucene.index.IndexWriter.init(IndexWriter.java:1070)
        at org.apache.lucene.index.IndexWriter.init(IndexWriter.java:924)
        at
 org.apache.solr.update.SolrIndexWriter.init(SolrIndexWriter.java:116)
        at
 org.apache.solr.update.UpdateHandler.createMainIndexWriter(UpdateHandler.java:122)

 This looks like http://issues.apache.org/jira/browse/LUCENE-1521, but when I
 upgraded Lucene to 2.4.1 under Solr 1.3, the issue still remains.

 CheckIndex doesn't find any problems with the index, and problems disappear
 after an (inconvenient, for me) restart of Solr.

 Firstly, as the symptoms are so close to those in LUCENE-1521, can I check that my
 Lucene upgrade method should work:
 - unzip the Solr 1.3 war
 - remove the Lucene 2.4dev jars
 (lucene-core, lucene-spellchecker, lucene-snowball, lucene-queries,
 lucene-memory,lucene-highlighter, lucene-analyzers)
 - move in the Lucene 2.4.1 jars
 - rezip the directory structures as solr.war.

 I think this has worked, as solr/default/admin/registry.jsp shows:
  <lucene-spec-version>2.4.1</lucene-spec-version>
  <lucene-impl-version>2.4.1 750176 - 2009-03-04 21:56:52</lucene-impl-version>

 Secondly, if this Lucene fix isn't the right solution to this problem, can
 anyone suggest an alternative approach? The only problem I've had up to now
 is to do with the number of allowed file handles, which was fixed by
 changing limits.conf (RHEL machine).

 Many thanks!
 James



Re: What are the basic requirements for on-the-fly registration/creation of new Core?

2009-05-20 Thread KK
Hi Noble,
I downloaded the latest nightly build (20th May) and deployed the
nightly-build.war and tried to run the same thing for creating a new core,
and the bad news is that it didn't work. I tried this:
<dataDir>/opt/solr/data/${core.name}</dataDir> and also
<dataDir>/opt/solr/data/${solr.core.name}</dataDir>, but it gave me the same
error saying

HTTP Status 500 - No system property or default value specified for
core.name org.apache.solr.common.SolrException: No system property or
default value specified for core.name at
org.apache.solr.common.util.DOMUtil.substituteProperty(DOMUtil.java:311) at
...

Can you please help me out? The config files are the old ones that were
there, solrconfig.xml and schema.xml, with an entry for the data directory as
you've told me earlier. Do I have to pick up the new solrconfig files? I'm still
stuck...

Thanks,
KK.

On Wed, May 20, 2009 at 12:59 PM, KK dioxide.softw...@gmail.com wrote:

 OK then , I assume that nightly build will solve my basic problem of On
 the fly creation of new cores using dataDir as req parameter, then I can
 wait for two more hours.
 One more thing the new nightly build willl be called solr-2009-05-20.tgz,
 right as teh current one  is solr-2009-05-19.tgz, right?


 Thanks,
 KK.

 2009/5/20 Noble Paul നോബിള്‍ नोब्ळ् noble.p...@corp.aol.com

 if you are not familiar w/ svn and ant I suggest wait till the nightly
 build happens it is just 2 hrs away


 On Wed, May 20, 2009 at 12:13 PM, KK dioxide.softw...@gmail.com wrote:
  Do I've to check out this
  http://svn.apache.org/repos/asf/lucene/solr/trunk/
  using this command
  svn checkout http://svn.apache.org/repos/asf/lucene/solr/trunk/
  ??? not much idea about svn chekout from public repo. give me some
 specific
  pointers on this.
 
  and then fire ant to build a fresh solr? Will that create the war to be
  deployed or I've to create the war file for deployment? please give me
  proper directions.
 
  Thanks,
  KK.
 
  2009/5/20 Noble Paul നോബിള്‍ नोब्ळ् noble.p...@corp.aol.com
 
  you will have to wait another 3 hours for the next nightly build to
  happen.
 
  you can of course check out the trunk and build one your own
 
  see this
 
 
  http://lucene.apache.org/solr/version_control.html#Anonymous+Access+(read-only)
 
  now you may need ant to build this.
 
  On Wed, May 20, 2009 at 11:53 AM, KK dioxide.softw...@gmail.com
 wrote:
   I downloaded the nightly build . How do I get the trunk and where to
 get
   it
   from? I've never used trunks earlier . Please help me out. Tell me
 the
   detailed steps for all this.
  
   Thanks,
   KK.
  
   2009/5/20 Noble Paul നോബിള്‍ नोब्ळ् noble.p...@corp.aol.com
  
   if you are using a trunk version you can still use the dataDir
   attribute while creating the core. so you are not really stuck
  
   I have committed the fix
  
   On Wed, May 20, 2009 at 11:45 AM, KK dioxide.softw...@gmail.com
   wrote:
Please do so ASAP. I'm stuck just becaue of that. If not then I
 might
have
to move to lucene. I guess lucene has support for on the fly
 creation
of
new
indexes. Any idea on this?
   
Thanks,
KK.
   
2009/5/20 Noble Paul നോബിള്‍ नोब्ळ् noble.p...@corp.aol.com
   
hi KK,
the fix got removed in a subsequent refactoring I guess. I shall
check
in soon and you can get it from the trunk.
   
On Wed, May 20, 2009 at 11:29 AM, KK dioxide.softw...@gmail.com
 
wrote:
 I downloaded the nightly build and deployed the
 dist/solr-nightly-build.war
 in tomcat with all other config same @solr.home. Trying to
 create
 a
 new
 core
 gave me the same error. Do I still need to apply the patch you
 mentioned
 earlier. I'm stuck, please help me out.
 I used this to create a new core,




 http://localhost:8080/solr/admin/cores?action=CREATEname=core2instanceDir=/opt/solr/

 my solr home is /opt/solr which has a conf directory containing
 solrconfig
 and schema files.

 Thanks,
 KK

  message No system property or default value specified for solr.core.name
  org.apache.solr.common.SolrException: No system property or default value
  specified for solr.core.name

Optimize

2009-05-20 Thread Gargate, Siddharth
Hi all,

I am not sure how to call optimize on the existing
index. I tried with the following URL:

http://localhost:9090/solr/update?optimize=true

 

With this request, the response took a long time, and the index folder
size doubled. Then I queried the same URL again, the index size reduced
to the actual size, and I got the response immediately.

Is this expected behavior? Is there any other way to call optimize?

 

Thanks,

Siddharth



Re: What are the basic requirements for on-the-fly registration/creation of new Core?

2009-05-20 Thread KK
Thanks for the response.
That means I have to have the directory before I pass it to Solr; it's not
going to create it by itself? Or will just passing the name make it create a
new directory? Do I have to give the full path? It seems I won't be able to register
a new core on the fly by just passing the name. Please do confirm.

Thanks,
KK.

2009/5/20 Noble Paul നോബിള്‍ नोब्ळ् noble.p...@corp.aol.com

 hi KK,
 instead of keeping the dataDir in the solrconfig.xml you can pass it
 as a parameter while creating the core.

 you can use that for the moment, and we can test the other one and let you
 know
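 for example, something along these lines (the dataDir request parameter on the
 CREATE call; exact behaviour may differ on your build):
 http://localhost:8080/solr/admin/cores?action=CREATE&name=core2&instanceDir=/opt/solr/&dataDir=/opt/solr/data/core2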


 On Wed, May 20, 2009 at 3:10 PM, KK dioxide.softw...@gmail.com wrote:
  Hi Noble,
  I downloaded the latest nightly build(20th may) and deployed the
  nightly-build.war and tried to run the same thing for creating a new core
  and the bad news is that it didn't work. I tried  this
  <dataDir>/opt/solr/data/${core.name}</dataDir> also
  <dataDir>/opt/solr/data/${solr.core.name}</dataDir> but it gave me the
 same
  error saying
 
  HTTP Status 500 - No system property or default value specified for
  core.name org.apache.solr.common.SolrException: No system property or
  default value specified for core.name at
  org.apache.solr.common.util.DOMUtil.substituteProperty(DOMUtil.java:311)
 at
  ...
 
  Can you please help me out. The config files are the old ones that were
  there solrconfig.xml and schema.xml withe an entry for data directory as
  you've told me earlier. Do I've to pick the new solrconfig files. I'm
 still
  stuck...
 
  Thanks,
  KK.
 
  On Wed, May 20, 2009 at 12:59 PM, KK dioxide.softw...@gmail.com wrote:
 
  OK then , I assume that nightly build will solve my basic problem of On
  the fly creation of new cores using dataDir as req parameter, then I
 can
  wait for two more hours.
  One more thing the new nightly build willl be called
 solr-2009-05-20.tgz,
  right as teh current one  is solr-2009-05-19.tgz, right?
 
  Thanks,
  KK.
 
  2009/5/20 Noble Paul നോബിള്‍ नोब्ळ् noble.p...@corp.aol.com
 
  if you are not familiar w/ svn and ant I suggest wait till the nightly
  build happens it is just 2 hrs away
 
 
  On Wed, May 20, 2009 at 12:13 PM, KK dioxide.softw...@gmail.com
 wrote:
   Do I've to check out this
   http://svn.apache.org/repos/asf/lucene/solr/trunk/
   using this command
   svn checkout http://svn.apache.org/repos/asf/lucene/solr/trunk/
   ??? not much idea about svn chekout from public repo. give me some
   specific
   pointers on this.
  
   and then fire ant to build a fresh solr? Will that create the war to
 be
   deployed or I've to create the war file for deployment? please give
 me
   proper directions.
  
   Thanks,
   KK.
  
   2009/5/20 Noble Paul നോബിള്‍ नोब्ळ् noble.p...@corp.aol.com
  
   you will have to wait another 3 hours for the next nightly build to
   happen.
  
   you can of course check out the trunk and build one your own
  
   see this
  
  
  
  http://lucene.apache.org/solr/version_control.html#Anonymous+Access+(read-only)
  
   now you may need ant to build this.
  
   On Wed, May 20, 2009 at 11:53 AM, KK dioxide.softw...@gmail.com
   wrote:
I downloaded the nightly build . How do I get the trunk and where
 to
get
it
from? I've never used trunks earlier . Please help me out. Tell me
the
detailed steps for all this.
   
Thanks,
KK.
   
2009/5/20 Noble Paul നോബിള്‍ नोब्ळ् noble.p...@corp.aol.com
   
if you are using a trunk version you can still use the dataDir
attribute while creating the core. so you are not really stuck
   
I have committed the fix
   
On Wed, May 20, 2009 at 11:45 AM, KK dioxide.softw...@gmail.com
 
wrote:
 Please do so ASAP. I'm stuck just becaue of that. If not then I
 might
 have
 to move to lucene. I guess lucene has support for on the fly
 creation
 of
 new
 indexes. Any idea on this?

 Thanks,
 KK.

 2009/5/20 Noble Paul നോബിള്‍ नोब्ळ् noble.p...@corp.aol.com

 hi KK,
 the fix got removed in a subsequent refactoring I guess. I
 shall
 check
 in soon and you can get it from the trunk.

 On Wed, May 20, 2009 at 11:29 AM, KK
 dioxide.softw...@gmail.com
 wrote:
  I downloaded the nightly build and deployed the
  dist/solr-nightly-build.war
  in tomcat with all other config same @solr.home. Trying to
  create
  a
  new
  core
  gave me the same error. Do I still need to apply the patch
 you
  mentioned
  earlier. I'm stuck, please help me out.
  I used this to create a new core,
 
 
 
 
 
 http://localhost:8080/solr/admin/cores?action=CREATEname=core2instanceDir=/opt/solr/
 
  my solr home is /opt/solr which has a conf directory
  containing
  solrconfig
  and schema files.
 
  Thanks,
  KK
 
  message No system property or default 

Re: Optimize

2009-05-20 Thread Erik Hatcher

To send an optimize command, POST an <optimize/> message to /solr/update
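
For example, with curl (adjust host and port to your install):

  curl http://localhost:9090/solr/update -H 'Content-Type: text/xml' --data-binary '<optimize/>'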

Erik


On May 20, 2009, at 6:49 AM, Gargate, Siddharth wrote:


Hi all,

   I am not sure how to call optimize on the existing
index. I tried with following URL

http://localhost:9090/solr/update?optimize=true



With this request, the response took a long time, and the index folder
size doubled. Then again I queried the same URL and index size reduced
to actual size and I got the response immediately.

Is this expected behavior? Is there any other way to call optimize?



Thanks,

Siddharth





Re: How to retrieve all available Cores in a static way ?

2009-05-20 Thread Ryan McKinley

I cringe to suggest this but you can use the deprecated call:
 SolrCore.getSolrCore().getCoreContainer()


On May 19, 2009, at 11:21 AM, Giovanni De Stefano wrote:


Hello all,

I have a quick question but I cannot find a quick answer :-)

I have a Java client running on the same JVM where Solr is running.

The Solr I have is a multicore.

How can I retrieve from the Java client the different cores available?

I tried with:

...
CoreContainer container = new CoreContainer();
Collection<SolrCore> cores = container.getCores();
...

but I get nothing useful... :-(

Is there any static method that lets me get this collection?

Thanks a lot!

Giovanni




dataimport.properties; configure writable location?

2009-05-20 Thread Wesley Small
In Solr 1.3, is there a setting that allows one to modify where the
dataimport.properties file resides?

In a production environment, the solrconfig directory needs to be read-only.
I have observed that the DIH process works regardless, but a whopping error is
put in the logs when dataimport.properties obviously cannot be
created/written to.

Thanks,
Wesley



Re: Plugin Not Found

2009-05-20 Thread Jeff Newburn
Error is below. This error does not appear when I manually copy the jar file
into the Tomcat webapp directory, only when I try to put it in the solr.home
lib directory.

SEVERE: org.apache.solr.common.SolrException: Error loading class
'org.apache.solr.handler.component.FacetCubeComponent'
    at org.apache.solr.core.SolrResourceLoader.findClass(SolrResourceLoader.java:310)
    at org.apache.solr.core.SolrResourceLoader.newInstance(SolrResourceLoader.java:325)
    at org.apache.solr.util.plugin.AbstractPluginLoader.create(AbstractPluginLoader.java:84)
    at org.apache.solr.util.plugin.AbstractPluginLoader.load(AbstractPluginLoader.java:141)
    at org.apache.solr.core.SolrCore.loadSearchComponents(SolrCore.java:841)
    at org.apache.solr.core.SolrCore.<init>(SolrCore.java:528)
    at org.apache.solr.core.CoreContainer.create(CoreContainer.java:350)
    at org.apache.solr.core.CoreContainer.load(CoreContainer.java:227)
    at org.apache.solr.core.CoreContainer$Initializer.initialize(CoreContainer.java:107)
    at org.apache.solr.servlet.SolrDispatchFilter.init(SolrDispatchFilter.java:69)
    at org.apache.catalina.core.ApplicationFilterConfig.getFilter(ApplicationFilterConfig.java:275)
    at org.apache.catalina.core.ApplicationFilterConfig.setFilterDef(ApplicationFilterConfig.java:397)
    at org.apache.catalina.core.ApplicationFilterConfig.<init>(ApplicationFilterConfig.java:108)
    at org.apache.catalina.core.StandardContext.filterStart(StandardContext.java:3709)
    at org.apache.catalina.core.StandardContext.start(StandardContext.java:4356)
    at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:791)
    at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:771)
    at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:525)
    at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:829)
    at org.apache.catalina.startup.HostConfig.deployWARs(HostConfig.java:718)
    at org.apache.catalina.startup.HostConfig.deployApps(HostConfig.java:490)
    at org.apache.catalina.startup.HostConfig.start(HostConfig.java:1147)
    at org.apache.catalina.startup.HostConfig.lifecycleEvent(HostConfig.java:311)
    at org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
    at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1053)
    at org.apache.catalina.core.StandardHost.start(StandardHost.java:719)
    at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1045)
    at org.apache.catalina.core.StandardEngine.start(StandardEngine.java:443)
    at org.apache.catalina.core.StandardService.start(StandardService.java:516)
    at org.apache.catalina.core.StandardServer.start(StandardServer.java:710)
    at org.apache.catalina.startup.Catalina.start(Catalina.java:578)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.catalina.startup.Bootstrap.start(Bootstrap.java:288)
    at org.apache.catalina.startup.Bootstrap.main(Bootstrap.java:413)
Caused by: java.lang.ClassNotFoundException:
org.apache.solr.handler.component.FacetCubeComponent
    at org.apache.catalina.loader.WebappClassLoader.loadClass(WebappClassLoader.java:1360)
    at org.apache.catalina.loader.WebappClassLoader.loadClass(WebappClassLoader.java:1206)
    at java.lang.ClassLoader.loadClassInternal(ClassLoader.java:319)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:247)
    at org.apache.solr.core.SolrResourceLoader.findClass(SolrResourceLoader.java:294)
    ... 36 more

-- 
Jeff Newburn
Software Engineer, Zappos.com
jnewb...@zappos.com - 702-943-7562


 From: Noble Paul നോബിള്‍  नोब्ळ् noble.p...@corp.aol.com
 Reply-To: solr-user@lucene.apache.org
 Date: Wed, 20 May 2009 10:44:57 +0530
 To: solr-user@lucene.apache.org
 Subject: Re: Plugin Not Found
 
 what is the error message you see when you start Solr?
 
 On Wed, May 20, 2009 at 4:44 AM, Jeff Newburn jnewb...@zappos.com wrote:
 I am trying to get a custom plugin to work properly.  When I put the jar
 into the unpacked webapp directory for solr it works fine but when I put it
 in the lib directory in the solr home nothing works. Is there something
 missing besides just dropping it into the directory?
 
 Works:
 {solr.home}/tomcat/webapp/solr/WEB-INF/lib/
 
 Doesn’t Work:
 {solr.home}/lib/
 
 //Code snippets
 package org.apache.solr.handler.component;
 public class FacetCubeComponent extends SearchComponent implements
 SolrCoreAware
 SolrConfig
  <searchComponent name="facetcube"
 class="org.apache.solr.handler.component.FacetCubeComponent"/>
         <arr name="last-components">
               <str>spellcheck</str>
               <str>facetcube</str>
    

Re: Plugin Not Found

2009-05-20 Thread Grant Ingersoll

Just a wild guess here, but...

Try doing one of two things:
1. change the package name to be something other than o.a.s
2. Change your config to use solr.FacetCubeComponent

You might also try turning on trace level logging for the  
SolrResourceLoader and report back the output.
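
For option 2, the declaration in solrconfig.xml would become something like this
(assuming the class really is in org.apache.solr.handler.component, which the
solr. shorthand should resolve):

  <searchComponent name="facetcube" class="solr.FacetCubeComponent"/>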


-Grant

On May 20, 2009, at 10:20 AM, Jeff Newburn wrote:

Error is below. This error does not appear when I manually copy the jar file
into the tomcat webapp directory only when I try to put it in the solr.home
lib directory.

SEVERE: org.apache.solr.common.SolrException: Error loading class
'org.apache.solr.handler.component.FacetCubeComponent'
...
Caused by: java.lang.ClassNotFoundException:
org.apache.solr.handler.component.FacetCubeComponent
... 36 more

--
Jeff Newburn
Software Engineer, Zappos.com
jnewb...@zappos.com - 702-943-7562


From: Noble Paul നോബിള്‍  नोब्ळ्  
noble.p...@corp.aol.com

Reply-To: solr-user@lucene.apache.org
Date: Wed, 20 May 2009 10:44:57 +0530
To: solr-user@lucene.apache.org
Subject: Re: Plugin Not Found

what is the error message you see when you start Solr?

On Wed, May 20, 2009 at 4:44 AM, Jeff Newburn jnewb...@zappos.com  
wrote:
I am trying to get a custom plugin to work properly.  When I put  
the jar
into the unpacked webapp directory for solr it works fine but when  
I put it
in the lib directory in the solr home nothing works. Is there  
something

missing besides just dropping it into the directory?

Works:

Re: QueryElevationComponent : hot update of elevate.xml

2009-05-20 Thread Nicolas Pastorino

Hi,

On May 12, 2009, at 12:33 , Nicolas Pastorino wrote:


Hi,

On May 7, 2009, at 6:03 , Noble Paul നോബിള്‍  
नोब्ळ् wrote:



going forward the Java-based replication is going to be the preferred
means of replicating the index. It does not support replicating files in the
dataDir; it only supports replicating index files and conf files
(files in the conf dir). I was unaware of the fact that it was possible to
put the elevate.xml in dataDir.

reloading on commit is trivial for a search component. It can
register itself to be an event listener for commit and do a reload of
elevate.xml. This can be a configuration parameter.

<str name="refreshOnCommit">true</str>
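
A minimal sketch of the kind of commit listener described above, assuming the
Solr 1.3/1.4 plugin APIs (SolrCoreAware, SolrEventListener, and
UpdateHandler.registerCommitCallback). The refreshOnCommit parameter and the
class/package names are illustrative only, not an existing Solr component:

package com.example.solr; // hypothetical package

import org.apache.solr.common.util.NamedList;
import org.apache.solr.core.SolrCore;
import org.apache.solr.core.SolrEventListener;
import org.apache.solr.handler.component.ResponseBuilder;
import org.apache.solr.handler.component.SearchComponent;
import org.apache.solr.search.SolrIndexSearcher;
import org.apache.solr.util.plugin.SolrCoreAware;

public class ReloadingElevationComponent extends SearchComponent implements SolrCoreAware {
  private volatile boolean refreshOnCommit;

  @Override
  public void init(NamedList args) {
    super.init(args);
    // picks up <str name="refreshOnCommit">true</str> from the component config
    refreshOnCommit = Boolean.parseBoolean((String) args.get("refreshOnCommit"));
  }

  public void inform(SolrCore core) {
    if (refreshOnCommit) {
      core.getUpdateHandler().registerCommitCallback(new SolrEventListener() {
        public void init(NamedList args) {}
        public void postCommit() { reloadElevationFile(); } // re-read elevate.xml after each commit
        public void newSearcher(SolrIndexSearcher newSearcher, SolrIndexSearcher currentSearcher) {}
      });
    }
  }

  private void reloadElevationFile() { /* parse elevate.xml from the conf dir */ }

  @Override public void prepare(ResponseBuilder rb) {}
  @Override public void process(ResponseBuilder rb) {}
  @Override public String getDescription() { return "reloads elevate.xml on commit"; }
  @Override public String getSource() { return ""; }
  @Override public String getSourceId() { return ""; }
  @Override public String getVersion() { return ""; }
}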


Thanks for these nice tips and recommendations.
I attached a new version of this requestHandler here:
https://issues.apache.org/jira/browse/SOLR-1147
As proposed in the comments of this issue, it was rescoped to a more generic
feature: updating configuration files through HTTP
( https://issues.apache.org/jira/browse/SOLR-1147 ).

Discussion continues in the issue tracker and right here !

Regards,
--
Nicolas Pastorino
eZ Systems ( Labs )

Re: dataimport.properties; configure writable location?

2009-05-20 Thread Wesley Small
Is there a place in a core's solrconfig where one can set the directory/path
where the dataimport.properties file is written to?


On 5/20/09 2:09 PM, Giovanni De Stefano giovanni.destef...@gmail.com
wrote:

 Doh,
 
 can you please rephrase?
 
 Giovanni
 
 On Wed, May 20, 2009 at 3:47 PM, Wesley Small
 wesley.sm...@mtvstaff.comwrote:
 
 In Solr 1.3, is there a setting that allows one to modify where the
 dataimport.properties file resides?
 
 In a production environment, the solrconfig directory needs to be read-only.
 I have observed that the DIH process works regardless, but a whopping error
 is put in the logs when the dataimport.properties obviously cannot be
 created/written to.
 
 Thanks,
 Wesley
 
 



Re: best way to cache base queries (before application of filters)

2009-05-20 Thread Yonik Seeley
On Wed, May 20, 2009 at 12:07 PM, Otis Gospodnetic
otis_gospodne...@yahoo.com wrote:
 Solr plays nice with HTTP caches.  Perhaps the simplest solution is to put 
 Solr behind a caching server such as Varnish, Squid, or even Apache?

In Kent's case, the other query parameters (the other filters mainly)
change, so an external cache won't help.

-Yonik
http://www.lucidimagination.com


Re: Issue with AND/OR Operator in Dismax Request

2009-05-20 Thread Otis Gospodnetic

Amit,

That's the same question as the other day, right?
Yes, DisMax doesn't play well with Boolean operators.  Check JIRA, it has a 
search box, so you may be able to find related patches.
I think the patch I was thinking about is actually for something else - 
allowing field names to be specified in the query string and DisMax handling that 
correctly.

 Otis
--
Sematext -- http://sematext.com/ -- Lucene - Solr - Nutch



- Original Message 
 From: dabboo ag...@sapient.com
 To: solr-user@lucene.apache.org
 Sent: Wednesday, May 20, 2009 1:35:00 AM
 Subject: Issue with AND/OR Operator in Dismax Request
 
 
 Hi, 
 
 I am not getting correct results with a Query which has multiple AND | OR
 operator. 
 
 Query Format q=((A AND B) OR (C OR D) OR E) 
 
 ?q=((intAgeFrom_product_i:[0+TO+3]+AND+intAgeTo_product_i:[3+TO+*])+OR+(intAgeFrom_product_i:[0+TO+3]+AND+intAgeTo_product_i:[0+TO+3])+OR+(ageFrom_product_s:Adult))qt=dismaxrequest
  
 
 
 The query returns correct results without Dismaxrequest, but incorrect results
 with Dismaxrequest. 
 
 I have to use dismaxrequest because I need boosting of search results 
 
 According to some posts there are issues with AND | OR operator with
 dismaxrequest. 
 Please let me know if anyone has faced the same problem and if there is any
 way to make the query work with dismaxrequest. 
 
 I also believe that there is some patch available for this in one of the
 JIRA. I would appreciate if somebody can let me know the URL, so that I can
 take a look at the patch.
 
 Thanks for the help.
 
 Thanks, 
 Amit Garg
 -- 
 View this message in context: 
 http://www.nabble.com/Issue-with-AND-OR-Operator-in-Dismax-Request-tp23629269p23629269.html
 Sent from the Solr - User mailing list archive at Nabble.com.



Re: phrase query word delimiting

2009-05-20 Thread Avlesh Singh
Using an NGramTokenizerFactory in the analyzer for your field would help you
achieve the desired behavior.
Here's a nice article -
http://coderrr.wordpress.com/2008/05/08/substring-queries-with-solr-acts_as_solr/

Cheers
Avlesh

On Wed, May 20, 2009 at 11:26 PM, Alex Life av_l...@yahoo.com wrote:


 Hi All,

 Could you please help?

 I have following document
 Super PowerShot SD

 I want this document to be found by phrase queries below (both):
 super powershot sd
 super power shot sd

 Is this possible without sloppy phrase query? (at least theoretical)
 I don't see any way setting term/positions so both queries would be
 successful :-(

 Thanks
 Alex






Cleanly shutting down Solr/Jetty on Windows

2009-05-20 Thread Chris Harris
I'm running Solr with the default Jetty setup on Windows. If I start
solr with java -jar start.jar from a command window, then I can
cleanly shut down Solr/Jetty by hitting Control-C. In particular, this
causes the shutdown hook to execute, which appears to be important.

However, I don't especially want to run Solr from a command window.
Instead, I want to launch it from a scheduled task, which
does the java -jar start.jar in a non-interactive way and which
does not bring up a command window. If I were on unix I could
use the kill command to send an appropriate signal to the JVM, but
I gather this doesn't work on Windows.

As such, what is the proper way to cleanly shut down Solr/Jetty on Windows,
if they are not running in a command window? The main way I know how to kill
Solr right now if it's running outside a command window is to go to the
Windows task manager and kill the java.exe process there. But this seems
to kill java immediately, so I'm doubtful that the shutdown hook is
getting executed.

I found a couple of threads through Google suggesting that Jetty now has a
stop.jar
script that's capable of stopping Jetty in a clean way across platforms.
Is this maybe the best option? If so, would it be possible to include
stop.jar in the Solr example/ directory?


RE: Creating a distributed search in a searchComponent

2009-05-20 Thread Nick Bailey
It seems I sent this out a bit too soon.  After looking at the source it seems 
there are two separate paths for distributed and regular queries; however, the 
prepare method for all components is run before the shards parameter is 
checked.  So I can build the shards portion by using the prepare method of 
my own search component.

However I'm not sure if this is the greatest idea in case solr changes at some 
point.
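
For reference, a rough sketch of what that prepare() approach might look like
(Solr 1.3-era APIs assumed; the class name and shard host names below are
placeholders, not working values):

package com.example.solr; // hypothetical package

import org.apache.solr.common.params.ModifiableSolrParams;
import org.apache.solr.common.params.ShardParams;
import org.apache.solr.handler.component.ResponseBuilder;
import org.apache.solr.handler.component.SearchComponent;

public class ShardSelectorComponent extends SearchComponent {
  @Override
  public void prepare(ResponseBuilder rb) {
    // decide which shards this request should hit (placeholder hosts)
    String shards = "shard1.example.com:8983/solr,shard2.example.com:8983/solr";
    ModifiableSolrParams params = new ModifiableSolrParams(rb.req.getParams());
    params.set(ShardParams.SHARDS, shards);
    rb.req.setParams(params); // SearchHandler reads the shards param after prepare()
  }

  @Override public void process(ResponseBuilder rb) { /* nothing to do here */ }
  @Override public String getDescription() { return "injects a shards parameter"; }
  @Override public String getSource() { return ""; }
  @Override public String getSourceId() { return ""; }
  @Override public String getVersion() { return ""; }
}

Registered as a first-component on the StandardRequestHandler, this relies on
exactly the ordering described above, so it shares the same caveat about future
Solr changes.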

-Nick

-Original Message-
From: Nick Bailey nicholas.bai...@rackspace.com
Sent: Wednesday, May 20, 2009 1:29pm
To: solr-user@lucene.apache.org
Subject: Creating a distributed search in a searchComponent

Hi,

I am wondering if it is possible to basically add the distributed portion of a 
search query inside of a searchComponent.

I am hoping to build my own component and add it as a first-component to the 
StandardRequestHandler.  Then hopefully I will be able to use this component to 
build the shards parameter of the query and have the Handler then treat the 
query as a distributed search.  Anyone have any experience or know if this is 
possible?

Thanks,
Nick





Re: java.lang.RuntimeException: after flush: fdx size mismatch

2009-05-20 Thread James X
Hi Mike, thanks for the quick response:

$ java -version
java version 1.6.0_11
Java(TM) SE Runtime Environment (build 1.6.0_11-b03)
Java HotSpot(TM) 64-Bit Server VM (build 11.0-b16, mixed mode)

I hadn't noticed the 268m trigger for LUCENE-1521 - I'm definitely not
hitting that yet!

The exception always reports 0 length, but the number of docs varies,
heavily weighted towards one or two docs. Of the last 130 or so exceptions:
 89 1 docs vs 0 length
 20 2 docs vs 0 length
  9 3 docs vs 0 length
  1 4 docs vs 0 length
  3 5 docs vs 0 length
  2 6 docs vs 0 length
  1 7 docs vs 0 length
  1 9 docs vs 0 length
  1 10 docs vs 0 length

The only unusual thing I can think of that we're doing with Solr is
aggressively CREATE-ing and UNLOAD-ing cores. I've not been able to spot a
pattern between core admin operations and these exceptions, however...

James

On Wed, May 20, 2009 at 2:37 AM, Michael McCandless 
luc...@mikemccandless.com wrote:

 Hmm... somehow Lucene is flushing a new segment on closing the
 IndexWriter, and thinks 1 doc had been added to the stored fields
 file, yet the fdx file is the wrong size (0 bytes).  This check (and
 exception) is designed to prevent corruption from entering the index,
 so it's at least good to see CheckIndex passes after this.

 I don't think you're hitting LUCENE-1521: that issue only happens if a
 single segment has more than ~268 million docs.

 Which exact JRE version are you using?

 When you hit this exception, is it always 1 docs vs 0 length in bytes?

 Mike

 On Wed, May 20, 2009 at 3:19 AM, James X
 hello.nigerian.spamm...@gmail.com wrote:
  Hello all,I'm running Solr 1.3 in a multi-core environment. There are up
 to
  2000 active cores in each Solr webapp instance at any given time.
 
  I've noticed occasional errors such as:
  SEVERE: java.lang.RuntimeException: after flush: fdx size mismatch: 1 docs vs 0 length in bytes of _h.fdx
         at org.apache.lucene.index.StoredFieldsWriter.closeDocStore(StoredFieldsWriter.java:94)
         at org.apache.lucene.index.DocFieldConsumers.closeDocStore(DocFieldConsumers.java:83)
         at org.apache.lucene.index.DocFieldProcessor.closeDocStore(DocFieldProcessor.java:47)
         at org.apache.lucene.index.DocumentsWriter.closeDocStore(DocumentsWriter.java:367)
         at org.apache.lucene.index.DocumentsWriter.flush(DocumentsWriter.java:567)
         at org.apache.lucene.index.IndexWriter.doFlush(IndexWriter.java:3540)
         at org.apache.lucene.index.IndexWriter.flush(IndexWriter.java:3450)
         at org.apache.lucene.index.IndexWriter.closeInternal(IndexWriter.java:1638)
         at org.apache.lucene.index.IndexWriter.close(IndexWriter.java:1602)
         at org.apache.lucene.index.IndexWriter.close(IndexWriter.java:1578)
         at org.apache.solr.update.SolrIndexWriter.close(SolrIndexWriter.java:153)
 
  during commit / optimise operations.
 
  These errors then cause cascading errors during updates on the offending
  cores:
  SEVERE: org.apache.lucene.store.LockObtainFailedException: Lock obtain timed out: SingleInstanceLock: write.lock
         at org.apache.lucene.store.Lock.obtain(Lock.java:85)
         at org.apache.lucene.index.IndexWriter.init(IndexWriter.java:1070)
         at org.apache.lucene.index.IndexWriter.init(IndexWriter.java:924)
         at org.apache.solr.update.SolrIndexWriter.init(SolrIndexWriter.java:116)
         at org.apache.solr.update.UpdateHandler.createMainIndexWriter(UpdateHandler.java:122)
 
  This looks like http://issues.apache.org/jira/browse/LUCENE-1521, but
 when I
  upgraded Lucene to 2.4.1 under Solr 1.3, the issue still remains.
 
  CheckIndex doesn't find any problems with the index, and problems
 disappear
  after an (inconvenient, for me) restart of Solr.
 
  Firstly, as the symptoms are so close to those in 1521, can I check that my
  Lucene upgrade method should work:
  - unzip the Solr 1.3 war
  - remove the Lucene 2.4dev jars
  (lucene-core, lucene-spellchecker, lucene-snowball, lucene-queries,
  lucene-memory,lucene-highlighter, lucene-analyzers)
  - move in the Lucene 2.4.1 jars
  - rezip the directory structures as solr.war.
 
  I think this has worked, as solr/default/admin/registry.jsp shows:
   <lucene-spec-version>2.4.1</lucene-spec-version>
   <lucene-impl-version>2.4.1 750176 - 2009-03-04 21:56:52</lucene-impl-version>
 
  Secondly, if this Lucene fix isn't the right solution to this problem,
 can
  anyone suggest an alternative approach? The only problem I've had up to
  now is to do with the number of allowed file handles, which was fixed by
  changing limits.conf (RHEL machine).
 
  Many thanks!
  James
 



Facet counts limit

2009-05-20 Thread sachin78

Have two questions?

1) What is the limit on facet counts?  ex: test(10,0). Is this valid?

2) What is the limit on the no of facets? how many facets can a query get?

--Sachin
-- 
View this message in context: 
http://www.nabble.com/Facet-counts-limit-tp23641105p23641105.html
Sent from the Solr - User mailing list archive at Nabble.com.



Re: best way to cache base queries (before application of filters)

2009-05-20 Thread Walter Underwood
How often do you update the indexes? We update once per day, and our
HTTP cache has a hit rate of 75% once it gets warmed up.

wunder

On 5/20/09 9:07 AM, Otis Gospodnetic otis_gospodne...@yahoo.com wrote:
 
 Kent,
 
 Solr plays nice with HTTP caches.  Perhaps the simplest solution is to put
 Solr behind a caching server such as Varnish, Squid, or even Apache?
 
 Otis
 --
 Sematext -- http://sematext.com/ -- Lucene - Solr - Nutch
 
 
 
 - Original Message 
 From: Kent Fitch kent.fi...@gmail.com
 To: solr-user@lucene.apache.org
 Sent: Wednesday, May 20, 2009 3:47:02 AM
 Subject: best way to cache base queries (before application of filters)
 
 Hi,  I'm looking for some advice on how to add base query caching to SOLR.
 
 Our use-case for SOLR is:
 
 - a large Lucene index (32M docs, doubling in 6 months, 110GB increasing x 8
 in 6 months)
 - a frontend which presents views of this data in 5 categories by firing
 off 5 queries with the same search term but 5 different fq values
 
 For example, an originating query for sydney harbour generates 5 SOLR
 queries:
 
 - ../search?q=fq=category:books
 - ../search?q=fq=category:maps
 - ../search?q=fq=category:music
 etc
 
 The complicated expansion requiring sloppy phrase matches, and the large
 database with lots of very large documents means that some queries take
 quite some time (10's to several 100's of ms), so we'd like to cache the
 results of the base query for a short time (long enough for all related
 queries to be issued).
 
 It looks like this isn't the use-case for queryResultCache, because its key
 is calculated in SolrIndexSearcher like this:
 
 key = new QueryResultKey(cmd.getQuery(), cmd.getFilterList(), cmd.getSort(),
 cmd.getFlags());
 
 That is, the filters are part of the key; and the result that's cached
 results reflects the application of the filters, and this works great for
 what it is probably designed for - supporting paging through results.
 
 So, I think our options are:
 
 - create a new queryComponent that invokes SolrIndexSearcher differently,
 and which has its own (short lived but long entry length) cache of the base
 query results
 
 - subclass or change SolrIndexSearcher, perhaps making it pluggable,
 perhaps defining an optional new cache of base query results
 
 - create a subclass of the Lucene IndexSearcher which manages a cache of
 query results hidden from SolrIndexSearcher (and organise somehow for
 SolrIndexSearcher to use that subclass)
 
 Or perhaps I'm taking the wrong approach to this problem entirely!  Any
 advice is greatly appreciated.
 
 Kent Fitch
 



Re: Issue with AND/OR Operator in Dismax Request

2009-05-20 Thread dabboo

Hi,

Yeah you are right. Can you please tell me the URL of JIRA.

Thanks,
Amit



Otis Gospodnetic wrote:
 
 
 Amit,
 
 That's the same question as the other day, right?
 Yes, DisMax doesn't play well with Boolean operators.  Check JIRA, it has
 a search box, so you may be able to find related patches.
 I think the patch I was thinking about is actually for something else -
 allowing field names to be specified in query string and DisMax handling
 that correctly.
 
  Otis
 --
 Sematext -- http://sematext.com/ -- Lucene - Solr - Nutch
 
 
 
 - Original Message 
 From: dabboo ag...@sapient.com
 To: solr-user@lucene.apache.org
 Sent: Wednesday, May 20, 2009 1:35:00 AM
 Subject: Issue with AND/OR Operator in Dismax Request
 
 
 Hi, 
 
 I am not getting correct results with a Query which has multiple AND | OR
 operator. 
 
 Query Format q=((A AND B) OR (C OR D) OR E) 
 
 ?q=((intAgeFrom_product_i:[0+TO+3]+AND+intAgeTo_product_i:[3+TO+*])+OR+(intAgeFrom_product_i:[0+TO+3]+AND+intAgeTo_product_i:[0+TO+3])+OR+(ageFrom_product_s:Adult))qt=dismaxrequest
  
 
 
 Query return correct result without Dismaxrequest, but incorrect results
 with Dismaxrequest. 
 
 I have to use dismaxrequest because i need boosting of search results 
 
 According to some posts there are issues with AND | OR operator with
 dismaxrequest. 
 Please let me know if anyone has faced the same problem and if there is
 any
 way to make the query work with dismaxrequest. 
 
 I also believe that there is some patch available for this in one of the
 JIRA. I would appreciate if somebody can let me know the URL, so that I
 can
 take a look at the patch.
 
 Thanks for the help.
 
 Thanks, 
 Amit Garg
 -- 
 View this message in context: 
 http://www.nabble.com/Issue-with-AND-OR-Operator-in-Dismax-Request-tp23629269p23629269.html
 Sent from the Solr - User mailing list archive at Nabble.com.
 
 
 

-- 
View this message in context: 
http://www.nabble.com/Issue-with-AND-OR-Operator-in-Dismax-Request-tp23629269p23639786.html
Sent from the Solr - User mailing list archive at Nabble.com.



phrase query word delimiting

2009-05-20 Thread Alex Life

Hi All,

Could you please help?

I have following document
Super PowerShot SD

I want this document to be found by phrase queries below (both):
super powershot sd
super power shot sd

Is this possible without sloppy phrase query? (at least theoretical)
I don't see any way setting term/positions so both queries would be successful 
:-(

Thanks
Alex


  


Creating a distributed search in a searchComponent

2009-05-20 Thread Nick Bailey
Hi,

I am wondering if it is possible to basically add the distributed portion of a 
search query inside of a searchComponent.

I am hoping to build my own component and add it as a first-component to the 
StandardRequestHandler.  Then hopefully I will be able to use this component to 
build the shards parameter of the query and have the Handler then treat the 
query as a distributed search.  Anyone have any experience or know if this is 
possible?

Thanks,
Nick




Re: dataimport.properties; configure writable location?

2009-05-20 Thread Giovanni De Stefano
Doh,

can you please rephrase?

Giovanni

On Wed, May 20, 2009 at 3:47 PM, Wesley Small wesley.sm...@mtvstaff.comwrote:

 In Solr 1.3, is there a setting that allows one to modify where the
 dataimport.properties file resides?

 In a production environment, the solrconfig directory needs to be
 read-only.
 I have observed that the DIH process works regardless, but a whopping error
 is put in the logs when the dataimport.properties obviously cannot be
 created/written to.

 Thanks,
 Wesley




Re: How to retrieve all available Cores in a static way ?

2009-05-20 Thread Giovanni De Stefano
Thank you all for your replies.

I guess I will stick with another approach: all my request handlers inherit
from a custom base handler which is CoreAware.

Its inform(core) method notifies a static map hold by another object
avoiding duplicates.
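
A bare-bones sketch of that pattern, assuming Solr's SolrCoreAware plugin
lifecycle (the class names here are made up for illustration):

package com.example.solr; // hypothetical package

import java.util.Collection;
import java.util.concurrent.ConcurrentHashMap;
import org.apache.solr.core.SolrCore;
import org.apache.solr.handler.RequestHandlerBase;
import org.apache.solr.util.plugin.SolrCoreAware;

public abstract class CoreAwareHandlerBase extends RequestHandlerBase implements SolrCoreAware {
  // keyed by core name, so re-registration of the same core does not duplicate entries
  private static final ConcurrentHashMap<String, SolrCore> CORES =
      new ConcurrentHashMap<String, SolrCore>();

  public void inform(SolrCore core) {
    CORES.put(core.getName(), core);
  }

  /** Every core that has initialized a handler derived from this base class. */
  public static Collection<SolrCore> knownCores() {
    return CORES.values();
  }

  // subclasses still implement handleRequestBody(), getDescription(), etc.
}

One caveat with this sketch: the map only sees cores whose handlers have been
initialized, and nothing removes an entry when a core is unloaded.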

Thanks again!

Giovanni

On Wed, May 20, 2009 at 3:17 PM, Ryan McKinley ryan...@gmail.com wrote:

 I cringe to suggest this but you can use the deprecated call:
  SolrCore.getSolrCore().getCoreContainer()



 On May 19, 2009, at 11:21 AM, Giovanni De Stefano wrote:

  Hello all,

 I have a quick question but I cannot find a quick answer :-)

 I have a Java client running on the same JVM where Solr is running.

 The Solr I have is a multicore.

 How can I retrieve from the Java client the different cores available?

 I tried with:

 ...
 CoreContainer container = new CoreContainer();
 Collection<SolrCore> cores = container.getCores();
 ...

 but I get nothing useful... :-(

 Is there any static method that lets me get this collection?

 Thanks a lot!

 Giovanni





Re: best way to cache base queries (before application of filters)

2009-05-20 Thread Yonik Seeley
Some thoughts:

#1) This is sort of already implemented in some form... see this
section of solrconfig.xml and try uncommenting it:

    <!-- An optimization that attempts to use a filter to satisfy a search.
      If the requested sort does not include score, then the filterCache
      will be checked for a filter matching the query. If found, the filter
      will be used as the source of document ids, and then the sort will be
      applied to that.
    <useFilterForSortedQuery>true</useFilterForSortedQuery>
    -->

Unfortunately, it's currently a system-wide setting... you can't
select it per-query.

#2) Your problem might be able to be solved with field collapsing on
the category field in the future (but it's not in Solr yet).

#3) Current work I'm doing right now will push Filters down a level
and check them in tandem with the query instead of after.  This should
speed things up by at least a factor of 2 in your case.
https://issues.apache.org/jira/browse/SOLR-1165

I'm trying to get SOLR-1165 finished this week, and I'd love to see
how it affects your performance.
In the meantime, try useFilterForSortedQuery and let us know if it
still works (it's been turned off for a long time) ;-)
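
For completeness, a very rough sketch of Kent's first option (a component that
keeps its own cache of base-query results), under the assumption that a user
cache named "baseQueryCache" has been declared in solrconfig.xml; the class
name and cache name are invented for illustration:

package com.example.solr; // hypothetical package

import java.io.IOException;
import org.apache.lucene.search.Query;
import org.apache.solr.handler.component.ResponseBuilder;
import org.apache.solr.handler.component.SearchComponent;
import org.apache.solr.search.DocSet;
import org.apache.solr.search.SolrCache;
import org.apache.solr.search.SolrIndexSearcher;

public class BaseQueryCacheComponent extends SearchComponent {
  @Override
  public void prepare(ResponseBuilder rb) {}

  @Override
  public void process(ResponseBuilder rb) throws IOException {
    SolrIndexSearcher searcher = rb.req.getSearcher();
    SolrCache cache = searcher.getCache("baseQueryCache"); // user cache from solrconfig.xml
    if (cache == null) return;                             // cache not configured, do nothing

    Query baseQuery = rb.getQuery();                       // the query before fq filters
    DocSet docs = (DocSet) cache.get(baseQuery);
    if (docs == null) {
      docs = searcher.getDocSet(baseQuery);                // computed once, shared by the 5 fq variants
      cache.put(baseQuery, docs);
    }
    // a real implementation would intersect docs with each filter's DocSet
  }

  @Override public String getDescription() { return "caches base-query DocSets"; }
  @Override public String getSource() { return ""; }
  @Override public String getSourceId() { return ""; }
  @Override public String getVersion() { return ""; }
}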

-Yonik
http://www.lucidimagination.com



On Wed, May 20, 2009 at 3:47 AM, Kent Fitch kent.fi...@gmail.com wrote:
 Hi,  I'm looking for some advice on how to add base query caching to SOLR.

 Our use-case for SOLR is:

 - a large Lucene index (32M docs, doubling in 6 months, 110GB increasing x 8
 in 6 months)
 - a frontend which presents views of this data in 5 categories by firing
 off 5 queries with the same search term but 5 different fq values

 For example, an originating query for sydney harbour generates 5 SOLR
 queries:

 - ../search?q=complicated expansion of sydney harbourfq=category:books
 - ../search?q=complicated expansion of sydney harbourfq=category:maps
 - ../search?q=complicated expansion of sydney harbourfq=category:music
 etc

 The complicated expansion requiring sloppy phrase matches, and the large
 database with lots of very large documents means that some queries take
 quite some time (10's to several 100's of ms), so we'd like to cache the
 results of the base query for a short time (long enough for all related
 queries to be issued).

 It looks like this isn't the use-case for queryResultCache, because its key
 is calculated in SolrIndexSearcher like this:

 key = new QueryResultKey(cmd.getQuery(), cmd.getFilterList(), cmd.getSort(),
 cmd.getFlags());

 That is, the filters are part of the key; and the result that's cached
 results reflects the application of the filters, and this works great for
 what it is probably designed for - supporting paging through results.

 So, I think our options are:

 - create a new queryComponent that invokes SolrIndexSearcher differently,
 and which has its own (short lived but long entry length) cache of the base
 query results

 - subclass or change SolrIndexSearcher, perhaps making it pluggable,
 perhaps defining an optional new cache of base query results

 - create a subclass of the Lucene IndexSearcher which manages a cache of
 query results hidden from SolrIndexSearcher (and organise somehow for
 SolrIndexSearcher to use that subclass)

 Or perhaps I'm taking the wrong approach to this problem entirely!  Any
 advice is greatly appreciated.

 Kent Fitch



Re: Plugin Not Found

2009-05-20 Thread Jeff Newburn
I tried to change the package name to com.zappos.solr.

When I declared the search component with:
<searchComponent name="facetcube"
class="com.zappos.solr.FacetCubeComponent"/>

I get:
SEVERE: org.apache.solr.common.SolrException: Unknown Search Component: facetcube
at org.apache.solr.core.SolrCore.getSearchComponent(SolrCore.java:874)
at org.apache.solr.handler.component.SearchHandler.inform(SearchHandler.java:127)
at 


When I declare the component with solr.FacetCubeComponent I get the same
error message.

When we turned on trace we got the same exception plus
Caused by: java.lang.ClassNotFoundException: com.zappos.solr.FacetCubeComponent
at org.apache.catalina.loader.WebappClassLoader.loadClass(WebappClassLoader.java:1360)
at org.apache.catalina.loader.WebappClassLoader.loadClass(WebappClassLoader.java:1206)
at java.lang.ClassLoader.loadClassInternal(ClassLoader.java:319)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:247)
at org.apache.solr.core.SolrResourceLoader.findClass(SolrResourceLoader.java:294)
... 27 more



-- 
Jeff Newburn
Software Engineer, Zappos.com
jnewb...@zappos.com - 702-943-7562


 From: Grant Ingersoll gsing...@apache.org
 Reply-To: solr-user@lucene.apache.org
 Date: Wed, 20 May 2009 10:38:30 -0400
 To: solr-user@lucene.apache.org
 Subject: Re: Plugin Not Found
 
 Just a wild guess here, but...
 
 Try doing one of two things:
 1. change the package name to be something other than o.a.s
 2. Change your config to use solr.FacetCubeComponent
 
 You might also try turning on trace level logging for the
 SolrResourceLoader and report back the output.
 
 -Grant
 
 On May 20, 2009, at 10:20 AM, Jeff Newburn wrote:
 
 Error is below. This error does not appear when I manually copy the
 jar file
 into the tomcat webapp directory only when I try to put it in the
 solr.home
 lib directory.
 
 SEVERE: org.apache.solr.common.SolrException: Error loading class 'org.apache.solr.handler.component.FacetCubeComponent'
    at org.apache.solr.core.SolrResourceLoader.findClass(SolrResourceLoader.java:310)
    at org.apache.solr.core.SolrResourceLoader.newInstance(SolrResourceLoader.java:325)
    at org.apache.solr.util.plugin.AbstractPluginLoader.create(AbstractPluginLoader.java:84)
    at org.apache.solr.util.plugin.AbstractPluginLoader.load(AbstractPluginLoader.java:141)
    at org.apache.solr.core.SolrCore.loadSearchComponents(SolrCore.java:841)
    at org.apache.solr.core.SolrCore.init(SolrCore.java:528)
    at org.apache.solr.core.CoreContainer.create(CoreContainer.java:350)
    at org.apache.solr.core.CoreContainer.load(CoreContainer.java:227)
    at org.apache.solr.core.CoreContainer$Initializer.initialize(CoreContainer.java:107)
    at org.apache.solr.servlet.SolrDispatchFilter.init(SolrDispatchFilter.java:69)
    at org.apache.catalina.core.ApplicationFilterConfig.getFilter(ApplicationFilterConfig.java:275)
    at org.apache.catalina.core.ApplicationFilterConfig.setFilterDef(ApplicationFilterConfig.java:397)
    at org.apache.catalina.core.ApplicationFilterConfig.init(ApplicationFilterConfig.java:108)
    at org.apache.catalina.core.StandardContext.filterStart(StandardContext.java:3709)
    at org.apache.catalina.core.StandardContext.start(StandardContext.java:4356)
    at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:791)
    at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:771)
    at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:525)
    at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:829)
    at org.apache.catalina.startup.HostConfig.deployWARs(HostConfig.java:718)
    at org.apache.catalina.startup.HostConfig.deployApps(HostConfig.java:490)
    at org.apache.catalina.startup.HostConfig.start(HostConfig.java:1147)
    at org.apache.catalina.startup.HostConfig.lifecycleEvent(HostConfig.java:311)
    at org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
    at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1053)
    at org.apache.catalina.core.StandardHost.start(StandardHost.java:719)
    at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1045)
    at org.apache.catalina.core.StandardEngine.start(StandardEngine.java:443)
    at org.apache.catalina.core.StandardService.start(StandardService.java:516)
    at org.apache.catalina.core.StandardServer.start(StandardServer.java:710)
    at org.apache.catalina.startup.Catalina.start(Catalina.java:578)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at 

Re: best way to cache base queries (before application of filters)

2009-05-20 Thread Otis Gospodnetic

Kent,

Solr plays nice with HTTP caches.  Perhaps the simplest solution is to put Solr 
behind a caching server such as Varnish, Squid, or even Apache?

Otis
--
Sematext -- http://sematext.com/ -- Lucene - Solr - Nutch



- Original Message 
 From: Kent Fitch kent.fi...@gmail.com
 To: solr-user@lucene.apache.org
 Sent: Wednesday, May 20, 2009 3:47:02 AM
 Subject: best way to cache base queries (before application of filters)
 
 Hi,  I'm looking for some advice on how to add base query caching to SOLR.
 
 Our use-case for SOLR is:
 
 - a large Lucene index (32M docs, doubling in 6 months, 110GB increasing x 8
 in 6 months)
 - a frontend which presents views of this data in 5 categories by firing
 off 5 queries with the same search term but 5 different fq values
 
 For example, an originating query for sydney harbour generates 5 SOLR
 queries:
 
 - ../search?q=fq=category:books
 - ../search?q=fq=category:maps
 - ../search?q=fq=category:music
 etc
 
 The complicated expansion requiring sloppy phrase matches, and the large
 database with lots of very large documents means that some queries take
 quite some time (10's to several 100's of ms), so we'd like to cache the
 results of the base query for a short time (long enough for all related
 queries to be issued).
 
  It looks like this isn't the use-case for queryResultCache, because its key
 is calculated in SolrIndexSearcher like this:
 
 key = new QueryResultKey(cmd.getQuery(), cmd.getFilterList(), cmd.getSort(),
 cmd.getFlags());
 
 That is, the filters are part of the key; and the result that's cached
 results reflects the application of the filters, and this works great for
 what it is probably designed for - supporting paging through results.
 
 So, I think our options are:
 
 - create a new queryComponent that invokes SolrIndexSearcher differently,
 and which has its own (short lived but long entry length) cache of the base
 query results
 
 - subclass or change SolrIndexSearcher, perhaps making it pluggable,
 perhaps defining an optional new cache of base query results
 
  - create a subclass of the Lucene IndexSearcher which manages a cache of
  query results hidden from SolrIndexSearcher (and organise somehow for
  SolrIndexSearcher to use that subclass)
 
  Or perhaps I'm taking the wrong approach to this problem entirely!  Any
 advice is greatly appreciated.
 
 Kent Fitch



Re: Cleanly shutting down Solr/Jetty on Windows

2009-05-20 Thread Eric Pugh
Wouldn't you want to run it as a Windows service and use net start /
net stop?   If you download and install Jetty it comes with the
appropriate scripts to be installed as a service.


Eric



On May 20, 2009, at 12:39 PM, Chris Harris wrote:


I'm running Solr with the default Jetty setup on Windows. If I start
solr with java -jar start.jar from a command window, then I can
cleanly shut down Solr/Jetty by hitting Control-C. In particular, this
causes the shutdown hook to execute, which appears to be important.

However, I don't especially want to run Solr from a command window.
Instead, I want to launch it from a scheduled task, which
does the java -jar start.jar in a non-interactive way and which
does not bring up a command window. If I were on unix I could
use the kill command to send an appropriate signal to the JVM, but
I gather this doesn't work on Windows.

As such, what is the proper way to cleanly shut down Solr/Jetty on  
Windows,
if they are not running in a command window? The main way I know how  
to kill
Solr right now if it's running outside a command window is to go to  
the
Windows task manager and kill the java.exe process there. But this  
seems

to kill java immediately, so I'm doubtful that the shutdown hook is
getting executed.

I found a couple of threads through Google suggesting that Jetty now  
has a

stop.jar
script that's capable of stopping Jetty in a clean way across  
platforms.

Is this maybe the best option? If so, would it be possible to include
stop.jar in the Solr example/ directory?


-
Eric Pugh | Principal | OpenSource Connections, LLC | 434.466.1467 | 
http://www.opensourceconnections.com
Free/Busy: http://tinyurl.com/eric-cal






Re: Shutting down an instance of EmbeddedSolrServer

2009-05-20 Thread Eric Pugh

I created ticket SOLR-1178 for the small tweak.

https://issues.apache.org/jira/browse/SOLR-1178

Eric

On May 5, 2009, at 12:26 AM, Noble Paul നോബിള്‍  
नोब्ळ् wrote:



hi Eric,
there should be a getter for CoreContainer in EmbeddedSolrServer.  
Open an issue

--Noble

On Tue, May 5, 2009 at 12:17 AM, Eric Pugh
ep...@opensourceconnections.com wrote:

Hi all,

I notice that when I use EmbeddedSolrServer I have to use Control C  
to stop

the process.  I think the way to shut it down is by calling

coreContainer.shutdown().

However, is it possible to get the coreContainer from a SolrServer  
object?
 Right now it is defined as protected final CoreContainer  
coreContainer;.


I wanted to do:

((EmbeddedSolrServer)solr).getCoreContainer().shutdown();

But it seems I need to keep my own reference to the coreContainer?

Is changing this worth a patch?
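
A sketch of that keep-your-own-reference workaround (prior to SOLR-1178),
assuming the Solr 1.3 solrj API with CoreContainer.Initializer and
EmbeddedSolrServer; the core name "core1" is a placeholder:

import org.apache.solr.client.solrj.embedded.EmbeddedSolrServer;
import org.apache.solr.core.CoreContainer;

public class EmbeddedExample {
  public static void main(String[] args) throws Exception {
    // keep the CoreContainer reference yourself so it can be shut down later
    CoreContainer container = new CoreContainer.Initializer().initialize();
    EmbeddedSolrServer solr = new EmbeddedSolrServer(container, "core1");

    // ... add documents / run queries through solr here ...

    container.shutdown(); // clean shutdown instead of Ctrl-C
  }
}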

Eric

-
Eric Pugh | Principal | OpenSource Connections, LLC | 434.466.1467 |
http://www.opensourceconnections.com
Free/Busy: http://tinyurl.com/eric-cal









--
--Noble Paul


-
Eric Pugh | Principal | OpenSource Connections, LLC | 434.466.1467 | 
http://www.opensourceconnections.com
Free/Busy: http://tinyurl.com/eric-cal






Re: best way to cache base queries (before application of filters)

2009-05-20 Thread Yonik Seeley
On Wed, May 20, 2009 at 12:43 PM, Yonik Seeley
yo...@lucidimagination.com wrote:
    <useFilterForSortedQuery>true</useFilterForSortedQuery>

Of course the examples you gave used the default sort (by score) so
this wouldn't help if you do actually need to sort by score.

-Yonik
http://www.lucidimagination.com


Re: best way to cache base queries (before application of filters)

2009-05-20 Thread Walter Underwood
An HTTP cache will still work. We make three or four back end queries
for each search page. We use separate request handlers with filter query
specs instead of putting the filter query in the URL, but those two
approaches are equivalent for the HTTP cache.

We get similar cache hit rates on the faceted browse.

wunder

On 5/20/09 9:14 AM, Yonik Seeley yo...@lucidimagination.com wrote:

 On Wed, May 20, 2009 at 12:07 PM, Otis Gospodnetic
 otis_gospodne...@yahoo.com wrote:
 Solr plays nice with HTTP caches.  Perhaps the simplest solution is to put
 Solr behind a caching server such as Varnish, Squid, or even Apache?
 
 In Kent's case, the other query parameters (the other filters mainly)
 change, so an external cache won't help.
 
 -Yonik
 http://www.lucidimagination.com



Re: Facet counts limit

2009-05-20 Thread Matt Weber
1.  The limit parameter takes a signed integer, so the max value is  
2,147,483,647.
2.  I don't think there is a defined limit, which would mean you are  
only limited to what your system can handle.


Thanks,

Matt Weber
eSr Technologies
http://www.esr-technologies.com




On May 20, 2009, at 11:41 AM, sachin78 wrote:



Have two questions?

1) What is the limit on facet counts?  ex: test(10,0). Is this  
valid?


2) What is the limit on the no of facets? how many facets can a  
query get?


--Sachin
--
View this message in context: 
http://www.nabble.com/Facet-counts-limit-tp23641105p23641105.html
Sent from the Solr - User mailing list archive at Nabble.com.





RE: Solr statistics of top searches and results returned

2009-05-20 Thread Plaatje, Patrick
Hi Shalin,

Let me investigate. I think the challenge will be in storing/managing these 
statistics. I'll get back to the list when I have thought of something.

Rgrds,

Patrick

-Original Message-
From: Shalin Shekhar Mangar [mailto:shalinman...@gmail.com] 
Sent: woensdag 20 mei 2009 10:33
To: solr-user@lucene.apache.org
Subject: Re: Solr statistics of top searches and results returned

On Wed, May 20, 2009 at 1:31 PM, Plaatje, Patrick  
patrick.plaa...@getronics.com wrote:


 At the moment Solr does not have such functionality. I have written a 
 plugin for Solr though which uses a second Solr core to store/index 
 the searches. If you're interested, send me an email and I'll get you 
 the source for the plugin.


Patrick, this will be a useful addition. However instead of doing this with 
another core, we can keep running statistics which can be shown on the 
statistics page itself. What do you think?

A related approach for showing slow queries was discussed recently. There's an 
issue open which has more details:

https://issues.apache.org/jira/browse/SOLR-1101

--
Regards,
Shalin Shekhar Mangar.


XPath query support in Solr Cell

2009-05-20 Thread Eric Pugh
So I am trying to filter down what I am indexing, and the basic XPath  
queries don't work.  For example, working with tutorial.pdf this  
indexes all the <div/> elements:


curl http://localhost:8983/solr/update/extract?ext.idx.attr=true\&ext.def.fl=text\&ext.map.div=foo_t\&ext.capture=div\&ext.literal.id=126\&ext.xpath=\/xhtml:html\/xhtml:body\/descendant:node\(\)  -F tutori...@tutorial.pdf


However, if I want to only index the first div, I expect to do this:

budapest:site epugh$ curl http://localhost:8983/solr/update/extract?ext.idx.attr=true\&ext.def.fl=text\&ext.map.div=foo_t\&ext.capture=div\&ext.literal.id=126\&ext.xpath=\/xhtml:html\/xhtml:body\/xhtml:div[1]  -F tutori...@tutorial.pdf


But I keep getting back an issue from curl.  My attempts to escape the  
[1] have failed.  Any suggestions?


curl: (3) [globbing] error: bad range specification after pos 174

Eric

PS,
Also, this site seems to be okay as a place to upload your html and  
practice xpath:


http://www.whitebeam.org/library/guide/TechNotes/xpathtestbed.rhtm

I did have to strip out the namespace stuff though.




-
Eric Pugh | Principal | OpenSource Connections, LLC | 434.466.1467 | 
http://www.opensourceconnections.com
Free/Busy: http://tinyurl.com/eric-cal






Re: phrase query word delimiting

2009-05-20 Thread Otis Gospodnetic

Alex,

You might want to paste in your tokenizer/token filter config.  You may also 
want to paste in how your analyzer configuration breaks those phrases and what 
the position of each term is.  This will make it easier for others to 
understand what you have, what doesn't work, and what your options are.

Otis



- Original Message 
 From: Alex Life av_l...@yahoo.com
 To: solr-user@lucene.apache.org
 Sent: Wednesday, May 20, 2009 1:56:36 PM
 Subject: phrase query  word delimiting
 
 
 Hi All,
 
 Could you please help?
 
 I have following document
 Super PowerShot SD
 
 I want this document to be found by phrase queries below (both):
 super powershot sd
 super power shot sd
 
 Is this possible without sloppy phrase query? (at least theoretical)
 I don't see any way setting term/positions so both queries would be 
 successful 
 :-(
 
 Thanks
 Alex



Re: dataimport.properties; configure writable location?

2009-05-20 Thread Shalin Shekhar Mangar
On Wed, May 20, 2009 at 11:44 PM, Wesley Small wesley.sm...@mtvstaff.comwrote:

  Is there a place in a core's solrconfig where one can set the directory/path
  where the dataimport.properties file is written to?


It is not configurable right now. Can you please open a jira issue for this?

-- 
Regards,
Shalin Shekhar Mangar.


Re: Issue with AND/OR Operator in Dismax Request

2009-05-20 Thread Doug Steigerwald

http://issues.apache.org/jira/browse/SOLR-405 ?

It's quite old and it's not exactly what you want, but I think it might be  
the JIRA ticket that Otis mentioned.  Using a filter query was what we  
really needed.  I'm also not really sure why you need a dismax query  
at all.  You're not querying for the same thing in multiple fields.


Doug

On May 20, 2009, at 1:18 PM, dabboo wrote:



Hi,

Yeah you are right. Can you please tell me the URL of JIRA.

Thanks,
Amit



Otis Gospodnetic wrote:



Amit,

That's the same question as the other day, right?
Yes, DisMax doesn't play well with Boolean operators.  Check JIRA,  
it has

a search box, so you may be able to find related patches.
I think the patch I was thinking about is actually for something else -
allowing field names to be specified in query string and DisMax handling
that correctly.

Otis
--
Sematext -- http://sematext.com/ -- Lucene - Solr - Nutch



- Original Message 

From: dabboo ag...@sapient.com
To: solr-user@lucene.apache.org
Sent: Wednesday, May 20, 2009 1:35:00 AM
Subject: Issue with AND/OR Operator in Dismax Request


Hi,

I am not getting correct results with a Query which has multiple  
AND | OR

operator.

Query Format q=((A AND B) OR (C OR D) OR E)

?q=((intAgeFrom_product_i:[0+TO+3]+AND+intAgeTo_product_i:[3+TO+*]) 
+OR+(intAgeFrom_product_i:[0+TO+3]+AND+intAgeTo_product_i:[0+TO+3]) 
+OR+(ageFrom_product_s:Adult))qt=dismaxrequest



Query return correct result without Dismaxrequest, but incorrect  
results

with Dismaxrequest.

I have to use dismaxrequest because i need boosting of search  
results


According to some posts there are issues with AND | OR operator with
dismaxrequest.
Please let me know if anyone has faced the same problem and if  
there is

any
way to make the query work with dismaxrequest.

I also believe that there is some patch available for this in one  
of the
JIRA. I would appreciate if somebody can let me know the URL, so  
that I

can
take a look at the patch.

Thanks for the help.

Thanks,
Amit Garg
--
View this message in context:
http://www.nabble.com/Issue-with-AND-OR-Operator-in-Dismax-Request-tp23629269p23629269.html
Sent from the Solr - User mailing list archive at Nabble.com.






--
View this message in context: 
http://www.nabble.com/Issue-with-AND-OR-Operator-in-Dismax-Request-tp23629269p23639786.html
Sent from the Solr - User mailing list archive at Nabble.com.





Re: Plugin Not Found

2009-05-20 Thread Noble Paul നോബിള്‍ नोब्ळ्
what else is there in the solr.home/lib other than this component?

On Wed, May 20, 2009 at 9:08 PM, Jeff Newburn jnewb...@zappos.com wrote:
 I tried to change the package name to com.zappos.solr.

 When I declared the search component with:
  <searchComponent name="facetcube"
  class="com.zappos.solr.FacetCubeComponent"/>

 I get:
 SEVERE: org.apache.solr.common.SolrException: Unknown Search Component: facetcube
    at org.apache.solr.core.SolrCore.getSearchComponent(SolrCore.java:874)
    at org.apache.solr.handler.component.SearchHandler.inform(SearchHandler.java:127)
    at


 When I declare the component with solr.FacetCubeComponent I get the same
 error message.

 When we turned on trace we got the same exception plus
 Caused by: java.lang.ClassNotFoundException: com.zappos.solr.FacetCubeComponent
    at org.apache.catalina.loader.WebappClassLoader.loadClass(WebappClassLoader.java:1360)
    at org.apache.catalina.loader.WebappClassLoader.loadClass(WebappClassLoader.java:1206)
    at java.lang.ClassLoader.loadClassInternal(ClassLoader.java:319)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:247)
    at org.apache.solr.core.SolrResourceLoader.findClass(SolrResourceLoader.java:294)
    ... 27 more



 --
 Jeff Newburn
 Software Engineer, Zappos.com
 jnewb...@zappos.com - 702-943-7562


 From: Grant Ingersoll gsing...@apache.org
 Reply-To: solr-user@lucene.apache.org
 Date: Wed, 20 May 2009 10:38:30 -0400
 To: solr-user@lucene.apache.org
 Subject: Re: Plugin Not Found

 Just a wild guess here, but...

 Try doing one of two things:
 1. change the package name to be something other than o.a.s
 2. Change your config to use solr.FacetCubeComponent

 You might also try turning on trace level logging for the
 SolrResourceLoader and report back the output.

 -Grant

 On May 20, 2009, at 10:20 AM, Jeff Newburn wrote:

 Error is below. This error does not appear when I manually copy the
 jar file
 into the tomcat webapp directory only when I try to put it in the
 solr.home
 lib directory.

  SEVERE: org.apache.solr.common.SolrException: Error loading class 'org.apache.solr.handler.component.FacetCubeComponent'
     at org.apache.solr.core.SolrResourceLoader.findClass(SolrResourceLoader.java:310)
     at org.apache.solr.core.SolrResourceLoader.newInstance(SolrResourceLoader.java:325)
     at org.apache.solr.util.plugin.AbstractPluginLoader.create(AbstractPluginLoader.java:84)
     at org.apache.solr.util.plugin.AbstractPluginLoader.load(AbstractPluginLoader.java:141)
     at org.apache.solr.core.SolrCore.loadSearchComponents(SolrCore.java:841)
     at org.apache.solr.core.SolrCore.init(SolrCore.java:528)
     at org.apache.solr.core.CoreContainer.create(CoreContainer.java:350)
     at org.apache.solr.core.CoreContainer.load(CoreContainer.java:227)
     at org.apache.solr.core.CoreContainer$Initializer.initialize(CoreContainer.java:107)
     at org.apache.solr.servlet.SolrDispatchFilter.init(SolrDispatchFilter.java:69)
     at org.apache.catalina.core.ApplicationFilterConfig.getFilter(ApplicationFilterConfig.java:275)
     at org.apache.catalina.core.ApplicationFilterConfig.setFilterDef(ApplicationFilterConfig.java:397)
     at org.apache.catalina.core.ApplicationFilterConfig.init(ApplicationFilterConfig.java:108)
     at org.apache.catalina.core.StandardContext.filterStart(StandardContext.java:3709)
     at org.apache.catalina.core.StandardContext.start(StandardContext.java:4356)
     at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:791)
     at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:771)
     at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:525)
     at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:829)
     at org.apache.catalina.startup.HostConfig.deployWARs(HostConfig.java:718)
     at org.apache.catalina.startup.HostConfig.deployApps(HostConfig.java:490)
     at org.apache.catalina.startup.HostConfig.start(HostConfig.java:1147)
     at org.apache.catalina.startup.HostConfig.lifecycleEvent(HostConfig.java:311)
     at org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
     at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1053)
     at org.apache.catalina.core.StandardHost.start(StandardHost.java:719)
     at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1045)
     at org.apache.catalina.core.StandardEngine.start(StandardEngine.java:443)
     at org.apache.catalina.core.StandardService.start(StandardService.java:516)
     at org.apache.catalina.core.StandardServer.start(StandardServer.java:710)
     at org.apache.catalina.startup.Catalina.start(Catalina.java:578)
     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)