Issues With Solr Cloud Setup
Hi, I was going through the SolrCloud documentation (http://wiki.apache.org/solr/SolrCloud) and was trying out a simple setup. I could execute examples A and B successfully. However, when I try example C, I get the following exception:

May 14, 2010 12:22:29 PM org.apache.log4j.Category warn
WARNING: Exception closing session 0x0 to sun.nio.ch.selectionkeyi...@1385660
java.net.ConnectException: Connection refused
        at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
        at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:574)
        at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:933)
May 14, 2010 12:22:29 PM org.apache.log4j.Category warn
WARNING: Ignoring exception during shutdown input
java.nio.channels.ClosedChannelException
        at sun.nio.ch.SocketChannelImpl.shutdownInput(SocketChannelImpl.java:638)
        at sun.nio.ch.SocketAdaptor.shutdownInput(SocketAdaptor.java:360)
        at org.apache.zookeeper.ClientCnxn$SendThread.cleanup(ClientCnxn.java:999)
        at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:970)
May 14, 2010 12:22:29 PM org.apache.log4j.Category warn

This exception is probably because I have not set localhost in my /etc/hosts (though I still wonder how the first two examples worked).
I tried replacing localhost with my domain name:

java -Dbootstrap_confdir=./solr/conf -Dcollection.configName=myconf -DzkRun -DzkHost=raakhi:9983,raakhi:8574,raakhi:9900 -jar start.jar

and I get the following exception:

java.lang.IllegalArgumentException: solr/zoo_data/myid file is missing
        at org.apache.solr.cloud.SolrZkServerProps.parseProperties(SolrZkServer.java:453)
        at org.apache.solr.cloud.SolrZkServer.parseConfig(SolrZkServer.java:83)
        at org.apache.solr.core.CoreContainer.initZooKeeper(CoreContainer.java:109)
        at org.apache.solr.core.CoreContainer.load(CoreContainer.java:344)
        at org.apache.solr.core.CoreContainer.load(CoreContainer.java:298)
        at org.apache.solr.core.CoreContainer$Initializer.initialize(CoreContainer.java:213)
        at org.apache.solr.servlet.SolrDispatchFilter.init(SolrDispatchFilter.java:88)
        at org.mortbay.jetty.servlet.FilterHolder.doStart(FilterHolder.java:99)
        at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:40)
        at org.mortbay.jetty.servlet.ServletHandler.initialize(ServletHandler.java:594)
        at org.mortbay.jetty.servlet.Context.startContext(Context.java:139)
        at org.mortbay.jetty.webapp.WebAppContext.startContext(WebAppContext.java:1218)
        at org.mortbay.jetty.handler.ContextHandler.doStart(ContextHandler.java:500)
        at org.mortbay.jetty.webapp.WebAppContext.doStart(WebAppContext.java:448)
        at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:40)
        at org.mortbay.jetty.handler.HandlerCollection.doStart(HandlerCollection.java:147)
        at org.mortbay.jetty.handler.ContextHandlerCollection.doStart(ContextHandlerCollection.java:161)
        at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:40)
        at org.mortbay.jetty.handler.HandlerCollection.doStart(HandlerCollection.java:147)
        at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:40)
        at org.mortbay.jetty.handler.HandlerWrapper.doStart(HandlerWrapper.java:117)
        at org.mortbay.jetty.Server.doStart(Server.java:210)
        at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:40)
        at org.mortbay.xml.XmlConfiguration.main(XmlConfiguration.java:929)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.mortbay.start.Main.invokeMain(Main.java:183)
        at org.mortbay.start.Main.start(Main.java:497)
        at org.mortbay.start.Main.main(Main.java:115)

Am I going wrong somewhere? Regards, Raakhi
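A note on the second error (this is an assumption based on how ZooKeeper ensembles generally work, not something confirmed in this thread): when -DzkRun starts an embedded ZooKeeper that is part of a multi-server ensemble (several host:port pairs in -DzkHost), ZooKeeper requires each server's data directory to contain a myid file holding only that server's numeric id, matching its position in the server list. For example:

```
# file: solr/zoo_data/myid on the first instance -- a single line with the server id
1
```

Each instance in the ensemble would get its own data directory and its own id (1, 2, 3, ...); a missing myid file produces exactly the "solr/zoo_data/myid file is missing" error above.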
Facet Queries
Hi, when I use facet queries, what is the default size of the results returned? And how do we configure it if we want all the results shown? Regards, Raakhi
Re: Facet Queries
Hi, thanks a lot... had a look at that, and it solved my problem. Thanks once again. Regards, Raakhi

On Fri, May 14, 2010 at 2:13 PM, Leonardo Menezes leonardo.menez...@googlemail.com wrote: Hey, there's plenty of documentation about that: http://wiki.apache.org/solr/SimpleFacetParameters#Field_Value_Faceting_Parameters

On Fri, May 14, 2010 at 10:38 AM, Rakhi Khatwani rkhatw...@gmail.com wrote: Hi, when I use facet queries, what is the default size of the results returned? And how do we configure it if we want all the results shown? Regards, Raakhi
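For the archive, the relevant parameter from that wiki page is facet.limit: by default Solr returns at most 100 constraints per facet field, and a negative value removes the limit. A sample request (hostname and field name are illustrative):

```
http://localhost:8983/solr/select?q=*:*&facet=true&facet.field=category&facet.limit=-1
```

facet.offset and facet.mincount combine with this for paging through and pruning the facet counts.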
Issues with clustering in multicore
Hi, I was trying out a clustering example, which worked as described in the documentation. Now I want to use the clustering feature in my multicore setup, where I have my core indexes saved. So I edited the solrconfig.xml of that core to add the clustering configuration (I did make sure that the lib declarations point to the correct location). But when I restart the Solr server for multicore, I get the following exception:

May 17, 2010 7:17:41 PM org.apache.solr.common.SolrException log
SEVERE: org.apache.solr.common.SolrException: Error loading class 'org.apache.solr.handler.clustering.ClusteringComponent'
        at org.apache.solr.core.SolrResourceLoader.findClass(SolrResourceLoader.java:373)
        at org.apache.solr.core.SolrCore.createInstance(SolrCore.java:413)
        at org.apache.solr.core.SolrCore.createInitInstance(SolrCore.java:435)
        at org.apache.solr.core.SolrCore.initPlugins(SolrCore.java:1498)
        at org.apache.solr.core.SolrCore.initPlugins(SolrCore.java:1492)
        at org.apache.solr.core.SolrCore.initPlugins(SolrCore.java:1525)
        at org.apache.solr.core.SolrCore.loadSearchComponents(SolrCore.java:833)
        at org.apache.solr.core.SolrCore.<init>(SolrCore.java:551)
        at org.apache.solr.core.CoreContainer.create(CoreContainer.java:428)
        at org.apache.solr.core.CoreContainer.load(CoreContainer.java:278)
        at org.apache.solr.core.CoreContainer$Initializer.initialize(CoreContainer.java:117)
        at org.apache.solr.servlet.SolrDispatchFilter.init(SolrDispatchFilter.java:83)
        at org.mortbay.jetty.servlet.FilterHolder.doStart(FilterHolder.java:99)
        at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:40)
        at org.mortbay.jetty.servlet.ServletHandler.initialize(ServletHandler.java:594)
        at org.mortbay.jetty.servlet.Context.startContext(Context.java:139)
        at org.mortbay.jetty.webapp.WebAppContext.startContext(WebAppContext.java:1218)
        at org.mortbay.jetty.handler.ContextHandler.doStart(ContextHandler.java:500)
        at org.mortbay.jetty.webapp.WebAppContext.doStart(WebAppContext.java:448)
        at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:40)
        at org.mortbay.jetty.handler.HandlerCollection.doStart(HandlerCollection.java:147)
        at org.mortbay.jetty.handler.ContextHandlerCollection.doStart(ContextHandlerCollection.java:161)
        at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:40)
        at org.mortbay.jetty.handler.HandlerCollection.doStart(HandlerCollection.java:147)
        at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:40)
        at org.mortbay.jetty.handler.HandlerWrapper.doStart(HandlerWrapper.java:117)
        at org.mortbay.jetty.Server.doStart(Server.java:210)
        at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:40)
        at org.mortbay.xml.XmlConfiguration.main(XmlConfiguration.java:929)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.mortbay.start.Main.invokeMain(Main.java:183)
        at org.mortbay.start.Main.start(Main.java:497)
        at org.mortbay.start.Main.main(Main.java:115)
Caused by: java.lang.ClassNotFoundException: org.apache.solr.handler.clustering.ClusteringComponent
        at java.net.URLClassLoader$1.run(URLClassLoader.java:200)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
        at java.net.FactoryURLClassLoader.loadClass(URLClassLoader.java:592)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:252)
        at java.lang.ClassLoader.loadClassInternal(ClassLoader.java:320)
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:247)
        at org.apache.solr.core.SolrResourceLoader.findClass(SolrResourceLoader.java:357)
        ... 35 more

Any pointers? Regards, Raakhi
Using solrJ to get all fields in a particular schema/index
Hi, is there any way to get all the fields in a SolrDocument, irrespective of whether a field contains a value or is null? Or is there any way to get all the fields declared in the schema.xml of a core (e.g. http://localhost:8983/solr/core0/)? Regards, Raakhi
Re: Using solrJ to get all fields in a particular schema/index
Hi Aditya, I can retrieve all documents, but I cannot retrieve all the fields in a document (if a field does not have any value). For example, I get a list of documents; some of them have a value for the title field, and others might not. In either case I need title to appear in getFieldNames(). How do I go about that? Regards, Raakhi

On Tue, May 25, 2010 at 5:07 PM, findbestopensource findbestopensou...@gmail.com wrote: Resending it, as there was a typo. To retrieve all documents, you need to use the query/filter FieldName:*:*. Regards, Aditya www.findbestopensource.com

On Tue, May 25, 2010 at 4:29 PM, findbestopensource findbestopensou...@gmail.com wrote: To retrieve all documents, you need to use the query/filter *FieldName:*:* *. Regards, Aditya www.findbestopensource.com

On Tue, May 25, 2010 at 4:14 PM, Rakhi Khatwani rkhatw...@gmail.com wrote: Hi, is there any way to get all the fields in a SolrDocument, irrespective of whether a field contains a value or is null? Or is there any way to get all the fields declared in the schema.xml of a core (e.g. http://localhost:8983/solr/core0/)? Regards, Raakhi
Re: Using solrJ to get all fields in a particular schema/index
Hi, oh okay. Thanks a ton.

On Tue, May 25, 2010 at 11:15 PM, Chris Hostetter hossman_luc...@fucit.org wrote: : Is there any way to get all the fields (irrespective of whether : it contains a value or null) in solrDocument. No. A document only has Field instances for the fields it has values for. It's also not a feature that would even be theoretically possible to add, because of dynamicFields: if you have even one dynamicField declaration, then there is an infinite number of possible fields. : Is there any way to get all the fields in schema.xml of the url link (: http://localhost:8983/solr/core0/)? Take a look at http://localhost:8983/solr/core0/admin/luke?show=schema ... it can programmatically return details about the schema (including all the fields and dynamicFields) to your application. -Hoss
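Hoss's Luke suggestion can also be driven from SolrJ rather than raw HTTP. A sketch, not tested here; it assumes SolrJ's LukeRequest/LukeResponse classes from the 1.4-era client and the core URL from the question:

```
import org.apache.solr.client.solrj.SolrServer;
import org.apache.solr.client.solrj.impl.CommonsHttpSolrServer;
import org.apache.solr.client.solrj.request.LukeRequest;
import org.apache.solr.client.solrj.response.LukeResponse;

public class ListSchemaFields {
    public static void main(String[] args) throws Exception {
        SolrServer server = new CommonsHttpSolrServer("http://localhost:8983/solr/core0");

        // LukeRequest hits /admin/luke; show=schema returns the declared
        // fields and dynamicFields rather than only the indexed ones.
        LukeRequest luke = new LukeRequest();
        luke.setShowSchema(true);

        LukeResponse rsp = luke.process(server);
        for (String fieldName : rsp.getFieldInfo().keySet()) {
            System.out.println(fieldName);
        }
    }
}
```

This lists what the schema declares; as Hoss notes, a dynamicField pattern still stands for an unbounded set of concrete field names, so no listing can enumerate every possible field.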
Deleted documents appearing in facet fields
Hi, I have a schema with id as one of the fields. I index some documents (adding and deleting some along the way). When I perform faceting over all documents (q=*:*) with facet.field=id, I even get the ids of deleted documents. For example (025_null and 026_null are deleted documents):

<lst name="facet_fields">
  <lst name="id">
    <int name="022_4">1</int>
    <int name="022_5">1</int>
    <int name="022_6">1</int>
    <int name="022_7">1</int>
    <int name="022_8">1</int>
    <int name="022_9">1</int>
    <int name="025_1">1</int>
    <int name="025_null">0</int>
    <int name="026_1">1</int>
    <int name="026_null">0</int>
    <int name="042_1">1</int>
  </lst>
</lst>

I was under the impression that facets are computed on the search results, so I cannot figure out why this is happening. Moreover, is there any strategy for avoiding this, other than optimizing the index (since we can't optimize frequently) or setting facet.mincount=1 (since we may have a requirement where existing documents do not satisfy a query)? Regards, Raakhi
Re: Deleted documents appearing in facet fields
Hi Ahmet, but I use SolrJ to commit documents, and there is no commit method which allows you to specify expungeDeletes. BTW, I am using Solr 1.4. Regards, Raakhi

On Thu, Jun 3, 2010 at 5:03 PM, Ahmet Arslan iori...@yahoo.com wrote: Hi, I have a schema with id as one of the fields. I index some documents (adding and deleting some along the way). When I perform faceting over all documents (q=*:*) with facet.field=id, I even get the ids of deleted documents. For example (025_null and 026_null are deleted documents):

<lst name="facet_fields">
  <lst name="id">
    <int name="022_4">1</int>
    <int name="022_5">1</int>
    <int name="022_6">1</int>
    <int name="022_7">1</int>
    <int name="022_8">1</int>
    <int name="022_9">1</int>
    <int name="025_1">1</int>
    <int name="025_null">0</int>
    <int name="026_1">1</int>
    <int name="026_null">0</int>
    <int name="042_1">1</int>
  </lst>
</lst>

I was under the impression that facets are computed on the search results, so I cannot figure out why this is happening. Moreover, is there any strategy for avoiding this, other than optimizing the index or setting facet.mincount=1?

Maybe commit with expungeDeletes? <commit expungeDeletes="true"/> http://wiki.apache.org/solr/UpdateXmlMessages#Optional_attributes_for_.22commit.22_and_.22optimize.22
Re: Deleted documents appearing in facet fields
Thank you so much :)

On Thu, Jun 3, 2010 at 6:47 PM, Ahmet Arslan iori...@yahoo.com wrote: Hi Ahmet, but I use SolrJ to commit documents, and there is no commit method which allows you to specify expungeDeletes. Alternatively, you can do it with SolrQuery:

final SolrQuery query = new SolrQuery();
query.set("qt", "/update");
query.set("commit", true);
query.set("expungeDeletes", true);
System.out.println(solrServer.query(query));

You can verify it from /admin/stats.jsp#update, e.g. expungeDeletes : 1
Re: question about the fieldCollapseCache
Hi, I tried downloading Solr 1.4.1 from the site, but it shows an empty directory. Where did you get Solr 1.4.1 from? Regards, Raakhi

On Tue, Jun 8, 2010 at 10:35 PM, Jean-Sebastien Vachon js.vac...@videotron.ca wrote: Hi all, I've been running some tests using 6 shards, each one containing about 1 million documents. Each shard is running in its own virtual machine with 7 GB of RAM (5 GB allocated to the JVM). After about 1100 unique queries, the shards start to struggle and run out of memory. I've reduced all the other caches without significant impact. When I remove the fieldCollapseCache completely, the server can keep up for hours and uses only 2 GB of RAM. (I'm even considering returning to a 32-bit JVM.) The size of the fieldCollapseCache was set to 5000 items. How can 5000 items eat 3 GB of RAM? Can someone tell me what is put in this cache? Has anyone experienced this kind of problem? I am running Solr 1.4.1 with patch 236. All requests are collapsing on a single field (pint), with collapse.maxdocs set to 200 000. Thanks for any hints...
Re: Field Collapsing SOLR-236
Hi, I wanted to try out field collapsing for a requirement. I went through the wiki and SOLR-236, but there are a lot of patch files, and the comments below left me confused. I tried applying the patch file on the 1.4.0 release but ended up with many compile errors. I even downloaded the latest code from the repository and applied the patch (solr-trunk-236, dated 16th May 2010), but ended up with build errors. Can someone tell me which patch file to apply on which build, so that I can get collapsing working? Regards, Raakhi.

On Thu, Mar 25, 2010 at 11:15 PM, Rob Z zman...@hotmail.com wrote: What do you mean you had to revert to trunk (1.5)? Do you mean upgrade? Which version were you using beforehand? Can you please list the exact version of 1.5 and the patch # you used? I downloaded the latest nightly build and tried patching using the 2/1 patch. Everything went OK, but I am getting 1 failing test. Would you recommend using the latest nightly 1.5 build or 1.4 for production use? I really need this feature, so I don't think I have much of a choice. Can you also explain the performance implications you are seeing and what configuration tweaks you've used that helped. Thanks!

From: mark.robe...@red-gate.com To: solr-user@lucene.apache.org Date: Thu, 25 Mar 2010 15:21:54 + Subject: RE: Field Collapsing SOLR-236 Yeah, got it working fine - but I needed to revert to trunk (1.5) to get the patch to apply. It does certainly have some performance implications, but tweaking configuration can help here. Overall the benefits very much outweigh the costs for us :) Mark.

-----Original Message----- From: Dennis Gearon [mailto:gear...@sbcglobal.net] Sent: 25 March 2010 00:49 To: solr-user@lucene.apache.org Subject: Re: Field Collapsing SOLR-236 Boy, I hope that field collapsing works! I'm planning on using it heavily. Dennis Gearon

--- On Wed, 3/24/10, blargy zman...@hotmail.com wrote: From: blargy zman...@hotmail.com Subject: Field Collapsing SOLR-236 To: solr-user@lucene.apache.org Date: Wednesday, March 24, 2010, 12:17 PM Has anyone had any luck with the field collapsing patch (SOLR-236) with Solr 1.4? I tried patching my version of 1.4 with no such luck. Thanks
Re: Field Collapsing SOLR-236
know how to do it I will go "ohh... I can't believe I couldn't figure that out") - Moazzam

On Wed, Jun 16, 2010 at 8:25 AM, Moazzam Khan moazz...@gmail.com wrote: Hi Rakhi, you are supposed to get the code for Solr 1.4 from SVN here: http://svn.apache.org/repos/asf/lucene/solr/tags/ Then apply the patch to it and compile. It should work. However, you will probably get an error at run time saying some Java class is missing. I haven't been able to figure out what to do after that. - Moazzam http://moazzam-khan.com

On Wed, Jun 16, 2010 at 3:37 AM, Rakhi Khatwani rkhatw...@gmail.com wrote: Hi, I wanted to try out field collapsing for a requirement. I went through the wiki and SOLR-236, but there are a lot of patch files, and the comments below left me confused. I tried applying the patch file on the 1.4.0 release but ended up with many compile errors. I even downloaded the latest code from the repository and applied the patch (solr-trunk-236, dated 16th May 2010), but ended up with build errors. Can someone tell me which patch file to apply on which build, so that I can get collapsing working? Regards, Raakhi.

On Thu, Mar 25, 2010 at 11:15 PM, Rob Z zman...@hotmail.com wrote: What do you mean you had to revert to trunk (1.5)? Do you mean upgrade? Which version were you using beforehand? Can you please list the exact version of 1.5 and the patch # you used? I downloaded the latest nightly build and tried patching using the 2/1 patch. Everything went OK, but I am getting 1 failing test. Would you recommend using the latest nightly 1.5 build or 1.4 for production use? I really need this feature, so I don't think I have much of a choice. Can you also explain the performance implications you are seeing and what configuration tweaks you've used that helped. Thanks!

From: mark.robe...@red-gate.com To: solr-user@lucene.apache.org Date: Thu, 25 Mar 2010 15:21:54 + Subject: RE: Field Collapsing SOLR-236 Yeah, got it working fine - but I needed to revert to trunk (1.5) to get the patch to apply. It does certainly have some performance implications, but tweaking configuration can help here. Overall the benefits very much outweigh the costs for us :) Mark.

-----Original Message----- From: Dennis Gearon [mailto:gear...@sbcglobal.net] Sent: 25 March 2010 00:49 To: solr-user@lucene.apache.org Subject: Re: Field Collapsing SOLR-236 Boy, I hope that field collapsing works! I'm planning on using it heavily. Dennis Gearon

--- On Wed, 3/24/10, blargy zman...@hotmail.com wrote: From: blargy zman...@hotmail.com Subject: Field Collapsing SOLR-236 To: solr-user@lucene.apache.org Date: Wednesday, March 24, 2010, 12:17 PM Has anyone had any luck with the field collapsing patch (SOLR-236) with Solr 1.4? I tried patching my version of 1.4 with no such luck. Thanks
Re: Field Collapsing SOLR-236
Hi Moazzam, where did you get the src code from? I am downloading it from https://svn.apache.org/repos/asf/lucene/solr/branches/branch-1.4 and the latest revision in this location is 955469, so applying the latest patch (dated 17th June 2010) on it still generates errors. Any pointers? Regards, Raakhi

On Fri, Jun 18, 2010 at 1:24 AM, Moazzam Khan moazz...@gmail.com wrote: I knew it wasn't me! :) I found the patch just before I read this and applied it to the trunk, and it works! Thanks Mark and Martijn for all your help! - Moazzam

On Thu, Jun 17, 2010 at 2:16 PM, Martijn v Groningen martijn.is.h...@gmail.com wrote: I've added a new patch to the issue, so building the trunk (rev 955615) with the latest patch should not be a problem. Due to recent changes in the Lucene trunk, the patch was not compatible.

On 17 June 2010 20:20, Erik Hatcher erik.hatc...@gmail.com wrote: On Jun 16, 2010, at 7:31 PM, Mark Diggory wrote: p.s. I'd be glad to contribute our Maven build re-organization back to the community to get Solr properly Mavenized so that it can be distributed and released more often. For us, the benefit of this structure is that we will be able to overlay add-ons such as RequestHandlers and other third-party support without having to rebuild Solr from scratch. But you don't have to rebuild Solr from scratch to add a new request handler or other plugins - simply compile your custom stuff into a JAR and put it in solr-home/lib (or point to it with a lib directive in solrconfig.xml). Ideally, a Maven archetype could be created that would allow one to rapidly produce a Solr webapp and fire it up in Jetty in mere seconds. How's that any different from "cd example; java -jar start.jar"? Or do you mean a Solr client webapp? Finally, with projects such as Bobo, integration with Spring would make configuration more consistent and require significantly less Java coding just to add new capabilities every time someone authors a new RequestHandler. It's one line of config to add a new request handler. How many ridiculously ugly, confusing lines of Spring XML would it take? The biggest thing I learned about Solr in my work thus far is that patches like these could be standalone modules in separate projects, if it weren't for having to hack the configuration and SolrJ methods up to adopt them. Which brings me to SolrJ: great API, if it would stay generic and have less concern for adding methods each time some custom collections and query support for MoreLikeThis or collapsed docs needs to be added. I personally find it silly that we customize SolrJ for all these request handlers anyway. You get a decent navigable data structure back from general SolrJ query requests as it is; there's no need to build in all these convenience methods specific to all the Solr componentry. Sure, it's convenient, but it's a maintenance headache and, as you say, not generic. But hacking configuration is reasonable, I think, for adding in plugins. I guess you're aiming for some kind of Spring-like auto-discovery of plugins? Yeah, maybe, but I'm pretty -1 on Spring coming into Solr. It's overkill and ugly, IMO. But you like it :) And that's cool by me, to each their own. Oh, and hi Mark! :) Erik -- Met vriendelijke groet, Martijn van Groningen
Re: Field Collapsing SOLR-236
Hi, oh, in that case is the code stable enough to use in production? Does it support the features which Solr 1.4 normally supports? I am using facets as a workaround, but then I am not able to sort on any other field. Is there any workaround to support this feature? Regards, Raakhi

On Fri, Jun 18, 2010 at 6:14 PM, Martijn v Groningen martijn.is.h...@gmail.com wrote: Hi Rakhi, the patch is not compatible with 1.4. If you want to work with the trunk, you'll need to get the src from https://svn.apache.org/repos/asf/lucene/dev/trunk/ Martijn

On 18 June 2010 13:46, Rakhi Khatwani rkhatw...@gmail.com wrote: Hi Moazzam, where did you get the src code from? I am downloading it from https://svn.apache.org/repos/asf/lucene/solr/branches/branch-1.4 and the latest revision in this location is 955469, so applying the latest patch (dated 17th June 2010) on it still generates errors. Any pointers? Regards, Raakhi

On Fri, Jun 18, 2010 at 1:24 AM, Moazzam Khan moazz...@gmail.com wrote: I knew it wasn't me! :) I found the patch just before I read this and applied it to the trunk, and it works! Thanks Mark and Martijn for all your help! - Moazzam

On Thu, Jun 17, 2010 at 2:16 PM, Martijn v Groningen martijn.is.h...@gmail.com wrote: I've added a new patch to the issue, so building the trunk (rev 955615) with the latest patch should not be a problem. Due to recent changes in the Lucene trunk, the patch was not compatible.

On 17 June 2010 20:20, Erik Hatcher erik.hatc...@gmail.com wrote: On Jun 16, 2010, at 7:31 PM, Mark Diggory wrote: p.s. I'd be glad to contribute our Maven build re-organization back to the community to get Solr properly Mavenized so that it can be distributed and released more often. For us, the benefit of this structure is that we will be able to overlay add-ons such as RequestHandlers and other third-party support without having to rebuild Solr from scratch. But you don't have to rebuild Solr from scratch to add a new request handler or other plugins - simply compile your custom stuff into a JAR and put it in solr-home/lib (or point to it with a lib directive in solrconfig.xml). Ideally, a Maven archetype could be created that would allow one to rapidly produce a Solr webapp and fire it up in Jetty in mere seconds. How's that any different from "cd example; java -jar start.jar"? Or do you mean a Solr client webapp? Finally, with projects such as Bobo, integration with Spring would make configuration more consistent and require significantly less Java coding just to add new capabilities every time someone authors a new RequestHandler. It's one line of config to add a new request handler. How many ridiculously ugly, confusing lines of Spring XML would it take? The biggest thing I learned about Solr in my work thus far is that patches like these could be standalone modules in separate projects, if it weren't for having to hack the configuration and SolrJ methods up to adopt them. Which brings me to SolrJ: great API, if it would stay generic and have less concern for adding methods each time some custom collections and query support for MoreLikeThis or collapsed docs needs to be added. I personally find it silly that we customize SolrJ for all these request handlers anyway. You get a decent navigable data structure back from general SolrJ query requests as it is; there's no need to build in all these convenience methods specific to all the Solr componentry. Sure, it's convenient, but it's a maintenance headache and, as you say, not generic. But hacking configuration is reasonable, I think, for adding in plugins. I guess you're aiming for some kind of Spring-like auto-discovery of plugins? Yeah, maybe, but I'm pretty -1 on Spring coming into Solr. It's overkill and ugly, IMO. But you like it :) And that's cool by me, to each their own. Oh, and hi Mark! :) Erik -- Met vriendelijke groet, Martijn van Groningen
Alternative for field collapsing
Hi, I have an index with the following fields: id (unique), title, description, price. Suppose I want to find unique documents and the count of all documents with the same title, sorted on price. How do I go about it, given that field collapsing is not stable with 1.4? If I use facets on id, it sorts either on id or on the count, but not on the price. Any suggestions? Regards, Raakhi
Re: Alternative for field collapsing
Hi, I wanted to apply field collapsing on the title (type string), but want to show only one document (and the count of such documents) per title, rather than show all the documents. Regards, Raakhi

On Tue, Jun 22, 2010 at 12:59 AM, Peter Karich peat...@yahoo.de wrote: Hi Raakhi, first, field collapsing works pretty well in our system. And, as Martin said on 17.06.2010 in the other thread "Field Collapsing SOLR-236": "I've added a new patch to the issue, so building the trunk (rev 955615) with the latest patch should not be a problem. Due to recent changes in the Lucene trunk the patch was not compatible." Second, if the id is unique, applying field collapsing makes no sense, so I suppose you will apply field collapsing to the title, right? But in this case, why doesn't a simple query like q=title:'my title'&sort=price asc work for you? Or what do you want to achieve? (The title should be of type string, I think.) Regards, Peter.

Hi, I have an index with the following fields: id (unique), title, description, price. Suppose I want to find unique documents and the count of all documents with the same title, sorted on price. How do I go about it, given that field collapsing is not stable with 1.4? If I use facets on id, it sorts either on id or on the count, but not on the price. Any suggestions? Regards, Raakhi
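A facet-based approximation of what is being asked for (field names taken from the earlier message; the sort limitation is the catch): faceting on title returns one entry per distinct title together with its document count, but the facet list itself can only be ordered by count or lexicographically (facet.sort), not by price:

```
http://localhost:8983/solr/select?q=*:*&rows=0&facet=true&facet.field=title&facet.mincount=1
```

To also pick the representative document per title sorted by price, one would still need field collapsing (or a follow-up query per title with sort=price asc).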
Re: Alternative for field collapsing
Thanks, Peter :)

On Tue, Jun 22, 2010 at 3:08 PM, Peter Karich peat...@yahoo.de wrote: Oops, sorry, I meant Martijn! Not the germanized "Martin" :-/ Peter.

Hi, I wanted to apply field collapsing on the title (type string), but want to show only one document (and the count of such documents) per title, rather than show all the documents. Regards, Raakhi

On Tue, Jun 22, 2010 at 12:59 AM, Peter Karich peat...@yahoo.de wrote: Hi Raakhi, first, field collapsing works pretty well in our system. And, as Martin said on 17.06.2010 in the other thread "Field Collapsing SOLR-236": "I've added a new patch to the issue, so building the trunk (rev 955615) with the latest patch should not be a problem. Due to recent changes in the Lucene trunk the patch was not compatible." Second, if the id is unique, applying field collapsing makes no sense, so I suppose you will apply field collapsing to the title, right? But in this case, why doesn't a simple query like q=title:'my title'&sort=price asc work for you? Or what do you want to achieve? (The title should be of type string, I think.) Regards, Peter.

Hi, I have an index with the following fields: id (unique), title, description, price. Suppose I want to find unique documents and the count of all documents with the same title, sorted on price. How do I go about it, given that field collapsing is not stable with 1.4? If I use facets on id, it sorts either on id or on the count, but not on the price. Any suggestions? Regards, Raakhi
Re: Field Collapsing SOLR-236
Hi, I tried checking out the latest code (rev 956715); the patch did not work on it. In fact, I even tried hunting for the revision mentioned earlier in this thread (i.e. rev 955615), but cannot find it in the repository (it has revision 955569 followed by revision 955785). Any pointers? Regards, Raakhi

On Tue, Jun 22, 2010 at 2:03 AM, Martijn v Groningen martijn.is.h...@gmail.com wrote: "Oh, in that case is the code stable enough to use in production?" - Well, this feature is a patch, and I think that says it all. Although bugs are fixed, it is definitely an experimental feature, and people should keep that in mind when using one of the patches. "Does it support features which Solr 1.4 normally supports?" - As far as I know, yes. "I am using facets as a workaround, but then I am not able to sort on any other field. Is there any workaround to support this feature?" - Maybe http://wiki.apache.org/solr/Deduplication prevents adding duplicates to your index, but then you miss the collapse counts and other computed values.

On 21 June 2010 09:04, Rakhi Khatwani rkhatw...@gmail.com wrote: Hi, oh, in that case is the code stable enough to use in production? Does it support the features which Solr 1.4 normally supports? I am using facets as a workaround, but then I am not able to sort on any other field. Is there any workaround to support this feature? Regards, Raakhi

On Fri, Jun 18, 2010 at 6:14 PM, Martijn v Groningen martijn.is.h...@gmail.com wrote: Hi Rakhi, the patch is not compatible with 1.4. If you want to work with the trunk, you'll need to get the src from https://svn.apache.org/repos/asf/lucene/dev/trunk/ Martijn

On 18 June 2010 13:46, Rakhi Khatwani rkhatw...@gmail.com wrote: Hi Moazzam, where did you get the src code from? I am downloading it from https://svn.apache.org/repos/asf/lucene/solr/branches/branch-1.4 and the latest revision in this location is 955469, so applying the latest patch (dated 17th June 2010) on it still generates errors. Any pointers?
Regards, Raakhi On Fri, Jun 18, 2010 at 1:24 AM, Moazzam Khan moazz...@gmail.com wrote: I knew it wasn't me! :) I found the patch just before I read this and applied it to the trunk and it works! Thanks Mark and martijn for all your help! - Moazzam On Thu, Jun 17, 2010 at 2:16 PM, Martijn v Groningen martijn.is.h...@gmail.com wrote: I've added a new patch to the issue, so building the trunk (rev 955615) with the latest patch should not be a problem. Due to recent changes in the Lucene trunk the patch was not compatible. On 17 June 2010 20:20, Erik Hatcher erik.hatc...@gmail.com wrote: On Jun 16, 2010, at 7:31 PM, Mark Diggory wrote: p.s. I'd be glad to contribute our Maven build re-organization back to the community to get Solr properly Mavenized so that it can be distributed and released more often. For us the benefit of this structure is that we will be able to overlay addons such as RequestHandlers and other third party support without having to rebuild Solr from scratch. But you don't have to rebuild Solr from scratch to add a new request handler or other plugins - simply compile your custom stuff into a JAR and put it in solr-home/lib (or point to it with lib in solrconfig.xml). Ideally, a Maven Archetype could be created that would allow one rapidly produce a Solr webapp and fire it up in Jetty in mere seconds. How's that any different than cd example; java -jar start.jar? Or do you mean a Solr client webapp? Finally, with projects such as Bobo, integration with Spring would make configuration more consistent and request significantly less java coding just to add new capabilities everytime someone authors a new RequestHandler. It's one line of config to add a new request handler. How many ridiculously ugly confusing lines of Spring XML would it take? 
The biggest thing I learned about Solr in my work thusfar is that patches like these could be standalone modules in separate projects if it weren't for having to hack the configuration and solrj methods up to adopt them. Which brings me to SolrJ, great API if it would stay generic and have less concern for adding method each time some custom collections and query support for morelikethis or collapseddocs needs to be added. I personally find it silly that we customize SolrJ for all these request handlers anyway. You get a decent navigable data structure back from general SolrJ query requests as it is, there's no need to build in all these convenience methods specific to all the Solr componetry. Sure, it's convenient, but it's a maintenance headache and as you say
Re: Field Collapsing SOLR-236
Hi,

Patching did work, but when I build the trunk I get the following exception:

[SolrTrunk]# ant compile
Buildfile: /testWorkspace/SolrTrunk/build.xml

init-forrest-entities:
    [mkdir] Created dir: /testWorkspace/SolrTrunk/build
    [mkdir] Created dir: /testWorkspace/SolrTrunk/build/web

compile-lucene:

BUILD FAILED
/testWorkspace/SolrTrunk/common-build.xml:207: /testWorkspace/modules/analysis/common does not exist.

Regards,
Raakhi

On Wed, Jun 23, 2010 at 2:39 AM, Martijn v Groningen martijn.is.h...@gmail.com wrote:

What exactly did not work? Patching, compiling or running it?
Re: Field Collapsing SOLR-236
Oops, this is probably because I didn't check out the modules directory from the trunk. Doing that right now :)

Regards,
Raakhi
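The BUILD FAILED message above complains that /testWorkspace/modules/analysis/common does not exist: the trunk build expects the modules/ tree to sit next to solr/, so checking out only the solr subdirectory is not enough. A sketch of the full checkout, assuming the svn URL Martijn gave earlier (the lucene-trunk directory name is just illustrative):

```shell
# Check out the whole dev trunk, not just the solr/ subdirectory,
# so that modules/analysis/common exists alongside solr/:
svn checkout https://svn.apache.org/repos/asf/lucene/dev/trunk/ lucene-trunk

# Then build from inside the solr directory of that checkout:
cd lucene-trunk/solr
ant compile
```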
Re: Field Collapsing SOLR-236
Hi,

But there are almost no settings in my config. Here's a snapshot of what I have in my solrconfig.xml:

<config>
  <updateHandler class="solr.DirectUpdateHandler2" />
  <requestDispatcher handleSelect="true">
    <requestParsers enableRemoteStreaming="false" multipartUploadLimitInKB="2048" />
  </requestDispatcher>
  <requestHandler name="standard" class="solr.StandardRequestHandler" default="true" />
  <requestHandler name="/update" class="solr.XmlUpdateRequestHandler" />
  <requestHandler name="/admin/" class="org.apache.solr.handler.admin.AdminHandlers" />
  <!-- config for the admin interface -->
  <admin>
    <defaultQuery>*:*</defaultQuery>
  </admin>
  <!-- config for field collapsing -->
  <searchComponent name="query" class="org.apache.solr.handler.component.CollapseComponent" />
</config>

Am I going wrong anywhere?

Regards,
Raakhi

On Wed, Jun 23, 2010 at 3:28 PM, Govind Kanshi govind.kan...@gmail.com wrote:

"fieldType: analyzer without class or tokenizer filter list" seems to point to the config - you may want to correct it.

On Wed, Jun 23, 2010 at 3:09 PM, Rakhi Khatwani rkhatw...@gmail.com wrote:

Hi,

I checked out modules and lucene from the trunk and performed a build using the following commands:

ant clean
ant compile
ant example

This compiled successfully.
I then put my existing index (using schema.xml from solr1.4.0/conf/solr/) in the multicore folder, configured solr.xml, and started the server. When I go to http://localhost:8983/solr I get the following error:

org.apache.solr.common.SolrException: Plugin init failure for [schema.xml] fieldType: analyzer without class or tokenizer filter list
    at org.apache.solr.util.plugin.AbstractPluginLoader.load(AbstractPluginLoader.java:168)
    at org.apache.solr.schema.IndexSchema.readSchema(IndexSchema.java:480)
    at org.apache.solr.schema.IndexSchema.<init>(IndexSchema.java:122)
    at org.apache.solr.core.CoreContainer.create(CoreContainer.java:429)
    at org.apache.solr.core.CoreContainer.load(CoreContainer.java:286)
    at org.apache.solr.core.CoreContainer.load(CoreContainer.java:198)
    at org.apache.solr.core.CoreContainer$Initializer.initialize(CoreContainer.java:123)
    at org.apache.solr.servlet.SolrDispatchFilter.init(SolrDispatchFilter.java:86)
    at org.mortbay.jetty.servlet.FilterHolder.doStart(FilterHolder.java:97)
    at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:50)
    at org.mortbay.jetty.servlet.ServletHandler.initialize(ServletHandler.java:662)
    at org.mortbay.jetty.servlet.Context.startContext(Context.java:140)
    at org.mortbay.jetty.webapp.WebAppContext.startContext(WebAppContext.java:1250)
    at org.mortbay.jetty.handler.ContextHandler.doStart(ContextHandler.java:517)
    at org.mortbay.jetty.webapp.WebAppContext.doStart(WebAppContext.java:467)
    at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:50)
    at org.mortbay.jetty.handler.HandlerCollection.doStart(HandlerCollection.java:152)
    at org.mortbay.jetty.handler.ContextHandlerCollection.doStart(ContextHandlerCollection.java:156)
    at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:50)
    at org.mortbay.jetty.handler.HandlerCollection.doStart(HandlerCollection.java:152)
    at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:50)
    at org.mortbay.jetty.handler.HandlerWrapper.doStart(HandlerWrapper.java:130)
    at org.mortbay.jetty.Server.doStart(Server.java:224)
    at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:50)
    at org.mortbay.xml.XmlConfiguration.main(XmlConfiguration.java:985)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.mortbay.start.Main.invokeMain(Main.java:194)
    at org.mortbay.start.Main.start(Main.java:534)
    at org.mortbay.start.Main.start(Main.java:441)
    at org.mortbay.start.Main.main(Main.java:119)
Caused by: org.apache.solr.common.SolrException: analyzer without class or tokenizer filter list
    at org.apache.solr.schema.IndexSchema.readAnalyzer(IndexSchema.java:908)
    at org.apache.solr.schema.IndexSchema.access$100(IndexSchema.java:60)
    at org.apache.solr.schema.IndexSchema$1.create(IndexSchema.java:450)
    at org.apache.solr.schema.IndexSchema$1.create(IndexSchema.java:435)
    at org.apache.solr.util.plugin.AbstractPluginLoader.load(AbstractPluginLoader.java:142)
    ... 32 more

Then I picked up an existing index (schema.xml from solr1.3/solr/conf), put it in the multicore folder, configured solr.xml, and restarted the server. Collapsing worked fine. Any pointers on which part of schema.xml (Solr 1.4) is causing this exception?

Regards,
Raakhi
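The "analyzer without class or tokenizer filter list" error means a fieldType in schema.xml declares an analyzer element that neither names an analyzer class nor contains a tokenizer. A minimal sketch of the two valid forms (the field type names and filter choices here are illustrative, not taken from the thread):

```xml
<!-- Either give the analyzer a class attribute... -->
<fieldType name="text_std" class="solr.TextField">
  <analyzer class="org.apache.lucene.analysis.standard.StandardAnalyzer"/>
</fieldType>

<!-- ...or spell out a tokenizer (plus optional filters). An analyzer
     element with neither a class nor a tokenizer triggers the
     exception shown above. -->
<fieldType name="text_ws" class="solr.TextField" positionIncrementGap="100">
  <analyzer>
    <tokenizer class="solr.WhitespaceTokenizerFactory"/>
    <filter class="solr.LowerCaseFilterFactory"/>
  </analyzer>
</fieldType>
```

Since the Solr 1.3 schema worked and the 1.4 one did not, comparing each fieldType's analyzer block between the two files should isolate the offending declaration.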
Unbuffered Exception while setting permissions
Hi,

I am trying out Solr security on my setup, following these links:
http://wiki.apache.org/solr/SolrSecurity
http://www.lucidimagination.com/search/document/d1e338dc452db2e4/how_can_i_protect_the_solr_cores

Following is my configuration:

realms.properties:

admin: admin,server-administrator,content-administrator,admin
other: OBF:1xmk1w261u9r1w1c1xmq
guest: guest,read-only
rakhi: rakhi,RW-role

jetty.xml:

...
<Set name="UserRealms">
  <Array type="org.mortbay.jetty.security.UserRealm">
    <Item>
      <New class="org.mortbay.jetty.security.HashUserRealm">
        <Set name="name">Test Realm</Set>
        <Set name="config"><SystemProperty name="jetty.home" default="."/>/etc/realm.properties</Set>
      </New>
    </Item>
  </Array>
</Set>
...

webdefault.xml:

<!-- block by default. -->
<security-constraint>
  <web-resource-collection>
    <web-resource-name>Default</web-resource-name>
    <url-pattern>/</url-pattern>
  </web-resource-collection>
  <auth-constraint/> <!-- BLOCK! -->
</security-constraint>

<!-- Setting admin access. -->
<security-constraint>
  <web-resource-collection>
    <web-resource-name>Solr authenticated application</web-resource-name>
    <url-pattern>/admin/*</url-pattern>
    <url-pattern>/core1/admin/*</url-pattern>
  </web-resource-collection>
  <auth-constraint>
    <role-name>admin</role-name>
    <role-name>FullAccess-role</role-name>
  </auth-constraint>
</security-constraint>

<!-- this constraint has no auth constraint or data constraint => allows access without auth. -->
<security-constraint>
  <web-resource-collection>
    <web-resource-name>AllowedQueries</web-resource-name>
    <url-pattern>/core1/select/*</url-pattern>
  </web-resource-collection>
</security-constraint>

<login-config>
  <auth-method>BASIC</auth-method>
  <realm-name>Test Realm</realm-name>
</login-config>

<security-role><role-name>Admin-role</role-name></security-role>
<security-role><role-name>FullAccess-role</role-name></security-role>
<security-role><role-name>RW-role</role-name></security-role>

So far everything works well: I get a forbidden exception as soon as I try to commit documents in Solr.
But when I add the following security-constraint tag in webdefault.xml:

<!-- this constraint allows access to modify the data in the SOLR service, with basic auth -->
<security-constraint>
  <web-resource-collection>
    <web-resource-name>RW</web-resource-name>
    <!-- the dataimport handler for each individual core -->
    <url-pattern>/core1/dataimport</url-pattern>
    <!-- the update handler (XML over HTTP) for each individual core -->
    <url-pattern>/core1/update/*</url-pattern>
  </web-resource-collection>
  <auth-constraint>
    <!-- Roles of users are defined in the properties file -->
    <!-- we allow users with rw-only access -->
    <role-name>RW-role</role-name>
    <!-- we allow users with full access -->
    <role-name>FullAccess-role</role-name>
  </auth-constraint>
</security-constraint>

I get the following exception:

org.apache.solr.client.solrj.SolrServerException: org.apache.commons.httpclient.ProtocolException: Unbuffered entity enclosing request can not be repeated.
    at org.apache.solr.client.solrj.impl.CommonsHttpSolrServer.request(CommonsHttpSolrServer.java:469)
    at org.apache.solr.client.solrj.impl.CommonsHttpSolrServer.request(CommonsHttpSolrServer.java:243)
    at org.apache.solr.client.solrj.request.AbstractUpdateRequest.process(AbstractUpdateRequest.java:105)
    at org.apache.solr.client.solrj.SolrServer.add(SolrServer.java:64)
    at Authentication.AuthenticationTest.main(AuthenticationTest.java:35)
Caused by: org.apache.commons.httpclient.ProtocolException: Unbuffered entity enclosing request can not be repeated.
    at org.apache.commons.httpclient.methods.EntityEnclosingMethod.writeRequestBody(EntityEnclosingMethod.java:487)
    at org.apache.commons.httpclient.HttpMethodBase.writeRequest(HttpMethodBase.java:2114)
    at org.apache.commons.httpclient.HttpMethodBase.execute(HttpMethodBase.java:1096)
    at org.apache.commons.httpclient.HttpMethodDirector.executeWithRetry(HttpMethodDirector.java:398)
    at org.apache.commons.httpclient.HttpMethodDirector.executeMethod(HttpMethodDirector.java:171)
    at org.apache.commons.httpclient.HttpClient.executeMethod(HttpClient.java:397)
    at org.apache.commons.httpclient.HttpClient.executeMethod(HttpClient.java:323)
    at org.apache.solr.client.solrj.impl.CommonsHttpSolrServer.request(CommonsHttpSolrServer.java:416)
    ... 4 more

My Java code is as follows:

public class AuthenticationTest {
    public static void main(String[] args) {
        try {
            HttpClient client = new HttpClient();
            AuthScope scope = new AuthScope(AuthScope.ANY_HOST, AuthScope.ANY_PORT);
            client.getState().setCredentials(scope,
                    new UsernamePasswordCredentials("rakhi", "rakhi"));
            SolrServer server = new CommonsHttpSolrServer(
                    "http://localhost:8983/solr/core1/", client);
            SolrQuery query = new SolrQuery();
            query.setQuery("*:*");
            QueryResponse response = server.query(query);
            System.out.println(response.getStatus());
            SolrInputDocument doc = new SolrInputDocument();
            doc.setField("aid", 0);
            doc.setField("rct", "Sample Data for authentication");
            server.add(doc);
            server.commit();
        } catch (MalformedURLException e) {
            // TODO Auto-generated catch block
            e.printStackTrace();
        } catch (SolrServerException e) {
            e.printStackTrace();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
Re: Unbuffered Exception while setting permissions
PS: I am using Solr 1.4.

Regards,
Raakhi
Re: Unbuffered Exception while setting permissions
I was going through the logs. Every time I try doing an update (and of course end up with the unbuffered exception), the log outputs the following line:

[30/Jun/2010:09:02:52 +] POST /solr/core1/update?wt=javabin&version=1 HTTP/1.1 401 1389

Regards,
Raakhi
Re: Unbuffered Exception while setting permissions
This error usually occurs when i do a server.add(inpDoc). Behind the logs: 192.168.0.106 - - [30/Jun/2010:11:30:38 +] GET /solr/GPTWPI/update?qt=%2Fupdateoptimize=truewt=javabinversion=1 HTTP/1.1 200 41 192.168.0.106 - - [30/Jun/2010:11:30:38 +] GET /solr/GPTWPI/select?q=aid%3A30234wt=javabinversion=1 HTTP/1.1 401 1389 192.168.0.106 - admin [30/Jun/2010:11:30:38 +] GET /solr/GPTWPI/select?q=aid%3A30234wt=javabinversion=1 HTTP/1.1 200 70 192.168.0.106 - - [30/Jun/2010:11:30:38 +] POST /solr/GPTWPI/update?wt=javabinversion=1 HTTP/1.1 200 41 (Works when i comment out the auth-constraint for RW) AND 192.168.0.106 - - [30/Jun/2010:11:29:09 +] POST /solr/GPTWPI/update?wt=javabinversion=1 HTTP/1.1 401 1389 (Does not work when i add the auth-constraint for RW) 192.168.0.106 - - [30/Jun/2010:11:30:38 +] GET /solr/GPTWPI/update?qt=%2Fupdatecommit=truewt=javabinversion=1 HTTP/1.1 200 41 so what i conclude is that the authentication does not work when we do a POST method and works for GET methods. correct me if i am wrong. and how do i get it working? 
Regards, Raakhi

On Wed, Jun 30, 2010 at 2:22 PM, Rakhi Khatwani rkhatw...@gmail.com wrote:

I was going through the logs. Every time I try doing an update (and of course end up with the unbuffered exception), the log outputs the following line:

[30/Jun/2010:09:02:52 +0000] POST /solr/core1/update?wt=javabin&version=1 HTTP/1.1 401 1389

Regards, Raakhi

On Wed, Jun 30, 2010 at 12:27 PM, Rakhi Khatwani rkhatw...@gmail.com wrote:

PS: I am using Solr 1.4.

Regards, Raakhi

On Wed, Jun 30, 2010 at 12:05 PM, Rakhi Khatwani rkhatw...@gmail.com wrote:

Hi, I am trying out Solr security on my setup from the following links:
http://wiki.apache.org/solr/SolrSecurity
http://www.lucidimagination.com/search/document/d1e338dc452db2e4/how_can_i_protect_the_solr_cores

Following is my configuration:

realms.properties:

admin: admin,server-administrator,content-administrator,admin
other: OBF:1xmk1w261u9r1w1c1xmq
guest: guest,read-only
rakhi: rakhi,RW-role

jetty.xml:

...
<Set name="UserRealms">
  <Array type="org.mortbay.jetty.security.UserRealm">
    <Item>
      <New class="org.mortbay.jetty.security.HashUserRealm">
        <Set name="name">Test Realm</Set>
        <Set name="config"><SystemProperty name="jetty.home" default="."/>/etc/realm.properties</Set>
      </New>
    </Item>
  </Array>
</Set>
...

webdefault.xml:

<!-- block by default. -->
<security-constraint>
  <web-resource-collection>
    <web-resource-name>Default</web-resource-name>
    <url-pattern>/</url-pattern>
  </web-resource-collection>
  <auth-constraint/> <!-- BLOCK! -->
</security-constraint>

<!-- Setting admin access. -->
<security-constraint>
  <web-resource-collection>
    <web-resource-name>Solr authenticated application</web-resource-name>
    <url-pattern>/admin/*</url-pattern>
    <url-pattern>/core1/admin/*</url-pattern>
  </web-resource-collection>
  <auth-constraint>
    <role-name>admin</role-name>
    <role-name>FullAccess-role</role-name>
  </auth-constraint>
</security-constraint>

<!-- this constraint has no auth constraint or data constraint => allows without auth. -->
<security-constraint>
  <web-resource-collection>
    <web-resource-name>AllowedQueries</web-resource-name>
    <url-pattern>/core1/select/*</url-pattern>
  </web-resource-collection>
</security-constraint>

<login-config>
  <auth-method>BASIC</auth-method>
  <realm-name>Test Realm</realm-name>
</login-config>

<security-role><role-name>Admin-role</role-name></security-role>
<security-role><role-name>FullAccess-role</role-name></security-role>
<security-role><role-name>RW-role</role-name></security-role>

So far everything works: I get a forbidden exception as soon as I try to commit documents to Solr without credentials. But when I add the following security-constraint tag in webdefault.xml:

<!-- this constraint allows access to modify the data in the SOLR service, with basic auth -->
<security-constraint>
  <web-resource-collection>
    <web-resource-name>RW</web-resource-name>
    <!-- the dataimport handler for each individual core -->
    <url-pattern>/core1/dataimport</url-pattern>
    <!-- the update handler (XML over HTTP) for each individual core -->
    <url-pattern>/core1/update/*</url-pattern>
  </web-resource-collection>
  <auth-constraint>
    <!-- Roles of users are defined in the properties file -->
    <!-- we allow users with rw-only access -->
    <role-name>RW-role</role-name>
    <!-- we allow users with full access -->
    <role-name>FullAccess-role</role-name>
  </auth-constraint>
</security-constraint>

I get the following exception:

org.apache.solr.client.solrj.SolrServerException: org.apache.commons.httpclient.ProtocolException: Unbuffered entity enclosing request can not be repeated.
at org.apache.solr.client.solrj.impl.CommonsHttpSolrServer.request(CommonsHttpSolrServer.java:469)
at org.apache.solr.client.solrj.impl.CommonsHttpSolrServer.request(CommonsHttpSolrServer.java:243)
at org.apache.solr.client.solrj.request.AbstractUpdateRequest.process(AbstractUpdateRequest.java:105
Re: Unbuffered Exception while setting permissions
Hi Lance,

Thank you so much. It worked with pre-emptive authentication.

On Thu, Jul 1, 2010 at 2:15 AM, Lance Norskog goks...@gmail.com wrote:

Other problems with this error have been solved by doing pre-emptive authentication.

On Wed, Jun 30, 2010 at 4:26 AM, Rakhi Khatwani rkhatw...@gmail.com wrote:

[quoted earlier messages trimmed; see the original "Unbuffered Exception while setting permissions" post above]
Solr Cloud/ Solr integration with zookeeper
Hi,

I want to use Solr Cloud. I downloaded the code from the trunk and successfully executed the examples as shown in the wiki, but when I try the same with multicore I cannot access:

http://localhost:8983/solr/collection1/admin/zookeeper.jsp

It says page not found. Following is my configuration of solr.xml in the multicore directory:

<solr persistent="false">
  <!-- adminPath: RequestHandler path to manage cores.
       If 'null' (or absent), cores will not be manageable via request handler -->
  <cores adminPath="/admin/cores" defaultCoreName="collection1">
    <core name="GPTWPI" instanceDir="GPTWPI" shard="shard1"/>
    <core name="BritGasAI" instanceDir="BritGasAI" shard="shard1"/>
    <!-- SolrCloud related attributes on <core>:
         collection - The name of the collection this core belongs to. Defaults to the name of the core.
         shard - which shard of the collection this core has. Defaults to a name derived from host:port_webapp_corename.
         role - TBD -->
  </cores>
</solr>

I have created zoo.cfg in the multicore directory with the following configuration:

# The number of milliseconds of each tick
tickTime=2000
# The number of ticks that the initial
# synchronization phase can take
initLimit=10
# The number of ticks that can pass between
# sending a request and getting an acknowledgement
syncLimit=5
# the directory where the snapshot is stored.
# dataDir=/opt/zookeeper/data
# NOTE: Solr defaults the dataDir to solrHome/zoo_data
# the port at which the clients will connect
# clientPort=2181
# NOTE: Solr sets this based on zkRun / zkHost params

The command I run is:

java -Dsolr.solr.home=multicore -Dcollection.configName=myconf1 -DzkRun -jar start.jar

Am I going wrong anywhere? Alternatively, can we run an external ZooKeeper server which communicates with the Solr servers (as clients) and shows the status of each Solr server?

Regards, Raakhi
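On the external-ZooKeeper alternative asked about above: Solr can point at a separately managed ZooKeeper via -DzkHost instead of embedding one with -DzkRun. A hedged sketch of a standalone configuration (the paths and port below are illustrative, not taken from the thread):

```
# zoo.cfg for a standalone external ZooKeeper
tickTime=2000
initLimit=10
syncLimit=5
dataDir=/opt/zookeeper/data
clientPort=2181

# Start ZooKeeper first, then start each Solr node pointing at it:
#   java -Dsolr.solr.home=multicore -DzkHost=localhost:2181 -jar start.jar
```

Since all nodes register their state in the same ZooKeeper, the zookeeper.jsp page on any node should then show the state of every registered Solr server.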
Re: Field Collapsing SOLR-236
Hi Moazzam,

I finally got it working. Thanks a ton, guys :)

Regards, Raakhi

On Sat, Jul 10, 2010 at 10:45 AM, Moazzam Khan moazz...@gmail.com wrote:

Hi Rakhi,

Sorry, I didn't see this email until just now. Did you get it working? If not, here are some things that might help:

- Download the patch first.
- Check the date on which the patch was released.
- Download the version of the trunk that existed at that date.
- Apply the patch using the patch program in Linux. There is a Windows program for patching but I can't remember it right now.
- After applying the patch, just compile the whole thing.

It might be better if you used the example folder first and modified the config to work for multicore (at least that's what I did). You can compile the example by doing "ant example" (if I remember correctly). For config stuff refer to this link: http://wiki.apache.org/solr/FieldCollapsing

HTH :)
- Moazzam

On Wed, Jun 23, 2010 at 7:23 AM, Rakhi Khatwani rkhatw...@gmail.com wrote:

Hi, but there are almost no settings in my config. Here's a snapshot of what I have in my solrconfig.xml:

<config>
  <updateHandler class="solr.DirectUpdateHandler2" />
  <requestDispatcher handleSelect="true">
    <requestParsers enableRemoteStreaming="false" multipartUploadLimitInKB="2048" />
  </requestDispatcher>
  <requestHandler name="standard" class="solr.StandardRequestHandler" default="true" />
  <requestHandler name="/update" class="solr.XmlUpdateRequestHandler" />
  <requestHandler name="/admin/" class="org.apache.solr.handler.admin.AdminHandlers" />
  <!-- config for the admin interface -->
  <admin>
    <defaultQuery>*:*</defaultQuery>
  </admin>
  <!-- config for field collapsing -->
  <searchComponent name="query" class="org.apache.solr.handler.component.CollapseComponent" />
</config>

Am I going wrong anywhere?

Regards, Raakhi

On Wed, Jun 23, 2010 at 3:28 PM, Govind Kanshi govind.kan...@gmail.com wrote:

"fieldType: analyzer without class or tokenizer filter list" seems to point to the config - you may want to correct it.
On Wed, Jun 23, 2010 at 3:09 PM, Rakhi Khatwani rkhatw...@gmail.com wrote:

Hi, I checked out modules/lucene from the trunk and performed a build using the following commands:

ant clean
ant compile
ant example

This compiled successfully. I then put my existing index (using schema.xml from solr1.4.0/conf/solr/) in the multicore folder, configured solr.xml and started the server. When I go to http://localhost:8983/solr I get the following error:

org.apache.solr.common.SolrException: Plugin init failure for [schema.xml] fieldType: analyzer without class or tokenizer filter list
at org.apache.solr.util.plugin.AbstractPluginLoader.load(AbstractPluginLoader.java:168)
at org.apache.solr.schema.IndexSchema.readSchema(IndexSchema.java:480)
at org.apache.solr.schema.IndexSchema.<init>(IndexSchema.java:122)
at org.apache.solr.core.CoreContainer.create(CoreContainer.java:429)
at org.apache.solr.core.CoreContainer.load(CoreContainer.java:286)
at org.apache.solr.core.CoreContainer.load(CoreContainer.java:198)
at org.apache.solr.core.CoreContainer$Initializer.initialize(CoreContainer.java:123)
at org.apache.solr.servlet.SolrDispatchFilter.init(SolrDispatchFilter.java:86)
at org.mortbay.jetty.servlet.FilterHolder.doStart(FilterHolder.java:97)
at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:50)
at org.mortbay.jetty.servlet.ServletHandler.initialize(ServletHandler.java:662)
at org.mortbay.jetty.servlet.Context.startContext(Context.java:140)
at org.mortbay.jetty.webapp.WebAppContext.startContext(WebAppContext.java:1250)
at org.mortbay.jetty.handler.ContextHandler.doStart(ContextHandler.java:517)
at org.mortbay.jetty.webapp.WebAppContext.doStart(WebAppContext.java:467)
at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:50)
at org.mortbay.jetty.handler.HandlerCollection.doStart(HandlerCollection.java:152)
at org.mortbay.jetty.handler.ContextHandlerCollection.doStart(ContextHandlerCollection.java:156)
at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:50)
at org.mortbay.jetty.handler.HandlerCollection.doStart(HandlerCollection.java:152)
at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:50)
at org.mortbay.jetty.handler.HandlerWrapper.doStart(HandlerWrapper.java:130)
at org.mortbay.jetty.Server.doStart(Server.java:224)
at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:50)
at org.mortbay.xml.XmlConfiguration.main(XmlConfiguration.java:985)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39
Using solr response json
Hi,

I want to query Solr and convert my response object to a JSON string using SolrJ. When I query from my browser (with wt=json) I get the following result:

{
 "responseHeader":{
  "status":0,
  "QTime":0},
 "response":{"numFound":0,"start":0,"docs":[]
 }}

At the moment I am using google-gson (a third-party API) to convert an object directly into a JSON string, but when I try converting a QueryResponse object into a JSON string I get:

{"_header":{"nvPairs":["status",0,"QTime",1]},"_results":[],"elapsedTime":121,"response":{"nvPairs":["responseHeader",{"nvPairs":["status",0,"QTime",1]},"response",[]]}}

Any pointers?

Regards, Raakhi
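A QueryResponse is a thin wrapper over SolrJ's NamedList internals, which is why Gson serializes fields like _header and nvPairs instead of the wt=json shape. Two common options are to request wt=json over HTTP directly, or to build the JSON yourself from the values you need. A minimal plain-Java sketch of the latter (no Solr or JSON library; the class and method names are mine, and the escaping covers only quotes and backslashes):

```java
import java.util.List;
import java.util.Map;

public class SolrJsonSketch {
    // Build the canonical Solr wt=json shape from plain values pulled out of
    // a response (e.g. QueryResponse.getStatus(), getQTime(), getResults()).
    static String toJson(int status, int qtime, long numFound, long start,
                         List<Map<String, Object>> docs) {
        StringBuilder sb = new StringBuilder();
        sb.append("{\"responseHeader\":{\"status\":").append(status)
          .append(",\"QTime\":").append(qtime).append("},")
          .append("\"response\":{\"numFound\":").append(numFound)
          .append(",\"start\":").append(start).append(",\"docs\":[");
        for (int i = 0; i < docs.size(); i++) {
            if (i > 0) sb.append(',');
            sb.append('{');
            int j = 0;
            for (Map.Entry<String, Object> e : docs.get(i).entrySet()) {
                if (j++ > 0) sb.append(',');
                sb.append('"').append(esc(e.getKey())).append("\":");
                Object v = e.getValue();
                if (v instanceof Number) sb.append(v);           // bare numbers
                else sb.append('"').append(esc(String.valueOf(v))).append('"');
            }
            sb.append('}');
        }
        sb.append("]}}");
        return sb.toString();
    }

    // Minimal string escaping: backslashes and double quotes only.
    static String esc(String s) {
        return s.replace("\\", "\\\\").replace("\"", "\\\"");
    }
}
```

For the empty result shown in the post above, calling toJson with an empty docs list reproduces the browser output modulo whitespace.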
Re: issue in adding data to a multivalued field
Hi Koji,

Thanks a ton... now it works :)

On Fri, Oct 9, 2009 at 6:02 AM, Koji Sekiguchi k...@r.email.ne.jp wrote:

Hi Rakhi,
Use multiValued (capital V), not multivalued. :)
Koji

Rakhi Khatwani wrote:

Hi, I have a small schema with some of the fields defined as:

<field name="id" type="string" indexed="true" stored="true" multiValued="false" required="true"/>
<field name="content" type="text" indexed="true" stored="true" multivalued="false" />
<field name="author_name" type="text" indexed="true" stored="false" multivalued="true"/>

where the field author_name is multivalued. However, in the UI (schema browser), the following are the details of the author_name field; it is nowhere mentioned that it is multivalued:

Field: author_name
Field Type: text
Properties: Indexed, Tokenized

When I try creating and adding a document into Solr, I get an exception:

ERROR_id1_multiple_values_encountered_for_non_multiValued_field_author_name_ninad_raakhi_goureya_sheetal

Here's my code snippet:

solrDoc17.addField("id", "id1");
solrDoc17.addField("content", "SOLR");
solrDoc17.addField("author_name", "ninad");
solrDoc17.addField("author_name", "raakhi");
solrDoc17.addField("author_name", "goureya");
solrDoc17.addField("author_name", "sheetal");
server.add(solrDoc17);
server.commit();

Any pointers?

Regards, Raakhi
Re: using regular expressions in solr query
Hi,

Well, I have a schema where I store a JSON string. Consider the small example document shown below:

<doc>
  <arr name="buzz">
    <str>{"word":"words","baseWord":"word","pos":"noun","phrase":"following are the list of words","frequency":7}</str>
    <str>{"word":"heroes","baseWord":"hero","pos":"noun","phrase":"have you watched the movie heroes?","frequency":2}</str>
    <str>{"word":"khatto","baseWord":"khatwani","pos":"noun","phrase":"khatto is a good girl","frequency":2}</str>
  </arr>
  <str name="content">content</str>
  <str name="id">1</str>
  <double name="score">0.5743896923934756</double>
  <arr name="topic">
    <str>topic1</str>
    <str>topic2</str>
    <str>topic3</str>
    <str>topic4</str>
    <str>topic5</str>
  </arr>
</doc>

Now I want to run a range query on frequency or a wildcard query on baseWord:

QUERY: buzz:baseWord:khat* => gives an exception
QUERY: buzz:"baseWord:khat*" => gives no result
QUERY: buzz:"frequency:[2 TO 10]" => gives no result
QUERY: buzz:baseWord*khatwani => gives the result (example above)
QUERY: buzz:baseWord*khat* => gives no results

Any pointers?

Regards, Raakhi

On Tue, Oct 6, 2009 at 7:15 PM, Feak, Todd todd.f...@smss.sony.com wrote:

Any particular reason for the double quotes in the 2nd and 3rd query example, but not the 1st, or is this just an artifact of your email?

-Todd

-----Original Message-----
From: Rakhi Khatwani [mailto:rkhatw...@gmail.com]
Sent: Tuesday, October 06, 2009 2:26 AM
To: solr-user@lucene.apache.org
Subject: using regular expressions in solr query

Hi, I have an example in which I want to use a regular expression in my Solr query. For example, suppose I want to search on a sample:

<str name="content">raakhi rajnish ninad goureya sheetal</str>
<str name="content">ritesh rajnish ninad goureya sheetal</str>

where my content field is of type text. When I type in:

QUERY: content:raa*
RESPONSE: <str name="content">raakhi rajnish ninad goureya sheetal</str>

QUERY: content:"ra*"
RESPONSE: 0 results

Because of this I am facing problems with the next query:

QUERY: content:"r* rajnish"
RESPONSE: 0 results

which should ideally return both results. Any pointers?

Regards, Raakhi
Re: using regular expressions in solr query
Hi,

Some more queries:

QUERY: buzz:base* => returns the desired result
QUERY: buzz:baseWord => returns the desired result
QUERY: buzz:baseWord* => returns nothing
QUERY: buzz:base*khatwani => returns nothing
QUERY: buzz:base*khat* => returns nothing

I find it kind of weird... any pointers?

Regards, Raakhi

On Fri, Oct 9, 2009 at 3:50 PM, Rakhi Khatwani rkhatw...@gmail.com wrote:

[quoted earlier messages trimmed; see the previous "using regular expressions in solr query" post above]
weird behaviour while inserting records into solr
Hi,

I was trying to insert one million records into Solr (with ids from 0 to 1,000,000). Things were fine until it had inserted up to id = 523932; after that it started inserting from 1 again (i.e. updating). I am not able to understand this behaviour. Any pointers?

Regards, Raakhi
Re: weird behaviour while inserting records into solr
Hi,

I am inserting the records one by one: first I create a SolrInputDocument, add it to Solr, and perform a commit. I loop this entire process a million times.

Regards, Raakhi

On Wed, Oct 28, 2009 at 1:45 AM, Grant Ingersoll gsing...@apache.org wrote:

On Oct 26, 2009, at 1:14 AM, Rakhi Khatwani wrote:

Hi, I was trying to insert one million records into Solr. Things were fine until it had inserted up to id = 523932; after that it started inserting from 1 again (i.e. updating). I am not able to understand this behaviour. Any pointers?

That seems pretty random. How are you inserting records?

Regards, Raakhi

--
Grant Ingersoll
http://www.lucidimagination.com/
Search the Lucene ecosystem (Lucene/Solr/Nutch/Mahout/Tika/Droids) using Solr/Lucene: http://www.lucidimagination.com/search
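Committing after every single add is a likely source of trouble here: each commit flushes the index and reopens searchers, which is very slow over a million documents. The usual fix is to buffer documents, add them in batches, and commit once at the end (or periodically). Since a compilable SolrJ example would need the Solr jars, here is a plain-Java sketch of just the buffering pattern; the BatchBuffer name and the flush callback are my own stand-ins for server.add(batch) followed by a final server.commit():

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Buffers items and hands them to a callback in fixed-size batches.
// With SolrJ, the callback body would be server.add(batch), and you would
// call server.commit() once after the final flush().
public class BatchBuffer<T> {
    private final int batchSize;
    private final Consumer<List<T>> flusher;
    private final List<T> pending = new ArrayList<>();
    private int flushCount = 0;

    public BatchBuffer(int batchSize, Consumer<List<T>> flusher) {
        this.batchSize = batchSize;
        this.flusher = flusher;
    }

    public void add(T item) {
        pending.add(item);
        if (pending.size() >= batchSize) flush();
    }

    // Flush whatever is pending; call once more at the end of the loop.
    public void flush() {
        if (pending.isEmpty()) return;
        flusher.accept(new ArrayList<>(pending));
        pending.clear();
        flushCount++;
    }

    public int getFlushCount() { return flushCount; }
}
```

With a batch size of a few hundred to a few thousand, the million-document loop issues a few hundred add requests and a single commit instead of a million of each.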
Representing a complex schema in solr
Hi,

I have a complex schema, as shown below:

Book
- Title
- Category
- Publication
- Edition
- Publish Date
- Author (multivalued) => Author is a multivalued field containing the following attributes:
  - Name
  - Age
  - Location
  - Gender
  - Qualification

I want to store the above information in Solr so that I can query on every aspect. Two small query examples would be:

1. Search for all the books written by females.
2. Search for all books written by young authors, for example between the ages of 22 and 30.

I wouldn't want to use an RDBMS because I have more than one million documents like this. I also tried saving the author as a JSON string, but then I cannot use wildcard and range queries on it. Any suggestions on how I would represent something like this in Solr?

Regards, Raakhi
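One way to keep those queries expressible without an RDBMS or JSON strings is to flatten each author attribute into its own multiValued field, with values kept position-aligned by the indexing code. A hedged schema.xml sketch (the field names are illustrative, and "sint" is the sortable-int type from the 1.x example schema):

```xml
<!-- schema.xml sketch: parallel multiValued fields for author attributes -->
<field name="title"           type="text"   indexed="true" stored="true"/>
<field name="author_name"     type="text"   indexed="true" stored="true" multiValued="true"/>
<field name="author_age"      type="sint"   indexed="true" stored="true" multiValued="true"/>
<field name="author_location" type="string" indexed="true" stored="true" multiValued="true"/>
<field name="author_gender"   type="string" indexed="true" stored="true" multiValued="true"/>
```

With this layout, author_gender:female answers query 1 and author_age:[22 TO 30] answers query 2. The caveat is that constraints across these fields are not correlated per author within one book; if you need "a female author aged 22 to 30" as a single author, the cleaner model is one Solr document per (book, author) pair carrying a book id for grouping.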
Solr Queries
Hi,

I am using Solr 1.3 and I have inserted some data into my comment field. For example:

For document1:
<str name="comment">The iPhone 3GS finally adds common cell phone features like multimedia messaging, video recording, and voice dialing. It runs faster; its promised battery life is longer; and the multimedia quality continues to shine. The iPhone 3GS' call quality shows no improvements and the 3G signal reception remains uneven. We still don't get Flash Lite, USB transfer and storage, or multitasking.</str>

For document2:
<str name="comment">Sony Ericsson c510 has 3.2MP cybershot camera with smile detectino. Amazing phone, faster than Sony Ericsson w580iSony Ericcsion w580i camera is only 2MP with no autofocus and smile detection. it doesnot even have a flash leading to poor quality pictures</str>

A] When I apply the following queries, I get 0 hits:
1. comment:iph*e
2. comment:iph?ne

B] Can I apply range queries on part of the content?

C] Can I apply more than one wildcard in a query? For example comment:ip*h* (this command works, but it's equivalent to just using ipho*).

D] For fuzzy queries: content:iphone~0.7 returns both documents; content:iphone~0.8 returns no documents (similarly for 0.9). However, if I change it to iPhone, content:iPhone~0.7 returns 0 documents and content:iPhone~0.5 returns both documents. Is fuzzy search case sensitive? Even if it is, why am I not able to retrieve the expected results?

Regards, Raakhi
Re: Solr Queries
Hi,

Sorry, I forgot to mention that the comment field is a text field.

Regards, Raakhi

On Thu, Nov 12, 2009 at 8:05 PM, Grant Ingersoll gsing...@apache.org wrote (answers inline; the quoted question is trimmed, see the full post above):

> A] When I apply comment:iph*e or comment:iph?ne, I get 0 hits.

What field type are you using? This is in your schema.xml.

> B] Can I apply range queries on part of the content?
> C] Can I apply more than one wildcard in a query, e.g. comment:ip*h*?

Yes.

> D] content:iphone~0.7 returns both documents; content:iphone~0.8 returns no documents (similarly for 0.9).

The fuzz factor there incorporates the edit distance. I gather the first Sony doc has a match on "phone" and the score is between 0.7 and 0.8. You can add debugQuery=true to see the explains.

> However, if I change it to iPhone, content:iPhone~0.7 returns 0 documents and content:iPhone~0.5 returns both documents. Is fuzzy search case sensitive? Even if it is, why am I not able to retrieve the expected results?

Again, this all comes back to how you analyze the documents, based on what field type you are using.
-- Grant Ingersoll http://www.lucidimagination.com/ Search the Lucene ecosystem (Lucene/Solr/Nutch/Mahout/Tika/Droids) using Solr/Lucene: http://www.lucidimagination.com/search
locks in solr
Hi,

Is there any article which explains the locks in Solr? There is some info in solrconfig.xml which says that you can set the lock type to none (NoLockFactory), single (SingleInstanceLockFactory), native (NativeFSLockFactory), or simple (SimpleFSLockFactory, which locks by creating a plain lock file). Suppose my index dir has the following files:

_2s.fdt, _2s.fdx, _2s.fnm, _2s.frq, _2t.fnm, _2t.frq, _2t.nrm, _2t.prx,
_2u.nrm, _2u.prx, _2u.tii, _2u.tis, _2v.tii, _2v.tis, _2w.fdt, _2w.fdx,
_2x.fdt, _2x.fdx, _2x.fnm, _2x.frq, _2y.fnm, _2y.frq, _2y.nrm, _2y.prx,
_2z.nrm, _2z.prx, _2z.tii, _2z.tis, _30.tii, _30.tis, segments_2s, segments.gen

1.) I assume for each of these files there is a lock. Please correct me if I am wrong.
2.) What are the different lock types in terms of reads/writes/updates?
3.) Can we have a document-level locking scheme?
4.) What is the best way to handle multiple simultaneous writes to the index?

Thanks a ton, Raakhi
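On question 1: the lock is not per segment file. Lucene takes a single write lock per index directory (a write.lock file), held by the one IndexWriter allowed at a time; readers do not take locks, and there is no document-level locking. The lock factory only controls how that one lock is implemented. A sketch of where this is configured, based on the Solr 1.x example config:

```xml
<!-- solrconfig.xml: the write lock covers the whole index directory -->
<mainIndex>
  <!-- none | single | simple | native -->
  <lockType>native</lockType>
  <!-- clear a stale lock left behind by a crashed writer (use with care) -->
  <unlockOnStartup>false</unlockOnStartup>
</mainIndex>
```

For question 4, the usual pattern is to let all writers send updates to the same Solr instance, which serializes them internally through its single IndexWriter, rather than opening multiple writers on one index directory.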
Converting java date to solr date and querying dates
Hi,

I am using SolrJ. I want to store dates in a date field called publish_date in Solr. How do we do that using SolrJ? Likewise, how do we query Solr using a Java Date? Do we always have to convert it into a UTC value and then query on that? And how do I query Solr for documents published on a Monday, or for documents published in March, etc., or for that matter apply range queries on the date?

Regards, Raakhi
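Solr's date fields expect the ISO-8601 UTC form (e.g. 1995-12-31T23:59:59Z) on both indexing and querying, so converting through UTC is indeed required. A small sketch using only the JDK (the class name is mine; SolrJ's input documents generally also accept java.util.Date values directly and handle this formatting for you):

```java
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.Locale;
import java.util.TimeZone;

public class SolrDateUtil {
    // Solr date fields use ISO-8601 in UTC, e.g. 1995-12-31T23:59:59Z
    private static final String PATTERN = "yyyy-MM-dd'T'HH:mm:ss'Z'";

    // Format a java.util.Date as a Solr date string (always in UTC).
    public static String toSolrDate(Date d) {
        SimpleDateFormat f = new SimpleDateFormat(PATTERN, Locale.US);
        f.setTimeZone(TimeZone.getTimeZone("UTC"));
        return f.format(d);
    }

    // Parse a Solr date string back into a java.util.Date.
    public static Date fromSolrDate(String s) throws ParseException {
        SimpleDateFormat f = new SimpleDateFormat(PATTERN, Locale.US);
        f.setTimeZone(TimeZone.getTimeZone("UTC"));
        return f.parse(s);
    }
}
```

A range query then looks like publish_date:[2010-03-01T00:00:00Z TO 2010-03-31T23:59:59Z]. "Every Monday" or "every March" across all years is not expressible as a single date range; the usual approach is to index derived fields (day-of-week, month) at write time and query those instead.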
Solr Queries
Hi,

Suppose I have a content field of type text. An example of the content field is shown below:

"After frustrated waiting period to get my credit card from the ICICI Bank, today I decided to write them a online petition stating my problem... Below is the unedited version of letter I sent to ICICI..."

1. Can I use proximity search for the two phrases "frustrated waiting" and "credit card"? I want to check whether "frustrated waiting" and "credit card" occur within 10 words of each other, where each is treated as an exact phrase, i.e. searched as a whole unit rather than as two separate words in different parts of the document. Does Solr support this kind of operation? If so, how do we structure the query, or could you give me an example?

Thanks, Raakhi Khatwani.
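The standard Lucene/Solr query parser's slop operator applies to a single phrase, so the closest built-in approximation is to put all four terms into one sloppy phrase query; this finds the terms near each other but does not force each pair to stay adjacent as an exact sub-phrase. A hedged example against the field above:

```
content:"frustrated waiting credit card"~10
```

For strict phrase-within-distance-of-phrase semantics you would need span queries (for instance Lucene's SpanNearQuery nesting two zero-slop SpanNearQuerys with a slop of 10 between them), which the stock Solr 1.x query parser does not expose; it requires custom query construction at the Lucene level.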
Starting Jetty Server using JettySolrRunner
Hi,

I am trying to run a Solr server using JettySolrRunner; however, I keep getting the following exception:

Can't find resource 'solrconfig.xml' in classpath or 'solr/conf/', cwd=/home/ithurs/shellworkspace/SolrPOC
at org.apache.solr.core.SolrResourceLoader.openResource(SolrResourceLoader.java:260)
at org.apache.solr.core.SolrResourceLoader.openConfig(SolrResourceLoader.java:228)
at org.apache.solr.core.Config.<init>(Config.java:101)
at org.apache.solr.core.SolrConfig.<init>(SolrConfig.java:130)
at org.apache.solr.core.CoreContainer$Initializer.initialize(CoreContainer.java:134)
at org.apache.solr.servlet.SolrDispatchFilter.init(SolrDispatchFilter.java:83)
at org.mortbay.jetty.servlet.FilterHolder.doStart(FilterHolder.java:99)
at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:40)
at org.mortbay.jetty.servlet.ServletHandler.initialize(ServletHandler.java:594)
at org.mortbay.jetty.servlet.Context.startContext(Context.java:139)
at org.mortbay.jetty.handler.ContextHandler.doStart(ContextHandler.java:500)
at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:40)
at org.mortbay.jetty.handler.HandlerWrapper.doStart(HandlerWrapper.java:117)
at org.mortbay.jetty.Server.doStart(Server.java:210)
at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:40)
at org.apache.solr.client.solrj.embedded.JettySolrRunner.start(JettySolrRunner.java:99)
at org.apache.solr.client.solrj.embedded.JettySolrRunner.start(JettySolrRunner.java:93)
at com.germinait.solr.jetty.StartStopJetty.main(StartStopJetty.java:9)

Jan 27, 2010 4:48:56 PM org.apache.solr.core.CoreContainer finalize
SEVERE: CoreContainer was not shutdown prior to finalize(), indicates a bug -- POSSIBLE RESOURCE LEAK!!!
Jan 27, 2010 4:48:56 PM org.apache.solr.common.SolrException log SEVERE: java.lang.RuntimeException: Can't find resource 'solrconfig.xml' in classpath or 'solr/conf/', cwd=/home/ithurs/shellworkspace/SolrPOC at org.apache.solr.core.SolrResourceLoader.openResource(SolrResourceLoader.java:260) at org.apache.solr.core.SolrResourceLoader.openConfig(SolrResourceLoader.java:228) at org.apache.solr.core.Config.init(Config.java:101) at org.apache.solr.core.SolrConfig.init(SolrConfig.java:130) at org.apache.solr.core.CoreContainer$Initializer.initialize(CoreContainer.java:134) at org.apache.solr.servlet.SolrDispatchFilter.init(SolrDispatchFilter.java:83) at org.mortbay.jetty.servlet.FilterHolder.doStart(FilterHolder.java:99) at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:40) at org.mortbay.jetty.servlet.ServletHandler.initialize(ServletHandler.java:594) at org.mortbay.jetty.servlet.Context.startContext(Context.java:139) at org.mortbay.jetty.handler.ContextHandler.doStart(ContextHandler.java:500) at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:40) at org.mortbay.jetty.handler.HandlerWrapper.doStart(HandlerWrapper.java:117) at org.mortbay.jetty.Server.doStart(Server.java:210) at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:40) at org.apache.solr.client.solrj.embedded.JettySolrRunner.start(JettySolrRunner.java:99) at org.apache.solr.client.solrj.embedded.JettySolrRunner.start(JettySolrRunner.java:93) at com.germinait.solr.jetty.StartStopJetty.main(StartStopJetty.java:9) Jan 27, 2010 4:48:56 PM org.apache.solr.servlet.SolrDispatchFilter init INFO: SolrDispatchFilter.init() done Jan 27, 2010 4:48:56 PM sun.reflect.NativeMethodAccessorImpl invoke0 WARNING: failed SocketConnector @ 0.0.0.0:8983 java.net.BindException: Address already in use at java.net.PlainSocketImpl.socketBind(Native Method) at java.net.PlainSocketImpl.bind(PlainSocketImpl.java:359) at java.net.ServerSocket.bind(ServerSocket.java:319) at 
java.net.ServerSocket.init(ServerSocket.java:185) at java.net.ServerSocket.init(ServerSocket.java:141) at org.mortbay.jetty.bio.SocketConnector.newServerSocket(SocketConnector.java:78) at org.mortbay.jetty.bio.SocketConnector.open(SocketConnector.java:72) at org.mortbay.jetty.AbstractConnector.doStart(AbstractConnector.java:252) at org.mortbay.jetty.bio.SocketConnector.doStart(SocketConnector.java:145) at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:40) at org.mortbay.jetty.Server.doStart(Server.java:221) at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:40) at org.apache.solr.client.solrj.embedded.JettySolrRunner.start(JettySolrRunner.java:99) at org.apache.solr.client.solrj.embedded.JettySolrRunner.start(JettySolrRunner.java:93) at com.germinait.solr.jetty.StartStopJetty.main(StartStopJetty.java:9) Jan 27, 2010 4:48:56 PM sun.reflect.NativeMethodAccessorImpl invoke0 Is there any way to specify the current working directory? And what if we have a multicore setup with several cores, where each core has a solrconfig.xml in its conf folder; how would we start a Jetty server from the API in that case? Regards, Raakhi Khatwani
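For the working-directory question: Solr resolves its home directory from the solr.solr.home system property (falling back to ./solr under the current working directory), so setting that property before constructing the runner removes the dependence on cwd; pointing it at a multicore home (one containing solr.xml and the per-core conf folders) covers the multicore case as well. A minimal sketch; the path is an assumption, and the commented lines need the Solr and Jetty jars on the classpath:

```java
public class SolrHomeExample {
    /** Sets solr.solr.home and returns the resolved value. */
    static String configureSolrHome(String home) {
        System.setProperty("solr.solr.home", home);
        return System.getProperty("solr.solr.home");
    }

    public static void main(String[] args) {
        // Hypothetical path; adjust to your (multicore) Solr home directory.
        String home = configureSolrHome("/opt/solr/multicore");

        // With the property set, the runner no longer needs cwd to be the
        // example directory (requires the Solr and Jetty jars):
        // JettySolrRunner runner = new JettySolrRunner("/solr", 8983);
        // runner.start();

        System.out.println(home);
    }
}
```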
Upgrading from solr1.3 to solr1.4
Hi, I have indexed some data on Solr 1.3.0. Now I want to upgrade to Solr 1.4.0 but keep the same data, so here are the steps I performed: 1. extracted Solr 1.4.0; 2. copied the conf and data folders of my index from solr1.3.0/examples/multicore to solr1.4.0/examples/multicore/; 3. started the multicore server. However, my queries on the index take longer than with Solr 1.3.0. Any suggestions? Regards, Raakhi
Re: Upgrading from solr1.3 to solr1.4
Hi, Solr home: 1.3.0/examples/multicore. Type of queries: recursive. E.g. I search the index for some name, which returns some rows. Each row has a field called parentid, which is the unique key of some other row in the index. The next query searches the index for that parentid, and this continues until parentid is 1. I am using the SolrJ API to run the queries, and the time is measured with the Java API (System.currentTimeMillis()). An example that takes less than 1 sec when queried on Solr 1.3 takes 10 sec on Solr 1.4. For the migration I copied the core folder (which contains the conf and data folders) from the 1.3 setup to the 1.4 setup. Is there anything else to be done for the migration? Regards, Raakhi On Wed, Feb 17, 2010 at 7:54 AM, Chris Hostetter hossman_luc...@fucit.org wrote: : i have indexed some data on solr 1.3.0. Now i wanna upgrade to solr : 1.4.0 but on the same data. : so here are the following steps i performed: : 1. extract solr 1.4.0 : 2. copied the conf and data folder of my index from solr : 1.3.0/examples/multicore to solr1.4.0/examples/multicore/ : 3. started the multicore server. : but my queries on the index takes longer than with solr 1.3.0 : any suggesstions?? I assume you were using 1.3.0/examples/multicore as your solr home in 1.3? (it seems that way, but i want to be sure) can you elaborate on what you mean by my queries on the index takes longer ... specifics would be very helpful: ie: what types of queries?, how long do they take?, how are you measuring? -Hoss
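One common cause of queries being slow right after a migration is simply that the new instance starts with cold caches. solrconfig.xml lets you register warming queries that run before a searcher is exposed; a minimal sketch (the query and row count below are placeholders, not taken from this setup):

```xml
<!-- in solrconfig.xml: warm the very first searcher after startup -->
<listener event="firstSearcher" class="solr.QuerySenderListener">
  <arr name="queries">
    <lst>
      <str name="q">parentid:1</str>
      <str name="rows">10</str>
    </lst>
  </arr>
</listener>
```

The same listener can be registered for the newSearcher event so caches are warmed after each commit as well.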
Want to create custom inputformat to read from solr
Hi, Has anyone tried creating a custom InputFormat which reads from a Solr index for processing with MapReduce? Is that possible, and how? Regards, Raakhi
solr distributed search example - exception
Hi, I was executing the simple example demonstrating distributed search, provided at the following link: http://wiki.apache.org/solr/DistributedSearch However, when I start up the server on both ports, 8983 and 7574, I get the following exception: SEVERE: Could not start SOLR. Check solr/home property java.lang.ClassCastException: java.util.ArrayList cannot be cast to org.w3c.dom.NodeList at org.apache.solr.search.CacheConfig.getMultipleConfigs(CacheConfig.java:61) at org.apache.solr.core.SolrConfig.init(SolrConfig.java:131) at org.apache.solr.core.SolrConfig.init(SolrConfig.java:70) at org.apache.solr.core.CoreContainer$Initializer.initialize(CoreContainer.java:117) at org.apache.solr.servlet.SolrDispatchFilter.init(SolrDispatchFilter.java:69) at org.mortbay.jetty.servlet.FilterHolder.doStart(FilterHolder.java:99) at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:40) at org.mortbay.jetty.servlet.ServletHandler.initialize(ServletHandler.java:594) at org.mortbay.jetty.servlet.Context.startContext(Context.java:139) at org.mortbay.jetty.webapp.WebAppContext.startContext(WebAppContext.java:1218) at org.mortbay.jetty.handler.ContextHandler.doStart(ContextHandler.java:500) at org.mortbay.jetty.webapp.WebAppContext.doStart(WebAppContext.java:448) at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:40) at org.mortbay.jetty.handler.HandlerCollection.doStart(HandlerCollection.java:147) at org.mortbay.jetty.handler.ContextHandlerCollection.doStart(ContextHandlerCollection.java:161) at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:40) at org.mortbay.jetty.handler.HandlerCollection.doStart(HandlerCollection.java:147) at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:40) at org.mortbay.jetty.handler.HandlerWrapper.doStart(HandlerWrapper.java:117) at org.mortbay.jetty.Server.doStart(Server.java:210) at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:40) at 
org.mortbay.xml.XmlConfiguration.main(XmlConfiguration.java:929) at java.lang.reflect.Method.invoke(libgcj.so.7rh) at org.mortbay.start.Main.invokeMain(Main.java:183) at org.mortbay.start.Main.start(Main.java:497) at org.mortbay.start.Main.main(Main.java:115) 2009-06-08 18:36:28.016::WARN: failed SolrRequestFilter java.lang.NoClassDefFoundError: org.apache.solr.core.SolrCore at java.lang.Class.initializeClass(libgcj.so.7rh) at org.apache.solr.servlet.SolrDispatchFilter.init(SolrDispatchFilter.java:77) at org.mortbay.jetty.servlet.FilterHolder.doStart(FilterHolder.java:99) at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:40) at org.mortbay.jetty.servlet.ServletHandler.initialize(ServletHandler.java:594) at org.mortbay.jetty.servlet.Context.startContext(Context.java:139) at org.mortbay.jetty.webapp.WebAppContext.startContext(WebAppContext.java:1218) at org.mortbay.jetty.handler.ContextHandler.doStart(ContextHandler.java:500) at org.mortbay.jetty.webapp.WebAppContext.doStart(WebAppContext.java:448) at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:40) at org.mortbay.jetty.handler.HandlerCollection.doStart(HandlerCollection.java:147) at org.mortbay.jetty.handler.ContextHandlerCollection.doStart(ContextHandlerCollection.java:161) at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:40) at org.mortbay.jetty.handler.HandlerCollection.doStart(HandlerCollection.java:147) at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:40) at org.mortbay.jetty.handler.HandlerWrapper.doStart(HandlerWrapper.java:117) at org.mortbay.jetty.Server.doStart(Server.java:210) at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:40) at org.mortbay.xml.XmlConfiguration.main(XmlConfiguration.java:929) at java.lang.reflect.Method.invoke(libgcj.so.7rh) at org.mortbay.start.Main.invokeMain(Main.java:183) at org.mortbay.start.Main.start(Main.java:497) at org.mortbay.start.Main.main(Main.java:115) 
Caused by: java.lang.ClassNotFoundException: org.apache.solr.core.JmxMonitoredMap not found in StartLoader[file:/home/ithurs/apache-solr-1.3.0/example7574/, file:/home/ithurs/apache-solr-1.3.0/example7574/lib/jetty-6.1.3.jar, file:/home/ithurs/apache-solr-1.3.0/example7574/lib/jetty-util-6.1.3.jar, file:/home/ithurs/apache-solr-1.3.0/example7574/lib/servlet-api-2.5-6.1.3.jar] at java.net.URLClassLoader.findClass(libgcj.so.7rh) at java.lang.ClassLoader.loadClass(libgcj.so.7rh) at java.lang.ClassLoader.loadClass(libgcj.so.7rh) at org.mortbay.jetty.webapp.WebAppClassLoader.loadClass(WebAppClassLoader.java:375) at org.mortbay.jetty.webapp.WebAppClassLoader.loadClass(WebAppClassLoader.java:337) at java.lang.Class.forName(libgcj.so.7rh) at java.lang.Class.initializeClass(libgcj.so.7rh) ...22 more 2009-06-08
Re: solr distributed search example - exception
Hi Mark, Yes, I would like to open a JIRA issue for it. How do I go about that? Regards, Raakhi On Mon, Jun 8, 2009 at 7:58 PM, Mark Miller markrmil...@gmail.com wrote: That is a very odd cast exception to get. Do you want to open a JIRA issue for this? It looks like an odd exception because the call is: NodeList nodes = (NodeList)solrConfig.evaluate(configPath, XPathConstants.NODESET); // cast exception if we get an ArrayList rather than NodeList Which leads to: Object o = xpath.evaluate(xstr, doc, type); where type = XPathConstants.NODESET So you get back an Object based on the XPathConstant passed. There does not appear to be a value that would return an ArrayList. Using XPathConstants.NODESET gets you a NodeList according to the XPath API. I'm not sure what could cause this to happen. - Mark Rakhi Khatwani wrote: Hi, I was executing a simple example which demonstrates DistributedSearch. example provided in the following link: http://wiki.apache.org/solr/DistributedSearch however, when i startup the server in both port nos: 8983 and 7574, i get the following exception: SEVERE: Could not start SOLR. 
[...]
Re: solr distributed search example - exception
Hi Mark, I actually got this error because I was using an old version of Java; the problem is solved now. Thanks anyway, Raakhi On Tue, Jun 9, 2009 at 11:17 AM, Rakhi Khatwani rkhatw...@gmail.com wrote: Hi Mark, yea i would like to open a JIRA issue for it. how do i go about that? Regards, Raakhi On Mon, Jun 8, 2009 at 7:58 PM, Mark Miller markrmil...@gmail.com wrote: That is a very odd cast exception to get. Do you want to open a JIRA issue for this? It looks like an odd exception because the call is: NodeList nodes = (NodeList)solrConfig.evaluate(configPath, XPathConstants.NODESET); // cast exception if we get an ArrayList rather than NodeList Which leads to: Object o = xpath.evaluate(xstr, doc, type); where type = XPathConstants.NODESET So you get back an Object based on the XPathConstant passed. There does not appear to be a value that would return an ArrayList. Using XPathConstants.NODESET gets you a NodeList according to the XPath API. I'm not sure what could cause this to happen. - Mark Rakhi Khatwani wrote: Hi, I was executing a simple example which demonstrates DistributedSearch. example provided in the following link: http://wiki.apache.org/solr/DistributedSearch however, when i startup the server in both port nos: 8983 and 7574, i get the following exception: SEVERE: Could not start SOLR. 
[...]
solr in distributed mode
Hi, I was looking for ways to use Solr in distributed mode. Is there any way we can use Solr indexes across machines, or by using the Hadoop Distributed File System? It has been mentioned in the wiki that "When an index becomes too large to fit on a single system, or when a single query takes too long to execute, an index can be split into multiple shards, and Solr can query and merge results across those shards." What I understand is that shards are partitions. Are shards on the same machine, or can they be on different machines? Do we have to manually split the indexes to store them in different shards? Do you have an example or a tutorial which demonstrates distributed index searching/storing using shards? Regards, Raakhi
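To the questions above: shards can live on different machines, the index does have to be partitioned by your own indexing logic (documents are not split automatically), and at query time you list every shard in the shards parameter of an otherwise normal query, which the receiving server fans out and merges. A sketch of building such a query URL; the host names are made up:

```java
public class ShardsQueryExample {
    /**
     * Builds a distributed-search URL: the query goes to the first shard,
     * and the shards parameter lists every partition to search.
     */
    static String distributedQueryUrl(String query, String... shards) {
        return "http://" + shards[0] + "/select?q=" + query
                + "&shards=" + String.join(",", shards);
    }

    public static void main(String[] args) {
        // Hypothetical two-machine setup: node-a and node-b each hold a shard.
        String url = distributedQueryUrl("ipod",
                "node-a:8983/solr", "node-b:7574/solr");
        System.out.println(url);
    }
}
```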
Re: solr in distributed mode
Hi, I went through the document: http://www.lucidimagination.com/Community/Hear-from-the-Experts/Articles/Scaling-Lucene-and-Solr and I have a couple of questions:
1. The document mentions that "There will be a 'master' server for each shard and then 1-n 'slaves' that are replicated from the master." How is the replication process done? Suppose I have two machines, nodeA and nodeB, and I edited scripts.conf in solr/conf on both nodeA and nodeB to point to the master (i.e. nodeA).
i) Is this the right approach for setting up a master/slave configuration?
ii) To start the master/slave configuration, should I execute start.jar on both nodes, or just on the master node?
iii) Are indexes automatically replicated when you insert/update them on the master, or do we have to run a script for that?
iv) How do I know whether the replication process was carried out successfully?
v) Suppose the master goes down. How do I perform a node failover, for example making one of the slaves the master, without disrupting my application?
2. It is also mentioned that: "With distribution and replication, none of the master shards know about each other. You index to each master, the index is replicated to each slave, and then searches are distributed across the slaves, using one slave from each master/slave shard."
i) Are slaves used only for index replication? I mean, can't I have indexes distributed across slaves so that when I perform a search, it searches across all slaves?
ii) Since none of the shards have any information about one another, if I update/delete a document based on a term, how does the index get updated across all shards? Or do we have to merge, update/delete, and then distribute it across the shards?
Regards, Raakhi
"In a distributed configuration, one server 'shard' will get a query request and then search itself, as well as the other shards in the configuration, and return the combined results from each shard."
On Wed, Jun 10, 2009 at 11:23 AM, Otis Gospodnetic otis_gospodne...@yahoo.com wrote: Hello, All of this is covered on the Wiki, search for: distributed search Otis -- Sematext -- http://sematext.com/ -- Lucene - Solr - Nutch - Original Message From: Rakhi Khatwani rkhatw...@gmail.com To: solr-user@lucene.apache.org Cc: ninad.r...@germinait.com; ranjit.n...@germinait.com; saurabh.maha...@germinait.com Sent: Tuesday, June 9, 2009 4:55:55 AM Subject: solr in distributed mode [...]
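For reference on the replication questions above: besides the script-based setup, Solr 1.4 ships a Java-based ReplicationHandler configured in solrconfig.xml, where the slave polls the master on its own, so no external script is needed. A minimal sketch; the host name, poll interval, and conf file list are assumptions:

```xml
<!-- on the master's solrconfig.xml -->
<requestHandler name="/replication" class="solr.ReplicationHandler">
  <lst name="master">
    <str name="replicateAfter">commit</str>
    <str name="confFiles">schema.xml,stopwords.txt</str>
  </lst>
</requestHandler>

<!-- on each slave's solrconfig.xml -->
<requestHandler name="/replication" class="solr.ReplicationHandler">
  <lst name="slave">
    <str name="masterUrl">http://nodeA:8983/solr/replication</str>
    <str name="pollInterval">00:00:60</str>
  </lst>
</requestHandler>
```

With this in place, both master and slave are started normally (start.jar on each node), and the handler's status page shows whether replication succeeded.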
Few Queries regarding indexes in Solr
Hi,
1. Is it possible to query from another index folder (say index1) in Solr?
2. Is it possible to query two indexes (folders index1 and index2) stored on the same machine, using the same port, on a single Solr instance?
3. Consider this case: I have indexes in two shards, and I merge them onto a third shard. Now I add more documents to shard1, delete some documents from shard2, and update the indexes. Is it possible to send only the differences to shard3 and then merge them at shard3?
Regards, Raakhi
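On question 3: as of Solr 1.4 the CoreAdmin API can merge complete indexes into a target core, but there is no built-in incremental merge, so it cannot ship only the differences; after shard1 and shard2 change, the merge has to be redone (or the same updates re-applied on shard3). A sketch of the merge call, where the host, core name, and paths are hypothetical and the source indexes must not be receiving writes during the merge:

```
http://localhost:8983/solr/admin/cores?action=mergeindexes&core=shard3&indexDir=/data/shard1/index&indexDir=/data/shard2/index
```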
Re: Distributed querying using solr multicore.
On Thu, Jun 18, 2009 at 3:51 PM, Michael Ludwig m...@as-guides.com wrote: Rakhi Khatwani schrieb: [...] how do we do a distributed search across multicores? is it just like how we query using multiple shards? I don't know how we're supposed to use it. I did the following: http://flunder:8983/solr/xpg/select?q=bla&shards=flunder:8983/solr/xpg,flunder:8983/solr/kk I am getting a page load error... cannot find server For SolrJ, see this thread: Using SolrJ with multicore/shards - ahammad http://markmail.org/thread/qnytfrk4dytmgjis if so, isn't there a better way to do that? No idea. Michael Ludwig
Re: Distributed querying using solr multicore.
Hi Michael, Sorry for the misinterpretation. In that case, it's the same as querying multiple shards. :) Thanks, Raakhi On Thu, Jun 18, 2009 at 4:09 PM, Michael Ludwig m...@as-guides.com wrote: Rakhi Khatwani schrieb: On Thu, Jun 18, 2009 at 3:51 PM, Michael Ludwig m...@as-guides.com wrote: I don't know how we're supposed to use it. I did the following: http://flunder:8983/solr/xpg/select?q=bla&shards=flunder:8983/solr/xpg,flunder:8983/solr/kk I am getting a page load error... cannot find server This is not a public server, just an example for the syntax I found by trial and error. Michael Ludwig
java.lang.ClassCastException: java.lang.Long cannot be cast to org.apache.solr.common.util.NamedList in solr
Hi, I am integrating Solr with Hadoop, so I wrote a MapReduce job which writes the indexes to HDFS. The map methods work fine, but in my reduce method I call solrServer to update the indexes, and when I try accessing solrServer I get the following exception: java.lang.ClassCastException: java.lang.Long cannot be cast to org.apache.solr.common.util.NamedList at org.apache.solr.common.util.NamedListCodec.unmarshal(NamedListCodec.java:89) at org.apache.solr.client.solrj.impl.BinaryResponseParser.processResponse(BinaryResponseParser.java:39) at org.apache.solr.client.solrj.impl.CommonsHttpSolrServer.request(CommonsHttpSolrServer.java:385) at org.apache.solr.client.solrj.impl.CommonsHttpSolrServer.request(CommonsHttpSolrServer.java:183) at org.apache.solr.client.solrj.request.UpdateRequest.process(UpdateRequest.java:217) at org.apache.solr.client.solrj.SolrServer.add(SolrServer.java:48) at org.apache.solr.indexer.solr.SolrWriter.close(SolrWriter.java:53) at org.apache.solr.indexer.IndexerOutputFormat$1.close(IndexerOutputFormat.java:32) at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:441) at org.apache.hadoop.mapred.Child.main(Child.java:155) Any suggestions? Thanks, Raakhi.
Re: java.lang.ClassCastException: java.lang.Long cannot be cast to org.apache.solr.common.util.NamedList in solr
Hi Noble, I am using Solr 1.3. Regards, Raakhi 2009/6/30 Noble Paul നോബിള് नोब्ळ् noble.p...@corp.aol.com which version of solr are you using? 2009/6/30 Noble Paul നോബിള് नोब्ळ् noble.p...@corp.aol.com: use the XMLResponseParser http://wiki.apache.org/solr/Solrj#head-12c26b2d7806432c88b26cf66e236e9bd6e91849 I guess there was some error during the update On Tue, Jun 30, 2009 at 3:33 PM, Rakhi Khatwani rkhatw...@gmail.com wrote: [...] -- - Noble Paul | Principal Engineer| AOL | http://aol.com -- - Noble Paul | Principal Engineer| AOL | http://aol.com
Adding shards entries in solrconfig.xml
Hi, I read the following article: http://www.lucidimagination.com/Community/Hear-from-the-Experts/Articles/Scaling-Lucene-and-Solr It mentions that it is much easier to set the shards parameter for your SearchHandler in solrconfig.xml. I also went through: http://www.nabble.com/newbie-question-on-SOLR-distributed-searches-with-many-%22shards%22-td20687487.html but it gives only a vague idea about setting the shards, particularly the syntax. Can anyone give an example of setting the shards parameter in solrconfig.xml? Regards, Raakhi
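A sketch of what that configuration looks like: the shards value is baked into a SearchHandler's defaults in solrconfig.xml, so clients can hit that handler without passing shards themselves. The host names and the handler name /distrib below are assumptions:

```xml
<requestHandler name="/distrib" class="solr.SearchHandler">
  <lst name="defaults">
    <str name="shards">nodeA:8983/solr,nodeB:7574/solr</str>
  </lst>
</requestHandler>
```

Queries sent to http://nodeA:8983/solr/distrib?q=... would then be fanned out to both shards and the results merged.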
Tagging and searching on tagged indexes.
Hi, How do we tag Solr indexes and search on those tags? There is not much information on the wiki; all I could find is this: http://wiki.apache.org/solr/UserTagDesign Has anyone tried it (using the Solr API)? One more question: can we change the schema dynamically at runtime (while the Solr instance is on)? Regards, Raakhi.
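On the second question: Solr does not pick up schema.xml edits while running; the core has to be reloaded or the instance restarted. One common workaround is to declare a dynamic field pattern up front, so that new field names matching the pattern can be indexed at runtime without touching the schema. A sketch, where the tag_* pattern and its type are assumptions:

```xml
<!-- in schema.xml: any field named tag_something is accepted at index time -->
<dynamicField name="tag_*" type="string" indexed="true" stored="true" multiValued="true"/>
```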
Behaviour when we get more than 1 million hits
Hi, While using Solr, what would the behaviour be if we perform a search and get more than one million hits? Regards, Raakhi
Re: Behaviour when we get more than 1 million hits
Hi, There is this particular scenario where I want to search for a product and get a million records, which will be given for further processing. Regards, Raakhi On Mon, Jul 13, 2009 at 7:33 PM, Erick Erickson erickerick...@gmail.com wrote: It depends (tm) on what you try to do with the results. You really need to give us some more details on what you want to *do* with 1,000,000 hits before any meaningful response is possible. Best Erick On Mon, Jul 13, 2009 at 8:47 AM, Rakhi Khatwani rkhatw...@gmail.com wrote: Hi, If while using Solr, what would the behaviour be like if we perform the search and we get more than one million hits Regards, Raakhi
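For what it's worth, a common way to hand a million hits to downstream processing is not to fetch them in one response but to page through them with the start and rows parameters. A small sketch of the paging arithmetic; the SolrJ call in the comment is indicative only:

```java
import java.util.ArrayList;
import java.util.List;

public class PagingExample {
    /** Start offsets needed to walk numFound results, rows at a time. */
    static List<Integer> pageOffsets(long numFound, int rows) {
        List<Integer> starts = new ArrayList<Integer>();
        for (long start = 0; start < numFound; start += rows) {
            starts.add((int) start);
        }
        return starts;
    }

    public static void main(String[] args) {
        // 1,000,000 hits fetched 10,000 at a time means 100 requests; each
        // would be a normal query with start/rows set, e.g. in SolrJ (sketch):
        //   new SolrQuery("product").setStart(start).setRows(10000)
        System.out.println(pageOffsets(1000000L, 10000).size());
    }
}
```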
solr reporting tool adapter
Hi, I wanted to query Solr and send the output to some reporting tool. Has anyone done something like that? Moreover, which reporting tool is good? Any suggestions? Regards, Raakhi
Re: solr reporting tool adapter
We basically want to generate PDF reports which contain tag clouds, bar charts, pie charts, etc. Regards, Raakhi On Wed, Oct 7, 2009 at 1:28 PM, Shalin Shekhar Mangar shalinman...@gmail.com wrote: On Tue, Oct 6, 2009 at 1:09 PM, Rakhi Khatwani rkhatw...@gmail.com wrote: Hi, i wanted to query solr and send the output to some reporting tool. has anyone done something like that? moreover, which reporting tool is good? any suggestions? Can you be more specific on what you want to achieve? What kind of reports are you looking for? -- Regards, Shalin Shekhar Mangar.
Re: solr reporting tool adapter
Hi Lance, Thanks a ton, will look into BIRT. Regards, Raakhi On Thu, Oct 8, 2009 at 1:22 AM, Lance Norskog goks...@gmail.com wrote: The BIRT project can do what you want. It has a nice form creator, and you can configure HTTP XML input formats. It includes very complete Eclipse plugins, and there is a book about it. On 10/7/09, Shalin Shekhar Mangar shalinman...@gmail.com wrote: On Wed, Oct 7, 2009 at 2:51 PM, Rakhi Khatwani rkhatw...@gmail.com wrote: we basically wanna generate PDF reports which contain, tag clouds, bar charts, pie charts etc. Faceting on a field will give you top terms and frequency information which can be used to create tag clouds. What do you want to plot on a bar chart? I don't know of a reporting tool which can hook into Solr for creating such things. -- Regards, Shalin Shekhar Mangar. -- Lance Norskog goks...@gmail.com
issue in adding data to a multivalued field
Hi, I have a small schema with some of the fields defined as:

<field name="id" type="string" indexed="true" stored="true" multiValued="false" required="true"/>
<field name="content" type="text" indexed="true" stored="true" multivalued="false"/>
<field name="author_name" type="text" indexed="true" stored="false" multivalued="true"/>

where the field author_name is multivalued. However, in the UI (schema browser), the following are the details of the author_name field, and it is nowhere mentioned that it is multivalued: Field: author_name Field Type: text Properties: Indexed, Tokenized When I try creating and adding a document into Solr, I get an exception: ERROR_id1_multiple_values_encountered_for_non_multiValued_field_author_name_ninad_raakhi_goureya_sheetal Here's my code snippet:

solrDoc17.addField("id", "id1");
solrDoc17.addField("content", "SOLR");
solrDoc17.addField("author_name", "ninad");
solrDoc17.addField("author_name", "raakhi");
solrDoc17.addField("author_name", "goureya");
solrDoc17.addField("author_name", "sheetal");
server.add(solrDoc17);
server.commit();

Any pointers? Regards, Raakhi
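One thing worth noting about the schema above: schema attribute names are case-sensitive, and the attribute is multiValued, not multivalued. A misspelled attribute is silently ignored, which would leave author_name non-multivalued, produce exactly this error, and explain why the schema browser does not list Multivalued among the field's properties. The corrected declaration would be:

```xml
<field name="author_name" type="text" indexed="true" stored="false" multiValued="true"/>
```

After fixing the attribute, the schema change takes effect on core reload/restart, and documents indexed with multiple author_name values should then be accepted.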