RE: Solr Cloud with large synonyms.txt

2013-05-08 Thread David Parks
I can see your point, though I think edge cases would be one concern: if
someone *can* create a very large synonyms file, someone *will* create that
file. What would you set the zookeeper max data size to be? 50MB? 100MB?
Someone is going to do something bad if there's nothing to tell them not to.
Today Solr Cloud just crashes if you try to create a modest-sized synonyms
file; clearly, at a minimum, some zookeeper settings should be configured out
of the box. Any reasonable setting you come up with for zookeeper is
virtually guaranteed to fail for some percentage of users over a reasonably
sized user base (which Solr has).

What if I plugged in a 200MB synonyms file just for testing purposes (I
don't care about performance implications)?  I don't think most users would
catch the footnote in the docs that calls out a max synonyms file size.

Dave


-Original Message-
From: Mark Miller [mailto:markrmil...@gmail.com] 
Sent: Tuesday, May 07, 2013 11:53 PM
To: solr-user@lucene.apache.org
Subject: Re: Solr Cloud with large synonyms.txt

I'm not so worried about the large file in zk issue myself.

The concern is that you start storing and accessing lots of large files in
ZK. This is not what it was made for, and everything stays in RAM, so they
guard against this type of usage.

We are talking about a config file that is loaded on Core load though. It's
uploaded and read very rarely. On modern hardware and networks, making that
file 5MB rather than 1MB is not going to ruin your day. It just won't. Solr
does not use ZooKeeper heavily - in a steady state cluster, it doesn't read
or write from ZooKeeper at all to any degree that registers. I'm going to
have to see problems loading these larger config files from ZooKeeper before
I'm worried that it's a problem.

- Mark

On May 7, 2013, at 12:21 PM, Son Nguyen s...@trancorp.com wrote:

 Mark,
 
 I tried to set that property on both ZK (I have only one ZK instance) and
Solr, but it still didn't work.
 But I read somewhere that ZK is not really designed for keeping large data
files, so this solution - increasing jute.maxbuffer (if I can implement it)
should be just temporary.
 
 Son
 
 -Original Message-
 From: Mark Miller [mailto:markrmil...@gmail.com] 
 Sent: Tuesday, May 07, 2013 9:35 PM
 To: solr-user@lucene.apache.org
 Subject: Re: Solr Cloud with large synonyms.txt
 
 
 On May 7, 2013, at 10:24 AM, Mark Miller markrmil...@gmail.com wrote:
 
 
 On May 6, 2013, at 12:32 PM, Son Nguyen s...@trancorp.com wrote:
 
 I did some research on the internet and found out that the ZooKeeper znode
size limit is 1MB. I tried to increase the system property jute.maxbuffer,
but it didn't work.
 Does anyone have experience dealing with this?
 
 Perhaps hit up the ZK list? They doc it as simply raising jute.maxbuffer,
though you have to do it for each ZK instance.
 
 - Mark
 
 
 the system property must be set on all servers and clients otherwise
problems will arise.
 
 Make sure you try passing it both to ZK *and* to Solr.
 
 - Mark
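
For reference, a sketch of passing that property, assuming a hypothetical
4 MB limit (jute.maxbuffer is specified in bytes and must be identical on
every ZK server and every client JVM):

  # on each ZooKeeper server, e.g. via conf/java.env, which zkEnv.sh reads:
  JVMFLAGS="$JVMFLAGS -Djute.maxbuffer=4194304"

  # on each Solr node, as a JVM system property at startup:
  java -Djute.maxbuffer=4194304 -jar start.jar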
 



Re: stats cache

2013-05-08 Thread J Mohamed Zahoor
Thanks.. I am caching in HTTP now..

./zahoor


On 08-May-2013, at 3:58 AM, Yonik Seeley yo...@lucidworks.com wrote:

 On Tue, May 7, 2013 at 12:48 PM, J Mohamed Zahoor zah...@indix.com wrote:
 Hi
 
 I am computing lots of stats as part of a query…
 looks like the solr caching is not helping here…
 
 Does solr caches stats of a query?
 
 No.  Neither facet counts nor the stats part of a request are cached.  The
 query cache only caches the top N docs (plus scores, if applicable) for a
 given query + filters.
 
 If the whole request is identical, then you can use an HTTP caching
 mechanism though.
 
 -Yonik
 http://lucidworks.com
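
For reference, a sketch of the solrconfig.xml setting that makes Solr itself
emit HTTP cache-validation headers (the stock config ships with
never304="true", which disables them; the values here are illustrative):

  <requestDispatcher>
    <httpCaching never304="false" lastModifiedFrom="openTime" etagSeed="Solr">
      <!-- let clients and proxies reuse identical responses briefly -->
      <cacheControl>max-age=30, public</cacheControl>
    </httpCaching>
  </requestDispatcher>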



RE: Solr Cloud with large synonyms.txt

2013-05-08 Thread Roman Chyla
David, have you seen the finite state automata the synonym lookup is built
on? The lookup is very efficient and fast. You have a point though, it is
going to fail for someone.
Roman
On 8 May 2013 03:11, David Parks davidpark...@yahoo.com wrote:

 I can see your point, though I think edge cases would be one concern: if
 someone *can* create a very large synonyms file, someone *will* create that
 file. What would you set the zookeeper max data size to be? 50MB? 100MB?
 Someone is going to do something bad if there's nothing to tell them not
 to.
 Today Solr Cloud just crashes if you try to create a modest-sized synonyms
 file; clearly, at a minimum, some zookeeper settings should be configured out
 of the box. Any reasonable setting you come up with for zookeeper is
 virtually guaranteed to fail for some percentage of users over a reasonably
 sized user base (which Solr has).

 What if I plugged in a 200MB synonyms file just for testing purposes (I
 don't care about performance implications)?  I don't think most users would
 catch the footnote in the docs that calls out a max synonyms file size.

 Dave


 -Original Message-
 From: Mark Miller [mailto:markrmil...@gmail.com]
 Sent: Tuesday, May 07, 2013 11:53 PM
 To: solr-user@lucene.apache.org
 Subject: Re: Solr Cloud with large synonyms.txt

 I'm not so worried about the large file in zk issue myself.

 The concern is that you start storing and accessing lots of large files in
 ZK. This is not what it was made for, and everything stays in RAM, so they
 guard against this type of usage.

 We are talking about a config file that is loaded on Core load though. It's
 uploaded and read very rarely. On modern hardware and networks, making that
 file 5MB rather than 1MB is not going to ruin your day. It just won't. Solr
 does not use ZooKeeper heavily - in a steady state cluster, it doesn't read
 or write from ZooKeeper at all to any degree that registers. I'm going to
 have to see problems loading these larger config files from ZooKeeper
 before
 I'm worried that it's a problem.

 - Mark

 On May 7, 2013, at 12:21 PM, Son Nguyen s...@trancorp.com wrote:

  Mark,
 
  I tried to set that property on both ZK (I have only one ZK instance) and
 Solr, but it still didn't work.
  But I read somewhere that ZK is not really designed for keeping large
 data
 files, so this solution - increasing jute.maxbuffer (if I can implement it)
 should be just temporary.
 
  Son
 
  -Original Message-
  From: Mark Miller [mailto:markrmil...@gmail.com]
  Sent: Tuesday, May 07, 2013 9:35 PM
  To: solr-user@lucene.apache.org
  Subject: Re: Solr Cloud with large synonyms.txt
 
 
  On May 7, 2013, at 10:24 AM, Mark Miller markrmil...@gmail.com wrote:
 
 
  On May 6, 2013, at 12:32 PM, Son Nguyen s...@trancorp.com wrote:
 
  I did some research on the internet and found out that the ZooKeeper
 znode size limit is 1MB. I tried to increase the system property
 jute.maxbuffer, but it didn't work.
  Does anyone have experience dealing with this?
 
  Perhaps hit up the ZK list? They doc it as simply raising
 jute.maxbuffer,
 though you have to do it for each ZK instance.
 
  - Mark
 
 
  the system property must be set on all servers and clients otherwise
 problems will arise.
 
  Make sure you try passing it both to ZK *and* to Solr.
 
  - Mark
 




Re: Search identifier fields containing blanks

2013-05-08 Thread Silvio Hermann

I will give it a go!

thank you

best

Silvio

On 05/08/2013 03:07 AM, Chris Hostetter wrote:


: I am about to index identifier fields containing blanks (shelfmarks) e.g. G
: 23/60 12
: The field type is set to Solr.string. To get the exact matching hit (the doc
: with shelfmark mentioned above) the user must quote the search term. Is there
: a way to omit the quotes?

whitespace has to be quoted when using the lucene QParser because it's a
semantically significant character that means "end boolean query clause"

if you want to search for a literal string w/o needing any escaping, use
the term QParser...

{!term f=yourFieldName}G 23/60 12

Of course, if you are putting this in a URL (ie: testing in a browser) it
still needs to be URL escaped...

/select?q={!term+f=yourFieldName}G+23/60+12


-Hoss



--
Silvio Hermann
Friedrich-Schiller-Universität Jena
Thüringer Universitäts- und Landesbibliothek
Bibliotheksplatz 2
07743 Jena
Phone: +49 3641 940019
FAX:   +49 3641 940022

http://www.historische-bestaende.de


Re: Indexing Point Number

2013-05-08 Thread Rafał Kuć
Hello!

Use a float field type in your schema.xml file, for example like this:
<fieldType name="float" class="solr.TrieFloatField" precisionStep="0"
positionIncrementGap="0"/>

Define a field using this type:
<field name="price" type="float" indexed="true" stored="true"/>

You'll be able to index data like this:
<field name="price">19.95</field>

-- 
Regards,
 Rafał Kuć
 Sematext :: http://sematext.com/ :: Solr - Lucene - Nutch - ElasticSearch

  Hi,
  how can I index numbers with a decimal point?
  For example:
  5,50
  109,90

  I want to sort the numbers.

  Thanks

  



Re: Indexing Point Number

2013-05-08 Thread be...@bkern.de

I will index, for example:
<field name="price">19,95</field>
<field name="price">25,45</field>

I can only index float values when the numbers use dots.

Thanks

Am Mittwoch, den 08.05.2013, 10:52 +0200 schrieb Rafał Kuć 
r@solr.pl:

Hello!

Use a float field type in your schema.xml file, for example like 
this:

<fieldType name="float" class="solr.TrieFloatField" precisionStep="0"
positionIncrementGap="0"/>

Define a field using this type:
<field name="price" type="float" indexed="true" stored="true"/>

You'll be able to index data like this:
<field name="price">19.95</field>

--
Regards,
 Rafał Kuć
 Sematext :: http://sematext.com/ :: Solr - Lucene - Nutch - 
ElasticSearch



 Hi,
 how can I index numbers with a decimal point?
 For example:
 5,50
 109,90



 I want to sort the numbers.



 Thanks






Re: Lazy load Error on UI analysis area

2013-05-08 Thread yriveiro
Ok, I will do a fresh install in a VM and check that the error isn't
reproduced.





Re: Indexing Point Number

2013-05-08 Thread Gora Mohanty
On 8 May 2013 14:48, be...@bkern.de be...@bkern.de wrote:
 I will index, for example:
 <field name="price">19,95</field>
 <field name="price">25,45</field>

 I can only index float values when the numbers use dots.

I don't think that it is currently possible to change the decimal
separator. You should replace ',' with '.' during indexing and
searching, which should be fairly easy.

Regards,
Gora


Re: Number of search results from SOLR

2013-05-08 Thread Dmitry Kan
If you need just the count of the results found, check the numFound.

If you would like to get all the results possible in one go, you could try
rows=-1. This may impact your server a lot, so be careful.

If you have a single non-sharded index, try pagination
(start=offset&rows=window_size) instead of asking for all results in one go.
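
For example, a sketch fetching the second page of 100 results with the
standard select handler (hypothetical host and core name):

  curl "http://localhost:8983/solr/core0/select?q=*:*&start=100&rows=100&wt=json"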


Dmitry


On Wed, May 8, 2013 at 3:44 AM, Kamal Palei palei.ka...@gmail.com wrote:

 Dear All
 I am looking for to get maximum number of search results from a given solr
 query.
 How can I get it, kindly give me some pointer.

 Best Regards
 Kamal



Re: stats cache

2013-05-08 Thread Dmitry Kan
Mohamed,

(out of curiosity) What kind of tool are you using for that?


On Wed, May 8, 2013 at 10:13 AM, J Mohamed Zahoor zah...@indix.com wrote:

 Thanks.. i am caching in HTTP now..

 ./zahoor


 On 08-May-2013, at 3:58 AM, Yonik Seeley yo...@lucidworks.com wrote:

  On Tue, May 7, 2013 at 12:48 PM, J Mohamed Zahoor zah...@indix.com
 wrote:
  Hi
 
  I am computing lots of stats as part of a query…
  looks like the solr caching is not helping here…
 
  Does solr caches stats of a query?
 
  No.  Neither facet counts nor the stats part of a request are cached.  The
  query cache only caches the top N docs (plus scores, if applicable) for a
  given query + filters.
 
  If the whole request is identical, then you can use an HTTP caching
  mechanism though.
 
  -Yonik
  http://lucidworks.com




Re: Search identifier fields containing blanks

2013-05-08 Thread Silvio Hermann

That worked like a charm, but what must I do if I want an additional field to
match, e.g.

{!term f=myFieldName}G 23/60 12 +location:bookshelf

Best,

Silvio



On 05/08/2013 03:07 AM, Chris Hostetter wrote:


: I am about to index identifier fields containing blanks (shelfmarks) e.g. G
: 23/60 12
: The field type is set to Solr.string. To get the exact matching hit (the doc
: with shelfmark mentioned above) the user must quote the search term. Is there
: a way to omit the quotes?

whitespace has to be quoted when using the lucene QParser because it's a
semantically significant character that means "end boolean query clause"

if you want to search for a literal string w/o needing any escaping, use
the term QParser...

{!term f=yourFieldName}G 23/60 12

Of course, if you are putting this in a URL (ie: testing in a browser) it
still needs to be URL escaped...

/select?q={!term+f=yourFieldName}G+23/60+12


-Hoss



--
Silvio Hermann
Friedrich-Schiller-Universität Jena
Thüringer Universitäts- und Landesbibliothek
Bibliotheksplatz 2
07743 Jena
Phone: +49 3641 940019
FAX:   +49 3641 940022

http://www.historische-bestaende.de


Re: Search identifier fields containing blanks

2013-05-08 Thread Upayavira
If you're using the latest Solr, then you should be able to do it the
other way around:

q=+location:bookshelf {!term f=myFieldName}G 23/60 12

You might also find the trick I mentioned before useful:

q=+location:bookshelf {!term f=myFieldName v=$productCode}&productCode=G
23/60 12

Upayavira
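
For reference, the same query URL-escaped in the style of Hoss's earlier
example (hypothetical handler path; the leading + must be escaped as %2B,
while the dereferenced value travels in its own parameter):

/select?q=%2Blocation:bookshelf+{!term+f=myFieldName+v=$productCode}&productCode=G+23/60+12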

On Wed, May 8, 2013, at 11:19 AM, Silvio Hermann wrote:
  That worked like a charm, but what must I do if I want an additional field
  to match, e.g.
 
 
 
 Best,
 
 Silvio
 
 
 
 On 05/08/2013 03:07 AM, Chris Hostetter wrote:
 
  : I am about to index identifier fields containing blanks (shelfmarks) e.g. G
  : 23/60 12
  : The field type is set to Solr.string. To get the exact matching hit (the 
  doc
  : with shelfmark mentioned above) the user must quote the search term. Is 
  there
  : a way to omit the quotes?
 
  whitespace has to be quoted when using the lucene QParser because it's a
  semantically significant character that means "end boolean query clause"
 
  if you want to search for a literal string w/o needing any escaping, use
  the term QParser...
 
  {!term f=yourFieldName}G 23/60 12
 
  Of course, if you are putting this in a URL (ie: testing in a browser) it
  still needs to be URL escaped...
 
  /select?q={!term+f=yourFieldName}G+23/60+12
 
 
  -Hoss
 
 
 -- 
 Silvio Hermann
 Friedrich-Schiller-Universität Jena
 Thüringer Universitäts- und Landesbibliothek
 Bibliotheksplatz 2
 07743 Jena
 Phone: +49 3641 940019
 FAX:   +49 3641 940022
 
 http://www.historische-bestaende.de


Re: Indexing Point Number

2013-05-08 Thread Upayavira
You could use a RegexReplaceProcessor in an update processor chain. From
the Javadoc:

 <processor class="solr.RegexReplaceProcessorFactory">
   <str name="fieldName">content</str>
   <str name="fieldName">title</str>
   <str name="pattern">\s+</str>
   <str name="replacement"> </str>
 </processor>

This could replace the comma with a dot before it gets to be indexed.
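
Adapted to this case, a minimal sketch of such a chain, assuming a field
named price (wire it into the /update handler with update.chain, as in the
de-duplication example elsewhere in this digest):

  <updateRequestProcessorChain name="comma-to-dot">
    <processor class="solr.RegexReplaceProcessorFactory">
      <str name="fieldName">price</str>
      <str name="pattern">,</str>
      <str name="replacement">.</str>
    </processor>
    <processor class="solr.LogUpdateProcessorFactory"/>
    <processor class="solr.RunUpdateProcessorFactory"/>
  </updateRequestProcessorChain>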

Upayavira

On Wed, May 8, 2013, at 10:28 AM, Gora Mohanty wrote:
 On 8 May 2013 14:48, be...@bkern.de be...@bkern.de wrote:
  I will index, for example:
  <field name="price">19,95</field>
  <field name="price">25,45</field>
 
  I can only index float values when the numbers use dots.
 
 I don't think that it is currently possible to change the decimal
 separator. You should replace ',' with '.' during indexing and
 searching, which should be fairly easy.
 
 Regards,
 Gora


Re: stats cache

2013-05-08 Thread J Mohamed Zahoor


I am using a simple LRU cache in my client, where I store the request and
response, for now.
Later might move to something like varnish.

./zahoor

On 08-May-2013, at 3:26 PM, Dmitry Kan solrexp...@gmail.com wrote:

 Mohamed,
 
 (out of curiosity) What kind of tool are you using for that?
 
 
 On Wed, May 8, 2013 at 10:13 AM, J Mohamed Zahoor zah...@indix.com wrote:
 
 Thanks.. i am caching in HTTP now..
 
 ./zahoor
 
 
 On 08-May-2013, at 3:58 AM, Yonik Seeley yo...@lucidworks.com wrote:
 
 On Tue, May 7, 2013 at 12:48 PM, J Mohamed Zahoor zah...@indix.com
 wrote:
 Hi
 
 I am computing lots of stats as part of a query…
 looks like the solr caching is not helping here…
 
 Does solr caches stats of a query?
 
  No.  Neither facet counts nor the stats part of a request are cached.  The
  query cache only caches the top N docs (plus scores, if applicable) for a
  given query + filters.
 
 If the whole request is identical, then you can use an HTTP caching
 mechanism though.
 
 -Yonik
 http://lucidworks.com
 
 



Re: Lazy load Error on UI analysis area

2013-05-08 Thread yriveiro
I found the error: the class of the analysis field request handler was not
set properly.





Re: solr adding unique values

2013-05-08 Thread Nikhil Kumar
Thanks Erick,
   I had a look at deduplication
(http://docs.lucidworks.com/display/solr/De-Duplication).
I added:
 <updateRequestProcessorChain name="dedupe">
   <processor class="solr.processor.SignatureUpdateProcessorFactory">
     <bool name="enabled">true</bool>
     <str name="signatureField">listed_id</str>
     <bool name="overwriteDupes">true</bool>
     <str name="fields">listed</str>
     <str name="signatureClass">solr.processor.Lookup3Signature</str>
   </processor>
   <processor class="solr.LogUpdateProcessorFactory" />
   <processor class="solr.RunUpdateProcessorFactory" />
 </updateRequestProcessorChain>

 <requestHandler name="/update" class="solr.UpdateRequestHandler">
   <lst name="defaults">
     <str name="update.chain">dedupe</str>
   </lst>
 </requestHandler>

in solrconfig.xml, and I added

  <field name="listed" type="comaSplit" indexed="true" stored="true"
multiValued="true"/>
  <field name="listed_id" type="comaSplit" indexed="true" stored="true"
multiValued="true"/>

in schema.xml. Should I be able to achieve it this way? Because I could not.
Or should I use a different approach?
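
For reference, a sketch of the atomic-update alternative Erick describes
below; "set" replaces the whole field value rather than appending, so
re-sending the full de-duplicated list avoids repeats (field names follow
the examples in this thread):

  {"id": "a", "lists": {"set": ["list_a", "list_b", "list_c"]}}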


On Tue, May 7, 2013 at 10:59 PM, Erick Erickson erickerick...@gmail.comwrote:

 Ah. OK. There's no dedupe values that I know of, I think you'd need to
 implement that yourself by fetching the field in question and doing a set
 on the field.

 You might be able to do that better in a custom update handler.

 Best
 Erick


 On Tue, May 7, 2013 at 6:54 AM, Nikhil Kumar nikhil.ku...@hashedin.comwrote:

 Thanks Erick, for the reply! I know about 'set' but that's not my goal; let
 me give a better example.
 I want this and if i have to add another list_c
 user a[
 id:a
 liists[
  list_a,
  list_b
]
 ]
 It Should look like:
  user a[
 id:a
 liists[
  list_a,
  list_b,
  list_c
]
 ]
 However, if I again add list_a, it should *not* be:
 user a[
 id:a
 liists[
  list_a,
  list_b,
  list_c,
  list_a,
]
 ]
 I am *not* reindexing the documents.

 Depends on your goal here. I'm guessing you're using
 atomic updates, in which case you need to use set
 rather than add as the former replaces the contents.
 See: http://wiki.apache.org/solr/UpdateJSON#Solr_4.0_Example

 If you're simply re-indexing the documents, just send the entire
 fresh document to solr and it'll replace the earlier document
 completely.

 Best
 Erick


 On Mon, May 6, 2013 at 1:44 PM, Nikhil Kumar 
 nikhil.ku...@hashedin.comwrote:

 Hey,
I have recently started using solr, I have a list of users, which are
 subscribed to some lists.
 eg.
 user a[
 id:a
 liists[
  list_a
]
 ]
 user b[
 id:b
 liists[
  list_a
]
 ]
 I am using {id: a, lists:{add:list_a}} to add a particular list to a
 user,
 but if I use the same command again, it adds the same list again,
 which I want to avoid.
 user a[
 id:a
 liists[
  list_a,
  list_a
]
 ]
 I searched the documentation and tutorials, and I found:

-

overwrite = true | false — default is true, meaning newer
documents will replace previously added documents with the same 
 uniqueKey.
-

    commitWithin = (milliseconds) if the commitWithin attribute is
    present, the document will be added within that time. Solr1.4
    (http://wiki.apache.org/solr/Solr1.4). See CommitWithin
    (http://wiki.apache.org/solr/CommitWithin)
-

(deprecated) allowDups = true | false — default is false
-

(deprecated) overwritePending = true | false — default is
negation of allowDups
-

(deprecated) overwriteCommitted = true|false — default is
negation of allowDups


but using overwrite and allowDups didn't solve the problem either,
seemingly because there is no unique id, just a value.

So the question is how to solve this problem?

 --
 Thank You and Regards,
 Nikhil Kumar
 +91-9916343619
 Technical Analyst
 Hashed In Technologies Pvt. Ltd.




 --
 Thank You and Regards,
 Nikhil Kumar
  +91-9916343619
 Technical Analyst
 Hashed In Technologies Pvt. Ltd.





-- 
Thank You and Regards,
Nikhil Kumar
+91-9916343619
Technical Analyst
Hashed In Technologies Pvt. Ltd.


write own query analyser

2013-05-08 Thread neha yadav
Hi all,

I need to analyse the query sent to Solr. I need to parse the query
through a pipeline built with UIMA.

Can anyone help me understand how to do this?

I have already created an Aggregate Analyzer in UIMA, and now I need to run
a Solr input query through it, to increase relevancy in the output.

If this is already done, please direct me to any link.

Thanks in advance,

Neha Yadav


java.lang.IllegalArgumentException: No enum const class org.apache.lucene.util.Version.LUCENE_43

2013-05-08 Thread Roald
Hi all,

I just reported this issue: http://issues.apache.org/jira/browse/SOLR-4800

java.lang.IllegalArgumentException: No enum const class
org.apache.lucene.util.Version.LUCENE_43

solr-4.3.0/example/solr/collection1/conf/solrconfig.xml has
<luceneMatchVersion>LUCENE_43</luceneMatchVersion>

Which causes:

SolrCore Initialization Failures

collection1:
org.apache.solr.common.SolrException:org.apache.solr.common.SolrException:
Could not load config for solrconfig.xml

From catalina.out :

SEVERE: Unable to create core: collection1
org.apache.solr.common.SolrException: Could not load config for
solrconfig.xml
at
org.apache.solr.core.CoreContainer.createFromLocal(CoreContainer.java:991)
at org.apache.solr.core.CoreContainer.create(CoreContainer.java:1051)
at org.apache.solr.core.CoreContainer$3.call(CoreContainer.java:634)
at org.apache.solr.core.CoreContainer$3.call(CoreContainer.java:629)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
at java.util.concurrent.FutureTask.run(FutureTask.java:166)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
at java.util.concurrent.FutureTask.run(FutureTask.java:166)
at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1146)
at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:679)
Caused by: org.apache.solr.common.SolrException: Invalid luceneMatchVersion
'LUCENE_43', valid values are: [LUCENE_30, LUCENE_31, LUCENE_32, LUCENE_33,
LUCENE_34, LUCENE_35, LUCENE_36, LUCENE_40, LUCENE_41, LUCENE_42,
LUCENE_CURRENT] or a string in format 'V.V'
at org.apache.solr.core.Config.parseLuceneVersionString(Config.java:313)
at org.apache.solr.core.Config.getLuceneVersion(Config.java:298)
at org.apache.solr.core.SolrConfig.<init>(SolrConfig.java:119)
at
org.apache.solr.core.CoreContainer.createFromLocal(CoreContainer.java:989)
... 11 more
Caused by: java.lang.IllegalArgumentException: No enum const class
org.apache.lucene.util.Version.LUCENE_43
at java.lang.Enum.valueOf(Enum.java:214)
at org.apache.lucene.util.Version.valueOf(Version.java:34)
at org.apache.lucene.util.Version.parseLeniently(Version.java:133)
at org.apache.solr.core.Config.parseLuceneVersionString(Config.java:311)
... 14 more
May 7, 2013 9:10:00 PM org.apache.solr.common.SolrException log
SEVERE: null:org.apache.solr.common.SolrException: Unable to create core:
collection1
at
org.apache.solr.core.CoreContainer.recordAndThrow(CoreContainer.java:1672)
at org.apache.solr.core.CoreContainer.create(CoreContainer.java:1057)
at org.apache.solr.core.CoreContainer$3.call(CoreContainer.java:634)
at org.apache.solr.core.CoreContainer$3.call(CoreContainer.java:629)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
at java.util.concurrent.FutureTask.run(FutureTask.java:166)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
at java.util.concurrent.FutureTask.run(FutureTask.java:166)
at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1146)
at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:679)
Caused by: org.apache.solr.common.SolrException: Could not load config for
solrconfig.xml
at
org.apache.solr.core.CoreContainer.createFromLocal(CoreContainer.java:991)
at org.apache.solr.core.CoreContainer.create(CoreContainer.java:1051)
... 10 more
Caused by: org.apache.solr.common.SolrException: Invalid luceneMatchVersion
'LUCENE_43', valid values are: [LUCENE_30, LUCENE_31, LUCENE_32, LUCENE_33,
LUCENE_34, LUCENE_35, LUCENE_36, LUCENE_40, LUCENE_41, LUCENE_42,
LUCENE_CURRENT] or a string in format 'V.V'
at org.apache.solr.core.Config.parseLuceneVersionString(Config.java:313)
at org.apache.solr.core.Config.getLuceneVersion(Config.java:298)
at org.apache.solr.core.SolrConfig.<init>(SolrConfig.java:119)
at
org.apache.solr.core.CoreContainer.createFromLocal(CoreContainer.java:989)
... 11 more
Caused by: java.lang.IllegalArgumentException: No enum const class
org.apache.lucene.util.Version.LUCENE_43
at java.lang.Enum.valueOf(Enum.java:214)
at org.apache.lucene.util.Version.valueOf(Version.java:34)
at org.apache.lucene.util.Version.parseLeniently(Version.java:133)
at org.apache.solr.core.Config.parseLuceneVersionString(Config.java:311)
... 14 more

If I change LUCENE_43 to LUCENE_42 it works. The admin webpage reports the
following versions:

solr-spec : 4.2.1.2013.03.26.08.26.55
solr-impl : 4.2.1 1461071 - mark - 2013-03-26 08:26:55
lucene-spec : 4.2.1
lucene-impl : 4.2.1 1461071 - mark - 2013-03-26 08:23:34
Thank you very much in advance!

Regards,
Roald


Re: stats cache

2013-05-08 Thread Dmitry Kan
OK, thanks.


On Wed, May 8, 2013 at 1:38 PM, J Mohamed Zahoor zah...@indix.com wrote:



 I am using a simple LRU cache in my client where i store req and response
 for now.
 Later might move to something like varnish.

 ./zahoor

 On 08-May-2013, at 3:26 PM, Dmitry Kan solrexp...@gmail.com wrote:

  Mohamed,
 
  (out of curiosity) What kind of tool are you using for that?
 
 
  On Wed, May 8, 2013 at 10:13 AM, J Mohamed Zahoor zah...@indix.com
 wrote:
 
  Thanks.. i am caching in HTTP now..
 
  ./zahoor
 
 
  On 08-May-2013, at 3:58 AM, Yonik Seeley yo...@lucidworks.com wrote:
 
  On Tue, May 7, 2013 at 12:48 PM, J Mohamed Zahoor zah...@indix.com
  wrote:
  Hi
 
  I am computing lots of stats as part of a query…
  looks like the solr caching is not helping here…
 
  Does solr caches stats of a query?
 
   No.  Neither facet counts nor the stats part of a request are cached.  The
   query cache only caches the top N docs (plus scores, if applicable) for a
   given query + filters.
 
  If the whole request is identical, then you can use an HTTP caching
  mechanism though.
 
  -Yonik
  http://lucidworks.com
 
 




Re: Issue with fuzzy search in Distributed Search

2013-05-08 Thread meghana
Please help me on this!! 


meghana wrote
 To ensure that all records exist in a single node, I queried on a specific
 duration, so the shards core and the simple core should return similar
 results for the same query.
 
 As you suggested, I analyzed the debugQuery output for one specific search,
 *text:worde~1*, and I saw that the record returned by the shards core has
 highlights like *word*, *words*, and *word!n*. But the debugQuery output
 shows it processing only *word!n*, not the other highlighted terms (*words*
 and *word*), although they appear in the highlighting for that record. As a
 result, the shards core does not return the other records that contain
 *word* or *words* but not *word!n*.
 
 On the other hand, the simple core processes all of *word*, *words*, and
 *word!n*, and returns proper results. This seems like very weird behavior;
 any suggestions?
 
 Jack Krupansky-2 wrote
  A fuzzy query itself does not know about distributed search - Lucene simply
  scores the query results based on the local index. Then, Solr merges the
  query results from different nodes.
 
 Try the query locally for each node and set debugQuery=true and see how
 each 
 document gets scored.
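 
  For example, a sketch against one node (hypothetical host; core name from
  this thread):
 
    curl "http://localhost:8983/solr/core0/select?q=text:hoers~1&debugQuery=true&wt=json"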
 
 I'm actually not sure what the specific problem (symptom) is that you
 are 
 seeing. I mean, maybe there is only 1 result on that node - how do you
 know 
 otherwise?? Or maybe one node has more exact matches.
 
 -- Jack Krupansky
 
 -Original Message- 
 From: meghana
 Sent: Tuesday, April 30, 2013 7:51 AM
 To: 

 solr-user@.apache

 Subject: Issue with fuzzy search in Distributed Search
 
  I have created 2 versions of a Solr core on different servers. One is a
  simple core holding all records in one core. The other is a shards core,
  distributed over 3 cores on a server.
 
 Simple core :
 
 http://localhost:8080/sorl/core0/select?q=text:hoers~1
 
 Distributed core :
 
  http://192.168.1.91:8080/core0/select?shards=http://192.168.1.91:8080/core0,http://192.168.1.91:8080/core1,http://192.168.1.91:8080/core2&q=text:hoers~1
 
 data, schema and other configuration is similar in both the cores.
 
  but when doing a fuzzy search like hoers~1, one core returns many
  records (about 450), while the other core returns only 1 record.
  
  This issue does not seem related to Distributed Search though: even when I
  do not use distributed search, it still does not return more rows, e.g.
  
  http://192.168.1.91:8080/core0/select?q=text:hoers~1
 
  below is schema definition for my field.

  <fieldType name="text_en_splitting" class="solr.TextField"
      positionIncrementGap="100" autoGeneratePhraseQueries="true">
    <analyzer type="index">
      <tokenizer class="solr.WhitespaceTokenizerFactory"/>
      <filter class="solr.StopFilterFactory" ignoreCase="true"
          words="stopwords.txt" enablePositionIncrements="false"/>
      <filter class="solr.StopFilterFactory" ignoreCase="true"
          words="stopwords_en.txt" enablePositionIncrements="true"/>
      <filter class="solr.WordDelimiterFilterFactory"
          generateWordParts="1" generateNumberParts="1" catenateWords="1"
          catenateNumbers="1" catenateAll="0" splitOnCaseChange="1"
          protected="protwords.txt" types="wdfftypes.txt"/>
      <filter class="solr.LowerCaseFilterFactory"/>
      <filter class="solr.KeywordMarkerFilterFactory"
          protected="protwords.txt"/>
      <filter class="solr.PorterStemFilterFactory"/>
    </analyzer>
    <analyzer type="query">
      <tokenizer class="solr.WhitespaceTokenizerFactory"/>
      <filter class="solr.SynonymFilterFactory" synonyms="synonyms.txt"
          ignoreCase="true" expand="true"/>
      <filter class="solr.StopFilterFactory" ignoreCase="true"
          words="stopwords_extra_query.txt" enablePositionIncrements="false"/>
      <filter class="solr.StopFilterFactory" ignoreCase="true"
          words="stopwords_en.txt" enablePositionIncrements="true"/>
      <filter class="solr.WordDelimiterFilterFactory"
          generateWordParts="1" generateNumberParts="1" catenateWords="0"
          catenateNumbers="0" catenateAll="0" splitOnCaseChange="1"
          protected="protwords.txt" types="wdfftypes.txt"/>
      <filter class="solr.LowerCaseFilterFactory"/>
      <filter class="solr.KeywordMarkerFilterFactory"
          protected="protwords.txt"/>
      <filter class="solr.PorterStemFilterFactory"/>
    </analyzer>
  </fieldType>
  Not sure what is wrong with this. Can anybody help me with this?
 
 
 
 






Re: Get Suggester to return same phrase as query

2013-05-08 Thread Rounak Jain
Thanks, Erick. The link you gave me is mostly about getting Suggester
working with Phrases, which I've already done with queryAnalyzerFieldType
and no custom code.

My main issue is that the query itself isn't getting returned *if* it
is an actual word/token in my index. So, for example, if a user begins typing
w, wo, wom and so on, I wouldn't like those to appear in the suggestion
list; but if he or she types women, which is a legitimate word and very
likely appears frequently in my index, I'd like it to be returned in the
suggestion list.

I want to know if there's any way to configure Solr's Suggester to behave
this way apart from modifying the source.

Thanks,

Rounak


On Tue, May 7, 2013 at 11:48 PM, Erick Erickson erickerick...@gmail.comwrote:

 Hmmm, R. Muir did some work here:
 https://issues.apache.org/jira/browse/SOLR-3143, note that it's 4.0 or
 later. I haven't implemented this, but this is a common problem so if
 you do dig into it and get it to work (warning, I haven't a clue) it'd
 be a great contribution to the Wiki.

 Best
 Erick

 On Tue, May 7, 2013 at 10:41 AM, Rounak Jain rouna...@gmail.com wrote:
  Hi,
 
  I'm using the Suggester component in Solr, and if I search for iPhone 5
  the suggestions never give me the same phrase, that is iPhone 5. Is
 there
  any way to alter this behaviour to return iPhone 5 as well?
 
  A backup option could be to always display what the user has entered in
 the
  UI, but I want it to be displayed *only *if there are results for it in
  Solr, which is only possible if Solr returns the term.
 
  Rounak



Re: java.lang.IllegalArgumentException: No enum const class org.apache.lucene.util.Version.LUCENE_43

2013-05-08 Thread Alan Woodward
Hi Roald,

On the ticket, you report the following version information:
solr-spec : 4.2.1.2013.03.26.08.26.55
solr-impl : 4.2.1 1461071 - mark - 2013-03-26 08:26:55
lucene-spec : 4.2.1
lucene-impl : 4.2.1 1461071 - mark - 2013-03-26 08:23:34

This shows that your servlet container is running 4.2.1, not 4.3.  So the 
example solrconfig.xml from 4.3 won't work here.

Alan Woodward
www.flax.co.uk


On 8 May 2013, at 12:52, Roald wrote:

 Hi all,
 
 I just reported this issue: http://issues.apache.org/jira/browse/SOLR-4800
 
 java.lang.IllegalArgumentException: No enum const class
 org.apache.lucene.util.Version.LUCENE_43
 
 solr-4.3.0/example/solr/collection1/conf/solrconfig.xml has
 <luceneMatchVersion>LUCENE_43</luceneMatchVersion>
 
 Which causes:
 
 SolrCore Initialization Failures
 
 collection1:
 org.apache.solr.common.SolrException:org.apache.solr.common.SolrException:
 Could not load config for solrconfig.xml
 
 From catalina.out :
 
 SEVERE: Unable to create core: collection1
 org.apache.solr.common.SolrException: Could not load config for
 solrconfig.xml
 at
 org.apache.solr.core.CoreContainer.createFromLocal(CoreContainer.java:991)
 at org.apache.solr.core.CoreContainer.create(CoreContainer.java:1051)
 at org.apache.solr.core.CoreContainer$3.call(CoreContainer.java:634)
 at org.apache.solr.core.CoreContainer$3.call(CoreContainer.java:629)
 at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
 at java.util.concurrent.FutureTask.run(FutureTask.java:166)
 at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
 at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
 at java.util.concurrent.FutureTask.run(FutureTask.java:166)
 at
 java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1146)
 at
 java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
 at java.lang.Thread.run(Thread.java:679)
 Caused by: org.apache.solr.common.SolrException: Invalid luceneMatchVersion
 'LUCENE_43', valid values are: [LUCENE_30, LUCENE_31, LUCENE_32, LUCENE_33,
 LUCENE_34, LUCENE_35, LUCENE_36, LUCENE_40, LUCENE_41, LUCENE_42,
 LUCENE_CURRENT] or a string in format 'V.V'
 at org.apache.solr.core.Config.parseLuceneVersionString(Config.java:313)
 at org.apache.solr.core.Config.getLuceneVersion(Config.java:298)
 at org.apache.solr.core.SolrConfig.<init>(SolrConfig.java:119)
 at
 org.apache.solr.core.CoreContainer.createFromLocal(CoreContainer.java:989)
 ... 11 more
 Caused by: java.lang.IllegalArgumentException: No enum const class
 org.apache.lucene.util.Version.LUCENE_43
 at java.lang.Enum.valueOf(Enum.java:214)
 at org.apache.lucene.util.Version.valueOf(Version.java:34)
 at org.apache.lucene.util.Version.parseLeniently(Version.java:133)
 at org.apache.solr.core.Config.parseLuceneVersionString(Config.java:311)
 ... 14 more
 May 7, 2013 9:10:00 PM org.apache.solr.common.SolrException log
 SEVERE: null:org.apache.solr.common.SolrException: Unable to create core:
 collection1
 at
 org.apache.solr.core.CoreContainer.recordAndThrow(CoreContainer.java:1672)
 at org.apache.solr.core.CoreContainer.create(CoreContainer.java:1057)
 at org.apache.solr.core.CoreContainer$3.call(CoreContainer.java:634)
 at org.apache.solr.core.CoreContainer$3.call(CoreContainer.java:629)
 at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
 at java.util.concurrent.FutureTask.run(FutureTask.java:166)
 at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
 at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
 at java.util.concurrent.FutureTask.run(FutureTask.java:166)
 at
 java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1146)
 at
 java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
 at java.lang.Thread.run(Thread.java:679)
 Caused by: org.apache.solr.common.SolrException: Could not load config for
 solrconfig.xml
 at
 org.apache.solr.core.CoreContainer.createFromLocal(CoreContainer.java:991)
 at org.apache.solr.core.CoreContainer.create(CoreContainer.java:1051)
 ... 10 more
 Caused by: org.apache.solr.common.SolrException: Invalid luceneMatchVersion
 'LUCENE_43', valid values are: [LUCENE_30, LUCENE_31, LUCENE_32, LUCENE_33,
 LUCENE_34, LUCENE_35, LUCENE_36, LUCENE_40, LUCENE_41, LUCENE_42,
 LUCENE_CURRENT] or a string in format 'V.V'
 at org.apache.solr.core.Config.parseLuceneVersionString(Config.java:313)
 at org.apache.solr.core.Config.getLuceneVersion(Config.java:298)
 at org.apache.solr.core.SolrConfig.<init>(SolrConfig.java:119)
 at
 org.apache.solr.core.CoreContainer.createFromLocal(CoreContainer.java:989)
 ... 11 more
 Caused by: java.lang.IllegalArgumentException: No enum const class
 org.apache.lucene.util.Version.LUCENE_43
 at java.lang.Enum.valueOf(Enum.java:214)
 at org.apache.lucene.util.Version.valueOf(Version.java:34)
 at org.apache.lucene.util.Version.parseLeniently(Version.java:133)
 at 

Oracle Timestamp in SOLR

2013-05-08 Thread Peter Schütt
Hello,
I have a field with the type TIMESTAMP(6) in an oracle view.

When I want to import it directly to SOLR I get this error message:

WARNING: Error creating document : SolrInputDocument[oid=12, 
last_action_timestamp=oracle.sql.TIMESTAMP@34907781, status=2, ...]
org.apache.solr.common.SolrException: Invalid Date 
String:'oracle.sql.TIMESTAMP@
34907781'
at org.apache.solr.schema.DateField.parseMath(DateField.java:182)
at org.apache.solr.schema.TrieField.createField(TrieField.java:616)
at org.apache.solr.schema.TrieField.createFields
(TrieField.java:655)

What is the best way to import it?


This way works but I do not know if this is the best practise:

In the query:

TO_CHAR(LAST_ACTION_TIMESTAMP, 'YYYY-MM-DD HH24:MI:SS') as LAT

For the field:

   <field column="LAT" name="last_action_timestamp"
dateTimeFormat="yyyy-MM-dd hh:mm:ss" />

Converting from timestamp to string and back to timestamp does not seem
like a good approach to me. Is there a better way?

Thanks for any hints.

Ciao
  Peter Schütt



Re: java.lang.IllegalArgumentException: No enum const class org.apache.lucene.util.Version.LUCENE_43

2013-05-08 Thread Roald
I thought it reported 4.2.1 because I set luceneMatchVersion to LUCENE_42.

I am using the 4.3.0 war. Very strange.

I will set up a new virtual machine to make sure there is no way that I am
accidentally using 4.2.1

On Wed, May 8, 2013 at 3:06 PM, Alan Woodward a...@flax.co.uk wrote:

 Hi Roald,

 On the ticket, you report the following version information:
 solr-spec : 4.2.1.2013.03.26.08.26.55
 solr-impl : 4.2.1 1461071 - mark - 2013-03-26 08:26:55
 lucene-spec : 4.2.1
 lucene-impl : 4.2.1 1461071 - mark - 2013-03-26 08:23:34

 This shows that your servlet container is running 4.2.1, not 4.3.  So the
 example solrconfig.xml from 4.3 won't work here.

 Alan Woodward
 www.flax.co.uk


 On 8 May 2013, at 12:52, Roald wrote:

  Hi all,
 
  I just reported this issue:
 http://issues.apache.org/jira/browse/SOLR-4800
 
  java.lang.IllegalArgumentException: No enum const class
  org.apache.lucene.util.Version.LUCENE_43
 
  solr-4.3.0/example/solr/collection1/conf/solrconfig.xml has
  <luceneMatchVersion>LUCENE_43</luceneMatchVersion>
 
  Which causes:
 
  SolrCore Initialization Failures
 
  collection1:
 
 org.apache.solr.common.SolrException:org.apache.solr.common.SolrException:
  Could not load config for solrconfig.xml
 
  From catalina.out :
 
  SEVERE: Unable to create core: collection1
  org.apache.solr.common.SolrException: Could not load config for
  solrconfig.xml
  at
 
 org.apache.solr.core.CoreContainer.createFromLocal(CoreContainer.java:991)
  at org.apache.solr.core.CoreContainer.create(CoreContainer.java:1051)
  at org.apache.solr.core.CoreContainer$3.call(CoreContainer.java:634)
  at org.apache.solr.core.CoreContainer$3.call(CoreContainer.java:629)
  at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
  at java.util.concurrent.FutureTask.run(FutureTask.java:166)
  at
 java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
  at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
  at java.util.concurrent.FutureTask.run(FutureTask.java:166)
  at
 
 java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1146)
  at
 
 java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
  at java.lang.Thread.run(Thread.java:679)
  Caused by: org.apache.solr.common.SolrException: Invalid
 luceneMatchVersion
  'LUCENE_43', valid values are: [LUCENE_30, LUCENE_31, LUCENE_32,
 LUCENE_33,
  LUCENE_34, LUCENE_35, LUCENE_36, LUCENE_40, LUCENE_41, LUCENE_42,
  LUCENE_CURRENT] or a string in format 'V.V'
  at org.apache.solr.core.Config.parseLuceneVersionString(Config.java:313)
  at org.apache.solr.core.Config.getLuceneVersion(Config.java:298)
  at org.apache.solr.core.SolrConfig.<init>(SolrConfig.java:119)
  at
 
 org.apache.solr.core.CoreContainer.createFromLocal(CoreContainer.java:989)
  ... 11 more
  Caused by: java.lang.IllegalArgumentException: No enum const class
  org.apache.lucene.util.Version.LUCENE_43
  at java.lang.Enum.valueOf(Enum.java:214)
  at org.apache.lucene.util.Version.valueOf(Version.java:34)
  at org.apache.lucene.util.Version.parseLeniently(Version.java:133)
  at org.apache.solr.core.Config.parseLuceneVersionString(Config.java:311)
  ... 14 more
  May 7, 2013 9:10:00 PM org.apache.solr.common.SolrException log
  SEVERE: null:org.apache.solr.common.SolrException: Unable to create core:
  collection1
  at
 
 org.apache.solr.core.CoreContainer.recordAndThrow(CoreContainer.java:1672)
  at org.apache.solr.core.CoreContainer.create(CoreContainer.java:1057)
  at org.apache.solr.core.CoreContainer$3.call(CoreContainer.java:634)
  at org.apache.solr.core.CoreContainer$3.call(CoreContainer.java:629)
  at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
  at java.util.concurrent.FutureTask.run(FutureTask.java:166)
  at
 java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
  at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
  at java.util.concurrent.FutureTask.run(FutureTask.java:166)
  at
 
 java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1146)
  at
 
 java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
  at java.lang.Thread.run(Thread.java:679)
  Caused by: org.apache.solr.common.SolrException: Could not load config
 for
  solrconfig.xml
  at
 
 org.apache.solr.core.CoreContainer.createFromLocal(CoreContainer.java:991)
  at org.apache.solr.core.CoreContainer.create(CoreContainer.java:1051)
  ... 10 more
  Caused by: org.apache.solr.common.SolrException: Invalid
 luceneMatchVersion
  'LUCENE_43', valid values are: [LUCENE_30, LUCENE_31, LUCENE_32,
 LUCENE_33,
  LUCENE_34, LUCENE_35, LUCENE_36, LUCENE_40, LUCENE_41, LUCENE_42,
  LUCENE_CURRENT] or a string in format 'V.V'
  at org.apache.solr.core.Config.parseLuceneVersionString(Config.java:313)
  at org.apache.solr.core.Config.getLuceneVersion(Config.java:298)
  at org.apache.solr.core.SolrConfig.<init>(SolrConfig.java:119)
  at
 
 

Re: Oracle Timestamp in SOLR

2013-05-08 Thread Michael Della Bitta
Peter,

Looks like you can call timestampValue() on that object and get back a
java.sql.Timestamp, which is a subclass of java.util.Date:
http://docs.oracle.com/cd/E16338_01/appdev.112/e13995/oracle/sql/TIMESTAMP.html#timestampValue__
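
For illustration, a minimal sketch of that conversion in a custom indexing
client (hypothetical variable names; assumes the Oracle JDBC driver is on
the classpath):

  // the raw value comes back from JDBC as oracle.sql.TIMESTAMP
  oracle.sql.TIMESTAMP raw =
      (oracle.sql.TIMESTAMP) resultSet.getObject("LAST_ACTION_TIMESTAMP");
  // timestampValue() returns java.sql.Timestamp, a java.util.Date subclass
  // that Solr's date fields accept directly
  java.sql.Timestamp ts = raw.timestampValue();
  doc.addField("last_action_timestamp", ts);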

Hope that helps,

Michael Della Bitta


Appinions
18 East 41st Street, 2nd Floor
New York, NY 10017-6271

www.appinions.com

Where Influence Isn’t a Game


On Wed, May 8, 2013 at 9:35 AM, Peter Schütt newsgro...@pstt.de wrote:

 Hello,
 I have a field with the type TIMESTAMP(6) in an oracle view.

 When I want to import it directly to SOLR I get this error message:

 WARNING: Error creating document : SolrInputDocument[oid=12,
 last_action_timestamp=oracle.sql.TIMESTAMP@34907781, status=2, ...]
 org.apache.solr.common.SolrException: Invalid Date
 String:'oracle.sql.TIMESTAMP@
 34907781'
 at org.apache.solr.schema.DateField.parseMath(DateField.java:182)
 at org.apache.solr.schema.TrieField.createField(TrieField.java:616)
 at org.apache.solr.schema.TrieField.createFields
 (TrieField.java:655)

 What is the best way to import it?


 This way works but I do not know if this is the best practise:

 In the query:

 TO_CHAR(LAST_ACTION_TIMESTAMP, 'YYYY-MM-DD HH24:MI:SS') as LAT

 For the field:

    <field column="LAT" name="last_action_timestamp"
 dateTimeFormat="yyyy-MM-dd hh:mm:ss" />

 Conversion from timestamp to string to timestamp seems to me not a good
 way. Is there a better way?

 Thanks for any hints.

 Ciao
   Peter Schütt




Facet which takes sum of a field into account for result values

2013-05-08 Thread ld
Within MySQL it is possible to get the Top N results while summing a
particular column in the database.  For example:
SELECT ip_address, SUM(ip_count) AS count FROM table GROUP BY ip_address
ORDER BY count DESC LIMIT 5

This will return the top 5 ip_address based on the sum of ip_count.

Is there a way to have a Facet query within Solr do the same?  In other
words, count an entry as if there were 'ip_count entries', not just one?

I have used the Stats component and faceting, but this gives me all the
records; there is no way to limit to the top 10 sums.  My data set may have
millions of records with much variation in IP address, so this wouldn’t work.
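
Presumably something like the following, which returns a sum of ip_count per
ip_address via the stats.facet parameter but cannot be limited or sorted by
that sum (a sketch; hypothetical handler path):

  /select?q=*:*&rows=0&stats=true&stats.field=ip_count&stats.facet=ip_address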

I have also considered adding ip_count number of entries when writing to
solr but this causes some issues with the unique ID shared with legacy code
that still uses MySQL.

Any help is appreciated.  






Re: Facet which takes sum of a field into account for result values

2013-05-08 Thread Carlos Bonilla
Hi,
have a look at http://wiki.apache.org/solr/TermsComponent.

Regards,
Carlos.


2013/5/8 ld luzange...@gmail.com

 Within MySQL it is possible to get the Top N results while summing a
 particular column in the database.  For example:
 SELECT ip_address, SUM(ip_count) AS count FROM table GROUP BY ip_address
 ORDER BY count DESC LIMIT 5

 This will return the top 5 ip_address based on the sum of ip_count.

 Is there a way to have a Facet query within Solr do the same?  In other
 words, count an entry as if there were 'ip_count entries', not just one?

 I have used the Stats component and faceting but this gives me all the
 records, there is no way to limit to the top 10 sums.  My data set may have
 millions of records with much variation on IP address so this wouldn’t
 work.

 I have also considered adding ip_count number of entries when writing to
 solr but this causes some issues with the unique ID shared with legacy code
 that still uses MySQL.

 Any help is appreciated.









Outstanding Jira issue

2013-05-08 Thread Shane Perry
I opened a Jira issue in Oct of 2011 which is still outstanding. I've
boosted the priority to Critical as each time I've upgraded Solr, I've had
to manually patch and build the jars.   There is a patch (for 3.6) attached
to the ticket. Is there someone with commit access who can take a look and
poke the fix through (preferably on 4.2 as well as 4.3)?  The ticket is
https://issues.apache.org/jira/browse/SOLR-2834.

Thanks in advance.

Shane


Re: transientCacheSize doesn't seem to have any effect, except on startup

2013-05-08 Thread didier deshommes
Any idea on this? I still cannot get the combination of transient cores and
transientCacheSize to work as I think it should: give me the ability to
create a large number of cores and automatically load and unload them for me
based on a limit that I set.

If anyone else is using this feature and it is working for you, let me know
how you got it working!


On Fri, May 3, 2013 at 2:11 PM, didier deshommes dfdes...@gmail.com wrote:


 On Fri, May 3, 2013 at 11:18 AM, Erick Erickson 
 erickerick...@gmail.comwrote:

 The cores aren't loaded (or at least shouldn't be) for getting the status.
 The _names_ of the cores should be returned, but those are (supposed) to
 be
 retrieved from a list rather than loaded cores. So are you sure that's
 not what
 you are seeing? How are you determining whether the cores are actually
 loaded
 or not?


 I'm looking at the output of :

 $ curl "http://localhost:8983/solr/admin/cores?wt=json&action=status"

 cores that are loaded have a startTime and upTime value. Cores that
 are unloaded don't appear in the output at all. For example, I created 3
 transient cores with transientCacheSize=2 . When I asked for a list of
 all cores, all 3 cores were returned. I explicitly unloaded 1 core and got
 back 2 cores when I asked for the list again.

 It would be nice if cores had an isTransient and an isCurrentlyLoaded
 value so that one could see exactly which cores are loaded.




 That said, it's perfectly possible that the status command is doing
 something we
 didn't anticipate, but I took a quick look at the code (got to rush to a
 plane)
 and CoreAdminHandler _appears_ to be just returning whatever info it can
 about an unloaded core for status. I _think_ you'll get more info if the
 core has ever been loaded though, even though if it's been removed from
 the transient cache. Ditto for the create action.

 So let's figure out whether you're really seeing loaded cores or not, and
 then
 raise a JIRA if so...

 Thanks for reporting!
 Erick

 On Thu, May 2, 2013 at 1:27 PM, didier deshommes dfdes...@gmail.com
 wrote:
  Hi,
  I've been very interested in the transient core feature of solr to
 manage a
  large number of cores. I'm especially interested in this use case, that
 the
  wiki lists at http://wiki.apache.org/solr/LotsOfCores (looks to be down
  now):
 
 loadOnStartup=false transient=true: This is really the use-case. There
 are
  a large number of cores in your system that are short-duration use. You
  want Solr to load them as necessary, but unload them when the cache gets
  full on an LRU basis.
 
  I'm creating 10 transient cores via core admin like so:
 
  $ curl "http://localhost:8983/solr/admin/cores?wt=json&action=CREATE&name=new_core2&instanceDir=collection1/&dataDir=new_core2&transient=true&loadOnStartup=false"
 
  and have transientCacheSize=2 in my solr.xml file, which I take to mean I
  should have at most 2 transient cores loaded at any time. The problem is
  that these cores are still loaded when I ask solr to list cores:
 
  $ curl "http://localhost:8983/solr/admin/cores?wt=json&action=status"
 
  From the explanation in the wiki, it looks like solr would manage
 loading
  and unloading transient cores for me without having to worry about them,
  but this is not what's happening.
 
  The situation is different when I restart solr; it does the right
 thing
  by loading the maximum cores set by transientCacheSize. When I add more
  cores, the old behavior happens again, where all created transient cores
  are loaded in solr.
 
  I'm using the development branch lucene_solr_4_3 to run my example. I
 can
  open a jira if need be.





ERROR: incref on a closed log

2013-05-08 Thread yriveiro
Hi all,

I upgraded my Solr cluster today from 4.2.1 to 4.3. On startup I see some
errors like this:

2449515 [catalina-exec-51] ERROR org.apache.solr.core.SolrCore  –
org.apache.solr.common.SolrException: incref on a closed log:
tlog{file=/opt/node02.solrcloud/solr/home/XXX/data/tlog/tlog.000
refcount=1}
at org.apache.solr.update.TransactionLog.incref(TransactionLog.java:492)
at org.apache.solr.update.UpdateLog.getRecentUpdates(UpdateLog.java:998)
at
org.apache.solr.handler.component.RealTimeGetComponent.processGetVersions(RealTimeGetComponent.java:515)
at
org.apache.solr.handler.component.RealTimeGetComponent.process(RealTimeGetComponent.java:92)
at
org.apache.solr.handler.component.SearchHandler.handleRequestBody(SearchHandler.java:208)
at
org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:135)
at org.apache.solr.core.SolrCore.execute(SolrCore.java:1816)
at
org.apache.solr.servlet.SolrDispatchFilter.execute(SolrDispatchFilter.java:656)
at
org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:359)
at
org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:155)
at
org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:243)
at
org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:210)
at
org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:222)
at
org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:123)
at
org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:171)
at
org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:99)
at
org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:947)
at
org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:118)
at
org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:408)
at
org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1009)
at
org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:589)
at
org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:310)
at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at java.lang.Thread.run(Unknown Source)

Anyone know what could be happening?



-
Best regards
--
View this message in context: 
http://lucene.472066.n3.nabble.com/ERROR-incref-on-a-closed-log-tp4061609.html
Sent from the Solr - User mailing list archive at Nabble.com.


Re: Tokenize Sentence and Set Attribute

2013-05-08 Thread Edward Garrett
I find UpdateRequestProcessors (
http://wiki.apache.org/solr/UpdateRequestProcessor) a handy way to add and
remove NLP-related fields on a document as it is processed by Solr. This is
also how UIMA integrates with Solr (http://wiki.apache.org/solr/SolrUIMA).
You might want to take a look at UIMA as well.
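
as a rough illustration, an update chain in solrconfig.xml might look like
this (the factory class com.example.PosTaggerProcessorFactory is hypothetical
- you would supply your own UpdateRequestProcessorFactory implementation):

<updateRequestProcessorChain name="nlp">
  <!-- hypothetical custom processor: reads "text", writes "pos_tags" -->
  <processor class="com.example.PosTaggerProcessorFactory">
    <str name="sourceField">text</str>
    <str name="tagField">pos_tags</str>
  </processor>
  <processor class="solr.LogUpdateProcessorFactory"/>
  <processor class="solr.RunUpdateProcessorFactory"/>
</updateRequestProcessorChain>

you can then select the chain per request with update.chain=nlp, or make it
the default on your update handler.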


On Mon, May 6, 2013 at 6:22 PM, Jack Krupansky j...@basetechnology.comwrote:

 Sounds like a very ambitious project. I'm sure you COULD do it in Solr,
 but not in very short order.

 Check out some discussion of simply searching within sentences:
  http://markmail.org/message/aoiq62a4mlo25zzk?q=apache#query:apache+page:1+mid:aoiq62a4mlo25zzk+state:results

 First, how do you expect to use/query the corpus?  In other words, what
 are your user requirements? They will determine what structure the Solr
 index, analysis chains, and custom search components will need.

 Also, check out the Solr OpenNLP wiki:
  http://wiki.apache.org/solr/OpenNLP

 And see LUCENE-2899: Add OpenNLP Analysis capabilities as a module:
  https://issues.apache.org/jira/browse/LUCENE-2899

 -- Jack Krupansky

 -Original Message- From: Rendy Bambang Junior
 Sent: Monday, May 06, 2013 11:41 AM
 To: solr-user@lucene.apache.org
 Subject: Tokenize Sentence and Set Attribute


 Hello,

 I am trying to use a part-of-speech tagger for bahasa Indonesia to filter
 tokens in Solr.
 The tagger takes the word list of a sentence as input and returns a tag array.

 I think the process should be like this:
 - tokenize the sentence
 - tokenize the words
 - pass them into the tagger
 - set attributes using the tagger output
 - pass them into a FilteringTokenFilter implementation

 Is it possible to do this in Solr/Lucene? If it is, how?

 I've read about a similar solution for the Japanese language, but since I
 lack an understanding of Japanese, it didn't help much.

 --
 Regards,
 Rendy Bambang Junior
 Informatics Engineering '09
 Bandung Institute of Technology




-- 
edge


Re: java.lang.IllegalArgumentException: No enum const class org.apache.lucene.util.Version.LUCENE_43

2013-05-08 Thread Roald
I solved it by setting up a new virtual machine. Apparently Tomcat was
still using 4.2.1 somehow.

Thanks!

On Wed, May 8, 2013 at 3:40 PM, Roald depja...@gmail.com wrote:

 I thought it reported 4.2.1 because I set luceneMatchVersion to LUCENE_42.

 I am using the the 4.3.0 war. Very strange.

 I will set up a new virtual machine to make sure there is no way that I am
 accidentally using 4.2.1


 On Wed, May 8, 2013 at 3:06 PM, Alan Woodward a...@flax.co.uk wrote:

 Hi Roald,

 On the ticket, you report the following version information:
 solr-spec : 4.2.1.2013.03.26.08.26.55
 solr-impl : 4.2.1 1461071 - mark - 2013-03-26 08:26:55
 lucene-spec : 4.2.1
 lucene-impl : 4.2.1 1461071 - mark - 2013-03-26 08:23:34

 This shows that your servlet container is running 4.2.1, not 4.3.  So the
 example solrconfig.xml from 4.3 won't work here.

 Alan Woodward
 www.flax.co.uk


 On 8 May 2013, at 12:52, Roald wrote:

  Hi all,
 
  I just reported this issue:
 http://issues.apache.org/jira/browse/SOLR-4800
 
  java.lang.IllegalArgumentException: No enum const class
  org.apache.lucene.util.Version.LUCENE_43
 
  solr-4.3.0/example/solr/collection1/conf/solrconfig.xml has
  <luceneMatchVersion>LUCENE_43</luceneMatchVersion>
 
  Which causes:
 
  SolrCore Initialization Failures
 
  collection1:
 
 org.apache.solr.common.SolrException:org.apache.solr.common.SolrException:
  Could not load config for solrconfig.xml
 
  From catalina.out :
 
  SEVERE: Unable to create core: collection1
  org.apache.solr.common.SolrException: Could not load config for
  solrconfig.xml
  at
 
 org.apache.solr.core.CoreContainer.createFromLocal(CoreContainer.java:991)
  at org.apache.solr.core.CoreContainer.create(CoreContainer.java:1051)
  at org.apache.solr.core.CoreContainer$3.call(CoreContainer.java:634)
  at org.apache.solr.core.CoreContainer$3.call(CoreContainer.java:629)
  at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
  at java.util.concurrent.FutureTask.run(FutureTask.java:166)
  at
 java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
  at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
  at java.util.concurrent.FutureTask.run(FutureTask.java:166)
  at
 
 java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1146)
  at
 
 java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
  at java.lang.Thread.run(Thread.java:679)
  Caused by: org.apache.solr.common.SolrException: Invalid
 luceneMatchVersion
  'LUCENE_43', valid values are: [LUCENE_30, LUCENE_31, LUCENE_32,
 LUCENE_33,
  LUCENE_34, LUCENE_35, LUCENE_36, LUCENE_40, LUCENE_41, LUCENE_42,
  LUCENE_CURRENT] or a string in format 'V.V'
  at org.apache.solr.core.Config.parseLuceneVersionString(Config.java:313)
  at org.apache.solr.core.Config.getLuceneVersion(Config.java:298)
  at org.apache.solr.core.SolrConfig.<init>(SolrConfig.java:119)
  at
 
 org.apache.solr.core.CoreContainer.createFromLocal(CoreContainer.java:989)
  ... 11 more
  Caused by: java.lang.IllegalArgumentException: No enum const class
  org.apache.lucene.util.Version.LUCENE_43
  at java.lang.Enum.valueOf(Enum.java:214)
  at org.apache.lucene.util.Version.valueOf(Version.java:34)
  at org.apache.lucene.util.Version.parseLeniently(Version.java:133)
  at org.apache.solr.core.Config.parseLuceneVersionString(Config.java:311)
  ... 14 more
  May 7, 2013 9:10:00 PM org.apache.solr.common.SolrException log
  SEVERE: null:org.apache.solr.common.SolrException: Unable to create
 core:
  collection1
  at
 
 org.apache.solr.core.CoreContainer.recordAndThrow(CoreContainer.java:1672)
  at org.apache.solr.core.CoreContainer.create(CoreContainer.java:1057)
  at org.apache.solr.core.CoreContainer$3.call(CoreContainer.java:634)
  at org.apache.solr.core.CoreContainer$3.call(CoreContainer.java:629)
  at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
  at java.util.concurrent.FutureTask.run(FutureTask.java:166)
  at
 java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
  at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
  at java.util.concurrent.FutureTask.run(FutureTask.java:166)
  at
 
 java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1146)
  at
 
 java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
  at java.lang.Thread.run(Thread.java:679)
  Caused by: org.apache.solr.common.SolrException: Could not load config
 for
  solrconfig.xml
  at
 
 org.apache.solr.core.CoreContainer.createFromLocal(CoreContainer.java:991)
  at org.apache.solr.core.CoreContainer.create(CoreContainer.java:1051)
  ... 10 more
  Caused by: org.apache.solr.common.SolrException: Invalid
 luceneMatchVersion
  'LUCENE_43', valid values are: [LUCENE_30, LUCENE_31, LUCENE_32,
 LUCENE_33,
  LUCENE_34, LUCENE_35, LUCENE_36, LUCENE_40, LUCENE_41, LUCENE_42,
  LUCENE_CURRENT] or a string in format 'V.V'
  at 

RE: numFound changes on changing start and rows

2013-05-08 Thread Jie Sun
any update on this?

will this be addressed/fixed? 

In our system, our UI allows the user to paginate through search results.

As my in-depth testing found out: if rows=0, the result size is consistently
the total sum of the documents on all shards, regardless of whether there are
any duplicates; if rows is larger than the expected merged document count, the
returned numFound is accurate and consistent; however, if rows is smaller than
the expected merged result size, numFound is non-deterministic.

Unfortunately, in our system it is not easy to work around this problem. We
have to issue a query whenever the user clicks the Next button, and rows is
20 in our case; in most cases that is smaller than the merged result size, so
we get a different number each time.

Doing rows=0 up front won't work either, since we want an accurate number and
others may have indexed new documents in the meantime. Especially when the
user hits the last page, we sometimes see numFound off by hundreds, so this
won't work.

Please advise.
thanks
Jie



--
View this message in context: 
http://lucene.472066.n3.nabble.com/numFound-changes-on-changing-start-and-rows-tp3999752p4061628.html
Sent from the Solr - User mailing list archive at Nabble.com.


RE: numFound changes on changing start and rows

2013-05-08 Thread Jie Sun
OK, now that my head has cooled down, I remember this old-school issue... one
I have been dealing with myself.

So I do not expect this can be straightened out or fixed easily.

Basically, when you have two sorted result sets that you need to merge and
paginate through, it is never easy (if possible at all) to figure out the
exact count when you only retrieve a portion of the results.

For example, if one set has 40,000 rows and the other has 50,000, and you
want start=440 and rows=20 (paginating in the UI), the typical algorithm
sorts both sets, returns the nearby portion of each, and tosses away the
duplicates in that range (20 rows). So even if you account for the duplicates
prior to that start point, you have no way to tell how many duplicates come
after it, so you really cannot know the exact / accurate numFound unless you
retrieve the whole result set. That is why, when I pass a huge rows number, I
get the accurate count every time. However, a Solr shard query will throw a
500 server error if the returned set is around 50k, which is reasonable.

So finding a workaround in your own context is the only solution. Look at how
Google presents its result counts - you may get some fuzzy ideas :-)
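
To illustrate (toy code, not Solr internals - just why a partial merge cannot
produce an exact count when ids may repeat across shards):

import java.util.*;

public class MergeSketch {
    public static void main(String[] args) {
        // sorted ids returned by two shards; 2, 5 and 9 exist on both
        List<Integer> shardA = Arrays.asList(1, 2, 3, 5, 8, 9); // numFound=6
        List<Integer> shardB = Arrays.asList(2, 4, 5, 6, 7, 9); // numFound=6

        // what rows=0 effectively reports: the plain sum of per-shard counts
        int upperBound = shardA.size() + shardB.size(); // 12

        // merge only a small "page" (first 3 docs from each shard)
        Set<Integer> page = new TreeSet<Integer>();
        page.addAll(shardA.subList(0, 3)); // [1, 2, 3]
        page.addAll(shardB.subList(0, 3)); // [2, 4, 5]

        // the duplicate visible in both pages (2) can be subtracted, but
        // duplicates hidden beyond a page (5 on shardA, 9 on both) are
        // invisible, so any numFound derived from the page is an estimate
        System.out.println("upper bound=" + upperBound + " page=" + page);
    }
}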

thanks
jie 



--
View this message in context: 
http://lucene.472066.n3.nabble.com/numFound-changes-on-changing-start-and-rows-tp3999752p4061633.html
Sent from the Solr - User mailing list archive at Nabble.com.


Re: Facet which takes sum of a field into account for result values

2013-05-08 Thread ld
Unfortunately, terms do not help solve my issue.

To elaborate - say I have 4 entries:
uuid - ipaddress - ipcount
1   1.1.1.1   80
2   2.2.2.2   1
3   3.3.3.3   20
4   3.3.3.3   20

When I run a facet query on the ipaddress field, I get the following results:

http://localhost:8983/solr/alerts/select?q=*:*&facet=true&facet.mincount=1&facet.limit=10&facet.field=ipaddress

<lst name="facet_fields">
<lst name="ipaddress">
<int name="3.3.3.3">2</int>
<int name="1.1.1.1">1</int>
<int name="2.2.2.2">1</int>
</lst>
</lst>

BUT what I would like is to force the facet query to use the ipcount as the
sum, like this:

<lst name="facet_fields">
<lst name="ipaddress">
<int name="3.3.3.3">40</int>
<int name="1.1.1.1">80</int>
<int name="2.2.2.2">1</int>
</lst>
</lst>

Using the stats component with faceting gives me what I want, but because I
cannot limit it, I worry that processing the data after the query will take
a long time.

Thanks





--
View this message in context: 
http://lucene.472066.n3.nabble.com/Facet-which-takes-sum-of-a-field-into-account-for-result-values-tp4061588p4061636.html
Sent from the Solr - User mailing list archive at Nabble.com.


Re: Query Elevation exception on shard queries

2013-05-08 Thread varun srivastava
OK, found the solution. Like the SpellcheckComponent, the elevate component
also requires the shards.qt param. But I still don't know why these
components don't work in the absence of shards.qt. Can anyone explain?
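
A sketch of the kind of request that needs the extra param (the /elevate
handler name is just an example - use whichever handler has your elevation
component configured):

curl "http://localhost:8983/solr/core1/elevate?q=foo&shards=localhost:8983/solr/core1,localhost:8983/solr/core2&shards.qt=/elevate"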

Thanks
Varun


On Mon, May 6, 2013 at 1:14 PM, varun srivastava varunmail...@gmail.comwrote:

 Thanks Ravi. So then it is a bug .


 On Mon, May 6, 2013 at 12:04 PM, Ravi Solr ravis...@gmail.com wrote:

 Varun,
   Since our cores were totally disjoint, i.e. they pertain to two
  different applications which may or may not have results for a given
  query, we moved the elevation outside of Solr into our Java code. As long
  as both cores had some results to return for a given query, elevation
  would work.

 Thanks,

 Ravi


 On Sat, May 4, 2013 at 1:54 PM, varun srivastava varunmail...@gmail.com
 wrote:

  Hi Ravi,
   I am getting same probelm . You got any solution ?
 
  Thanks
  Varun
 
 
  On Fri, Mar 29, 2013 at 11:48 AM, Ravi Solr ravis...@gmail.com wrote:
 
   Hello,
 We have a Solr 3.6.2 multicore setup, where each core is a
 complete
   index for one application. In our site search we use sharded query to
  query
    two cores at a time. The issue is: if one core has docs for an elevated
    query but the other core doesn't, Solr throws a 500 error. I would
    really appreciate it if somebody could point me in the right direction
    on how to avoid this error. The following is my query:
  
  
  
 
 [#|2013-03-29T13:44:55.609-0400|INFO|sun-appserver2.1|org.apache.solr.core.SolrCore|_ThreadID=22;_ThreadName=httpSSLWorkerThread-9001-0;|[core1]
   webapp=/solr path=/select/
  
  
 
  params={q=civil+war&start=0&rows=10&shards=localhost:/solr/core1,localhost:/solr/core2&hl=true&hl.fragsize=0&hl.snippets=5&hl.simple.pre=<strong>&hl.simple.post=</strong>&hl.fl=body&fl=*&facet=true&facet.field=type&facet.mincount=1&facet.method=enum&fq=pubdate:[2005-01-01T00:00:00Z+TO+NOW/DAY%2B1DAY]&facet.query={!ex%3Ddt+key%3DPast+24+Hours}pubdate:[NOW/DAY-1DAY+TO+NOW/DAY%2B1DAY]&facet.query={!ex%3Ddt+key%3DPast+7+Days}pubdate:[NOW/DAY-7DAYS+TO+NOW/DAY%2B1DAY]&facet.query={!ex%3Ddt+key%3DPast+60+Days}pubdate:[NOW/DAY-60DAYS+TO+NOW/DAY%2B1DAY]&facet.query={!ex%3Ddt+key%3DPast+12+Months}pubdate:[NOW/DAY-1YEAR+TO+NOW/DAY%2B1DAY]&facet.query={!ex%3Ddt+key%3DAll+Since+2005}pubdate:[*+TO+NOW/DAY%2B1DAY]}
   status=500 QTime=15 |#]
  
  
    As you can see, the 2 cores are core1 and core2. core1 has data for the
    query 'civil war'; however, core2 doesn't have any data. We have 'civil
    war' in the elevate.xml, which causes Solr to throw a SolrException as
    follows. However, if I remove the elevate entry for this query,
    everything works well.
  
    type: Status report

    message: Index: 1, Size: 0 java.lang.IndexOutOfBoundsException: Index: 1,
    Size: 0 at java.util.ArrayList.RangeCheck(ArrayList.java:547) at
   java.util.ArrayList.get(ArrayList.java:322) at
   org.apache.solr.common.util.NamedList.getVal(NamedList.java:137) at
  
  
 
 org.apache.solr.handler.component.ShardFieldSortedHitQueue$ShardComparator.sortVal(ShardDoc.java:221)
   at
  
  
 
 org.apache.solr.handler.component.ShardFieldSortedHitQueue$2.compare(ShardDoc.java:260)
   at
  
  
 
 org.apache.solr.handler.component.ShardFieldSortedHitQueue.lessThan(ShardDoc.java:160)
   at
  
  
 
 org.apache.solr.handler.component.ShardFieldSortedHitQueue.lessThan(ShardDoc.java:101)
   at
 org.apache.lucene.util.PriorityQueue.upHeap(PriorityQueue.java:223) at
   org.apache.lucene.util.PriorityQueue.add(PriorityQueue.java:132) at
  
  
 
 org.apache.lucene.util.PriorityQueue.insertWithOverflow(PriorityQueue.java:148)
   at
  
  
 
 org.apache.solr.handler.component.QueryComponent.mergeIds(QueryComponent.java:786)
   at
  
  
 
 org.apache.solr.handler.component.QueryComponent.handleRegularResponses(QueryComponent.java:587)
   at
  
  
 
 org.apache.solr.handler.component.QueryComponent.handleResponses(QueryComponent.java:566)
   at
  
  
 
 org.apache.solr.handler.component.SearchHandler.handleRequestBody(SearchHandler.java:283)
   at
  
  
 
 org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:129)
   at org.apache.solr.core.SolrCore.execute(SolrCore.java:1376) at
  
  
 
 org.apache.solr.servlet.SolrDispatchFilter.execute(SolrDispatchFilter.java:365)
   at
  
  
 
 org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:260)
   at
  
  
 
 org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:246)
   at
  
  
 
 org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:214)
   at
  
  
 
 org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:313)
   at
  
  
 
 org.apache.catalina.core.StandardContextValve.invokeInternal(StandardContextValve.java:287)
   at
  
  
 
 org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:218)
   at
  
  
 
 

Re: Elevate Problem with Distributed query

2013-05-08 Thread varun srivastava
OK, found the solution. Like the SpellcheckComponent, the elevate component
also requires the shards.qt param. But I still don't know why these
components don't work in the absence of shards.qt. Can anyone explain?

Thanks


On Sat, May 4, 2013 at 1:08 PM, varun srivastava varunmail...@gmail.comwrote:


 I am getting the following exception when the sort fieldname is _elevate_.


 java.lang.IndexOutOfBoundsException: Index: 1, Size: 0\n\tat
 java.util.ArrayList.RangeCheck(ArrayList.java:547)\n\tat
 java.util.ArrayList.get(ArrayList.java:322)\n\tat

 org.apache.solr.common.util.NamedList.getVal(NamedList.java:136)\n\tat
 org.apache.solr.handler.component.ShardFieldSortedHitQueue$ShardComparator.sortVal(ShardDoc.java:217)\n\tat


 org.apache.solr.handler.component.ShardFieldSortedHitQueue$2.compare(ShardDoc.java:255)\n\tat

 org.apache.solr.handler.component.ShardFieldSortedHitQueue.lessThan(ShardDoc.java:159)\n\tat
 org.apache.solr.handler.component.ShardFieldSortedHitQueue.lessThan(ShardDoc.java:101)\n\tat

 org.apache.lucene.util.PriorityQueue.insertWithOverflow(PriorityQueue.java:158)\n\tat
 org.apache.solr.handler.component.QueryComponent.mergeIds(QueryComponent.java:863)\n\tat


 org.apache.solr.handler.component.QueryComponent.handleRegularResponses(QueryComponent.java:626)\n\tat


 org.apache.solr.handler.component.QueryComponent.handleResponses(QueryComponent.java:605)\n\tat


 org.apache.solr.handler.component.SearchHandler.handleRequestBody(SearchHandler.java:309)\n\tat

 org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:129)\n\tat
 org.apache.solr.core.SolrCore.execute(SolrCore.java:1699)\n\tat

 org.apache.solr.servlet.SolrDispatchFilter.execute(SolrDispatchFilter.java:455)\n\tat
 org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:276)\n\tat

 org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:243)\n\tat
 org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:210)\n\tat

 org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:225)\n\tat
 org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:123)\n\tat

 org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:168)\n\tat
 org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:98)\n\tat

 org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:927)\n\tat
 org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:118)\n\tat

 org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:407)\n\tat
 org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1001)\n\tat

 org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:585)\n\tat
 org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:310)\n\tat

 java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)\n\tat
 java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)\n\tat

 java.lang.Thread.run(Thread.java:662)



 On Sat, May 4, 2013 at 11:10 AM, varun srivastava 
 varunmail...@gmail.comwrote:

 Hi,
   Is the query elevation feature supposed to work with distributed queries? I
  have 2 shards, but when I do a distributed query I get the following
  exception. I am using Solr 4.0.0.


  In the following bug, Yonik refers to the problem in his comment:

  https://issues.apache.org/jira/browse/SOLR-2949?focusedCommentId=13232736&page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#comment-13232736

  But it seems the bug was fixed in 4.0, so why am I getting the following
  exception with the _elevate_ fieldname?

  java.lang.IndexOutOfBoundsException: Index: 1, Size: 0\n\tat
 java.util.ArrayList.RangeCheck(ArrayList.java:547)\n\tat
 java.util.ArrayList.get(ArrayList.java:322)\n\tat
 org.apache.solr.common.util.NamedList.getVal(NamedList.java:136)\n\tat
 org.apache.solr.handler.component.ShardFieldSortedHitQueue$ShardComparator.sortVal(ShardDoc.java:217)\n\tat
 org.apache.solr.handler.component.ShardFieldSortedHitQueue$2.compare(ShardDoc.java:255)\n\tat
 org.apache.solr.handler.component.ShardFieldSortedHitQueue.lessThan(ShardDoc.java:159)\n\tat
 org.apache.solr.handler.component.ShardFieldSortedHitQueue.lessThan(ShardDoc.java:101)\n\tat
 org.apache.lucene.util.PriorityQueue.insertWithOverflow(PriorityQueue.java:158)\n\tat
 org.apache.solr.handler.component.QueryComponent.mergeIds(QueryComponent.java:863)\n\tat
 org.apache.solr.handler.component.QueryComponent.handleRegularResponses(QueryComponent.java:626)\n\tat
 org.apache.solr.handler.component.QueryComponent.handleResponses(QueryComponent.java:605)\n\tat
 org.apache.solr.handler.component.SearchHandler.handleRequestBody(SearchHandler.java:309)\n\tat
 org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:129)\n\tat
 

Re: Outstanding Jira issue

2013-05-08 Thread Shawn Heisey

On 5/8/2013 9:20 AM, Shane Perry wrote:

I opened a Jira issue in Oct of 2011 which is still outstanding. I've
boosted the priority to Critical as each time I've upgraded Solr, I've had
to manually patch and build the jars.   There is a patch (for 3.6) attached
to the ticket. Is there someone with commit access who can take a look and
poke the fix through (preferably on 4.2 as well as 4.3)?  The ticket is
https://issues.apache.org/jira/browse/SOLR-2834.


Your patch just ignores the problem so the request doesn't crash; it 
doesn't fix it.  We need to fix whatever the problem is in 
HTMLStripCharFilter.


I had hoped I could come up with a quick fix, but it's proving too 
difficult for me to unravel.  I can't even figure out how it works on good 
analysis components like WhitespaceTokenizer, so I definitely can't see 
what the problem is for HTMLStripCharFilter.


Thanks,
Shawn



Solr 4.3 fails in startup when dataimporthandler declaration is included in solrconfig.xml

2013-05-08 Thread William Pierce
Hi, 

I have gotten solr 4.3 up and running on tomcat7/windows7.  I have added the 
two dataimport handler jars (found in the dist folder of my solr 4.3 download) 
to the tomcat/lib folder (where I also placed the solr.war).   

Then I added the following lines to my solrconfig.xml:

<requestHandler name="/dataimport" 
class="org.apache.solr.handler.dataimport.DataImportHandler">
<lst name="defaults">
  <str name="config">dih-config.xml</str>
</lst>
</requestHandler>

When I start tomcat, I get the stack trace shown below (commenting out the 
above lines causes tomcat & solr to start up just fine).  

ERROR - 2013-05-08 10:43:48.185; org.apache.solr.core.CoreContainer; Unable to 
create core: collection1
org.apache.solr.common.SolrException: org/apache/solr/util/plugin/SolrCoreAware
at org.apache.solr.core.SolrCore.<init>(SolrCore.java:821)
at org.apache.solr.core.SolrCore.<init>(SolrCore.java:618)
at 
org.apache.solr.core.CoreContainer.createFromLocal(CoreContainer.java:949)
at org.apache.solr.core.CoreContainer.create(CoreContainer.java:984)
at org.apache.solr.core.CoreContainer$2.call(CoreContainer.java:597)
at org.apache.solr.core.CoreContainer$2.call(CoreContainer.java:592)
at java.util.concurrent.FutureTask$Sync.innerRun(Unknown Source)
at java.util.concurrent.FutureTask.run(Unknown Source)
at java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source)
at java.util.concurrent.FutureTask$Sync.innerRun(Unknown Source)
at java.util.concurrent.FutureTask.run(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at java.lang.Thread.run(Unknown Source)
Caused by: java.lang.NoClassDefFoundError: 
org/apache/solr/util/plugin/SolrCoreAware
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(Unknown Source)
at java.security.SecureClassLoader.defineClass(Unknown Source)
at java.net.URLClassLoader.defineClass(Unknown Source)
at java.net.URLClassLoader.access$100(Unknown Source)
at java.net.URLClassLoader$1.run(Unknown Source)
at java.net.URLClassLoader$1.run(Unknown Source)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Unknown Source)
at 
org.apache.catalina.loader.WebappClassLoader.loadClass(WebappClassLoader.java:1700)
at java.lang.ClassLoader.loadClass(Unknown Source)
at java.net.FactoryURLClassLoader.loadClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
at java.net.FactoryURLClassLoader.loadClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Unknown Source)
at 
org.apache.solr.core.SolrResourceLoader.findClass(SolrResourceLoader.java:448)
at 
org.apache.solr.core.SolrResourceLoader.findClass(SolrResourceLoader.java:396)
at org.apache.solr.core.SolrCore.createInstance(SolrCore.java:518)
at org.apache.solr.core.SolrCore.createRequestHandler(SolrCore.java:592)
at 
org.apache.solr.core.RequestHandlers.initHandlersFromConfig(RequestHandlers.java:154)
at org.apache.solr.core.SolrCore.<init>(SolrCore.java:758)
... 13 more
Caused by: java.lang.ClassNotFoundException: 
org.apache.solr.util.plugin.SolrCoreAware
at java.net.URLClassLoader$1.run(Unknown Source)
at java.net.URLClassLoader$1.run(Unknown Source)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
... 40 more
ERROR - 2013-05-08 10:43:48.189; org.apache.solr.common.SolrException; 
null:org.apache.solr.common.SolrException: Unable to create core: collection1
at 
org.apache.solr.core.CoreContainer.recordAndThrow(CoreContainer.java:1450)
at org.apache.solr.core.CoreContainer.create(CoreContainer.java:993)
at org.apache.solr.core.CoreContainer$2.call(CoreContainer.java:597)
at org.apache.solr.core.CoreContainer$2.call(CoreContainer.java:592)
at java.util.concurrent.FutureTask$Sync.innerRun(Unknown Source)
at java.util.concurrent.FutureTask.run(Unknown Source)
at java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source)
at java.util.concurrent.FutureTask$Sync.innerRun(Unknown Source)
at java.util.concurrent.FutureTask.run(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at java.lang.Thread.run(Unknown Source)
Caused by: org.apache.solr.common.SolrException: 

Re: Outstanding Jira issue

2013-05-08 Thread Shane Perry
Yeah, I realize my fix is more of a bandage.  While it wouldn't be a good
long-term solution, how about going the path of ignoring unrecognized types
and logging a warning message so the handler doesn't crash?  The Jira ticket
could then be left open (and hopefully assigned) to fix the actual problem.
 This would save consumers from having to avoid the scenario or manually
patch the file to ignore the problem.

On Wed, May 8, 2013 at 11:49 AM, Shawn Heisey s...@elyograg.org wrote:

 On 5/8/2013 9:20 AM, Shane Perry wrote:

 I opened a Jira issue in Oct of 2011 which is still outstanding. I've
 boosted the priority to Critical as each time I've upgraded Solr, I've had
 to manually patch and build the jars.   There is a patch (for 3.6)
 attached
 to the ticket. Is there someone with commit access who can take a look and
 poke the fix through (preferably on 4.2 as well as 4.3)?  The ticket is
  https://issues.apache.org/jira/browse/SOLR-2834.


  Your patch just ignores the problem so the request doesn't crash; it
  doesn't fix it.  We need to fix whatever the problem is in
  HTMLStripCharFilter.

  I had hoped I could come up with a quick fix, but it's proving too
  difficult for me to unravel.  I can't even figure out how it works on good
  analysis components like WhitespaceTokenizer, so I definitely can't see
  what the problem is for HTMLStripCharFilter.

 Thanks,
 Shawn




Re: Solr 4.3 fails in startup when dataimporthandler declaration is included in solrconfig.xml

2013-05-08 Thread Alexandre Rafalovitch
Could be a classloader issue, e.g. the jars in tomcat/lib not being visible
to whatever is trying to load DIH. Have you tried putting those jars
somewhere else and using a <lib> directive in solrconfig.xml instead to
point to them?
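
For example, something like this (adjust dir so it resolves correctly
relative to your core's instanceDir):

<lib dir="../../../dist/" regex="solr-dataimporthandler-.*\.jar" />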

Regards,
   Alex.
On Wed, May 8, 2013 at 2:07 PM, William Pierce evalsi...@hotmail.com wrote:
 I have gotten solr 4.3 up and running on tomcat7/windows7.  I have added the 
 two dataimport handler jars (found in the dist folder of my solr 4.3 
 download) to the tomcat/lib folder (where I also placed the solr.war).

 Then I added the following lines to my solrconfig.xml:

 <requestHandler name="/dataimport" 
 class="org.apache.solr.handler.dataimport.DataImportHandler">
 <lst name="defaults">
   <str name="config">dih-config.xml</str>
 </lst>
 </requestHandler>

 When I start tomcat, I get the stack trace shown below (commenting out the 
 above lines causes tomcat & solr to start up just fine).



Personal blog: http://blog.outerthoughts.com/
LinkedIn: http://www.linkedin.com/in/alexandrerafalovitch
- Time is the quality of nature that keeps events from happening all
at once. Lately, it doesn't seem to be working.  (Anonymous  - via GTD
book)


Re: Solr 4.3 fails in startup when dataimporthandler declaration is included in solrconfig.xml

2013-05-08 Thread William Pierce
Thanks, Alex.  I have tried placing the jars in a folder under solrhome/lib 
or under the instanceDir/lib with appropriate declarations in the 
solrconfig.xml.  I can see the jars being loaded in the logs.  But neither 
configuration seems to work.


Bill

-Original Message- 
From: Alexandre Rafalovitch

Sent: Wednesday, May 08, 2013 11:12 AM
To: solr-user@lucene.apache.org
Subject: Re: Solr 4.3 fails in startup when dataimporthandler declaration is 
included in solrconfig.xml


Could be a classloader issue, e.g. the jars in tomcat/lib not being visible
to whatever is trying to load DIH. Have you tried putting those jars
somewhere else and using a <lib> directive in solrconfig.xml instead to
point to them?

Regards,
  Alex.
On Wed, May 8, 2013 at 2:07 PM, William Pierce evalsi...@hotmail.com 
wrote:
I have gotten solr 4.3 up and running on tomcat7/windows7.  I have added 
the two dataimport handler jars (found in the dist folder of my solr 4.3 
download) to the tomcat/lib folder (where I also placed the solr.war).


Then I added the following lines to my solrconfig.xml:

<requestHandler name="/dataimport" 
class="org.apache.solr.handler.dataimport.DataImportHandler">

<lst name="defaults">
  <str name="config">dih-config.xml</str>
</lst>
</requestHandler>

When I start tomcat, I get the stack trace shown below (commenting out the 
above lines causes tomcat & solr to start up just fine).




Personal blog: http://blog.outerthoughts.com/
LinkedIn: http://www.linkedin.com/in/alexandrerafalovitch
- Time is the quality of nature that keeps events from happening all
at once. Lately, it doesn't seem to be working.  (Anonymous  - via GTD
book) 



Re: Query using function query result

2013-05-08 Thread Chris Hostetter

: i want to query documents which match a certain dynamic criteria.
: like, How do i get all documents, where sub(field1,field2) > 0 ?
: 
: i tried _val_: sub(field1,field2) and used fq:[_val_:[0 TO *]

take a look at the frange QParser...

https://lucene.apache.org/solr/4_3_0/solr-core/org/apache/solr/search/FunctionRangeQParserPlugin.html

  fq={!frange l=0}sub(field1,field2)

I've updated the wiki to draw more attention to this usage...

https://wiki.apache.org/solr/FunctionQuery#Using_FunctionQuery
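
For example, with the field names from the question, and incl=false to make
the lower bound exclusive (strictly greater than zero rather than >= 0):

  fq={!frange l=0 incl=false}sub(field1,field2)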


-Hoss


Re: Solr 4.3 fails in startup when dataimporthandler declaration is included in solrconfig.xml

2013-05-08 Thread Alexandre Rafalovitch
I'd say it is still a CLASSPATH issue. Quick Google shows long history
of complaints (all about Tomcat):
http://www.manning-sandbox.com/thread.jspa?threadID=51061
Personal blog: http://blog.outerthoughts.com/
LinkedIn: http://www.linkedin.com/in/alexandrerafalovitch
- Time is the quality of nature that keeps events from happening all
at once. Lately, it doesn't seem to be working.  (Anonymous  - via GTD
book)


On Wed, May 8, 2013 at 3:15 PM, William Pierce evalsi...@hotmail.com wrote:
 Thanks, Alex.  I have tried placing the jars in a folder under solrhome/lib
 or under the instanceDir/lib with appropriate declarations in the
 solrconfig.xml.  I can see the jars being loaded in the logs.  But neither
 configuration seems to work.

 Bill

 -Original Message- From: Alexandre Rafalovitch
 Sent: Wednesday, May 08, 2013 11:12 AM
 To: solr-user@lucene.apache.org
 Subject: Re: Solr 4.3 fails in startup when dataimporthandler declaration is
 included in solrconfig.xml


 Could be classloader issue. E.g. the jars in tomcat/lib not visible to
 whatever is trying to load DIH. Have you tried putting those jars
 somewhere else and using lib directive in solrconfig.xml instead to
 point to them?

 Regards,
   Alex.
 On Wed, May 8, 2013 at 2:07 PM, William Pierce evalsi...@hotmail.com
 wrote:

 I have gotten solr 4.3 up and running on tomcat7/windows7.  I have added
 the two dataimport handler jars (found in the dist folder of my solr 4.3
 download) to the tomcat/lib folder (where I also placed the solr.war).

 Then I added the following lines to my solrconfig.xml:

 <requestHandler name="/dataimport"
 class="org.apache.solr.handler.dataimport.DataImportHandler">
 <lst name="defaults">
   <str name="config">dih-config.xml</str>
 </lst>
 </requestHandler>

 When I start tomcat, I get the stack trace shown below (commenting out the
 above lines causes tomcat & solr to start up just fine).




 Personal blog: http://blog.outerthoughts.com/
 LinkedIn: http://www.linkedin.com/in/alexandrerafalovitch
 - Time is the quality of nature that keeps events from happening all
 at once. Lately, it doesn't seem to be working.  (Anonymous  - via GTD
 book)


Re: solr 4.2.1 and docValues

2013-05-08 Thread Chris Hostetter

: Questions:

: - what is the advantage of having indexed=true and docvalues=true?

indexed=true and docValues=true are orthogonal.  It might make sense 
to use both if you wanted to do term queries on the field but also 
faceting -- because indexed terms are generally faster for queries, but 
docvalues may be faster for faceting if your index is changing rapidly.

: - what if default="" also for the popularity int field?

that would not be legal since "" is not a valid int value.

http://wiki.apache.org/solr/DocValues
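
for illustration, a declaration combining both (with a real default instead 
of the empty string) might look like:

  <field name="popularity" type="int" indexed="true" stored="true" 
         docValues="true" default="0"/>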


-Hoss


Re: Numeric fields and payload

2013-05-08 Thread Chris Hostetter

: is it possible to store (text) payload to numeric fields (class 
: solr.TrieDoubleField)?  My goal is to store measure units to numeric 
: features - e.g. '1.5 cm' - and to use faceted search with these fields. 
: But the field type doesn't allow analyzers to add the payload data. I 
: want to avoid database access to load the units. I'm using Solr 4.2 .

I'm not sure if it's possible to add payloads to Trie fields, but even if 
there is a way, I don't think you really want that for your use case -- I 
think it would make a lot more sense to normalize your units so you get 
consistent sorting, range queries, and faceting on the values regardless of 
whether it's 100cm or 1000mm or 1m.
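
a client-side normalization sketch (the helper is made up, illustration 
only) -- convert everything to one base unit before indexing the number, and 
keep the original display string ("1.5 cm") in a separate stored field:

import java.util.HashMap;
import java.util.Map;

public class UnitNormalizer {
    private static final Map<String, Double> TO_MM = new HashMap<String, Double>();
    static {
        TO_MM.put("mm", 1.0);
        TO_MM.put("cm", 10.0);
        TO_MM.put("m", 1000.0);
    }

    /** "1.5 cm" -> 15.0 (millimetres); throws on unknown input. */
    public static double toMillimetres(String value) {
        String[] parts = value.trim().split("\\s+");
        Double factor = TO_MM.get(parts[1]);
        if (factor == null) {
            throw new IllegalArgumentException("unknown unit: " + parts[1]);
        }
        return Double.parseDouble(parts[0]) * factor;
    }
}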


-Hoss


spellcheker and exact match

2013-05-08 Thread hacene
I have created an index that contains "pizza hut", and when I misspell it as
"pizza hot" the spellchecker doesn't return anything. The strange thing is
that it does find "pizza hut" when it is misspelled as "pizza hit".
What is the logic behind this behaviour? Any help?
thank you



--
View this message in context: 
http://lucene.472066.n3.nabble.com/spellcheker-and-exact-match-tp4061672.html
Sent from the Solr - User mailing list archive at Nabble.com.


Re: spellcheck

2013-05-08 Thread hacene
try to remove those in the configuration




--
View this message in context: 
http://lucene.472066.n3.nabble.com/spellcheck-tp506116p4061675.html
Sent from the Solr - User mailing list archive at Nabble.com.


RE: spellcheker and exact match

2013-05-08 Thread Dyer, James
Try setting spellcheck.alternativeTermCount to a nonzero value.  See 
http://wiki.apache.org/solr/SpellCheckComponent#spellcheck.alternativeTermCount

The issue may be that, by default, the spellchecker will never try to offer 
suggestions for a term that exists in the dictionary.  So if some other 
document contains "hot", it won't try to suggest for it.  On the other hand, 
having no documents with "hit" results in suggestions.  
spellcheck.alternativeTermCount tells it to offer suggestions even if the 
term is in the dictionary.
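
For example, something along these lines (assuming a spellcheck component is 
already configured on the handler):

  q=pizza hot&spellcheck=true&spellcheck.count=5&spellcheck.alternativeTermCount=5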

James Dyer
Ingram Content Group
(615) 213-4311

-Original Message-
From: hacene [mailto:hacene.meche...@ypg.com] 
Sent: Wednesday, May 08, 2013 2:12 PM
To: solr-user@lucene.apache.org
Subject: spellcheker and exact match

I have created an index that contains "pizza hut", and when I misspell it as
"pizza hot" the spellchecker doesn't return anything. The strange thing is
that it does find "pizza hut" when it is misspelled as "pizza hit".
What is the logic behind this behaviour? Any help?
thank you



--
View this message in context: 
http://lucene.472066.n3.nabble.com/spellcheker-and-exact-match-tp4061672.html
Sent from the Solr - User mailing list archive at Nabble.com.




Re: Indexing 4 different cores same machine

2013-05-08 Thread Otis Gospodnetic
Hi,

Right, the bottleneck could be something else - memory or network, for
instance.  What are you using to index?  Make sure you're hitting Solr
with multiple threads if your CPU is multi-core.  Use SPM for Solr or
anything else and share some Solr monitoring graphs if you think they
can help.  And/or share some of your indexing code.
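
For example, with SolrJ, ConcurrentUpdateSolrServer fans writes out over
several background threads for you (a sketch - the URL, core name and fields
are made up):

import org.apache.solr.client.solrj.impl.ConcurrentUpdateSolrServer;
import org.apache.solr.common.SolrInputDocument;

public class ParallelIndexer {
    public static void main(String[] args) throws Exception {
        // buffer up to 1000 docs; drain the queue with 4 threads
        ConcurrentUpdateSolrServer server = new ConcurrentUpdateSolrServer(
                "http://localhost:8983/solr/person", 1000, 4);
        for (int i = 0; i < 100000; i++) {
            SolrInputDocument doc = new SolrInputDocument();
            doc.addField("id", Integer.toString(i));
            doc.addField("name_s", "person " + i);
            server.add(doc); // buffered and sent in batches off-thread
        }
        server.blockUntilFinished(); // wait for the queue to drain
        server.commit();
        server.shutdown();
    }
}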

Otis
--
Solr & ElasticSearch Support
http://sematext.com/





On Wed, May 8, 2013 at 10:12 AM, marotosg marot...@gmail.com wrote:
 Hi,

 I have 4 different cores in same machine.
 Person core - 3 million docs   - 20 GB size
 Company Core  - 1 million docs - 2GB size
 Documents Core - 5 million docs - 5GB size
  Emails Core - 50,000 docs  - 200 MB

  While I am indexing data, performance on the server is almost the same
  whether I am indexing only one core or all cores at the same time.

  I thought having different cores allows you to run different threads in
  parallel, gaining some performance.
  Am I right? My server never reaches 100% CPU use; it's always at about 50%
  or even less.
  I had a look at I/O and it is not a problem.

 Any ideas?

 Thanks
 Sergio





 --
 View this message in context: 
 http://lucene.472066.n3.nabble.com/Indexing-4-different-cores-same-machine-tp4061576.html
 Sent from the Solr - User mailing list archive at Nabble.com.


atomic updates w/ double field

2013-05-08 Thread Marcos Mendez

Hi,

I'm using Solr 4.0 and I'm using an atomic update to increment a tdouble 3 
times with the same value (99.4). The third time it is incremented, the value 
comes out to 298.25. Has anyone seen this error, or does anyone know how to 
fix it? Maybe I should use the regular double instead of a tdouble?

1 x "weight_td":{"set":0.0}

 3 x "weight_td":{"inc":99.4}

Schema information:

<dynamicField name="*_d"  type="double"  indexed="true"  stored="true"/>
<dynamicField name="*_td" type="tdouble" indexed="true"  stored="true"/>
<fieldType name="double" class="solr.TrieDoubleField" precisionStep="0" 
positionIncrementGap="0"/>
<fieldType name="tdouble" class="solr.TrieDoubleField" precisionStep="8" 
positionIncrementGap="0"/>




Re: atomic updates w/ double field

2013-05-08 Thread Chris Hostetter

: I'm using solr 4.0 and I'm using an atomic update to increment a tdouble 
: 3 times with the same value (99.4). The third time it is incremented the 
: values comes out to 298.25. Has anyone seen this error or 
: how to fix it? Maybe I should use the regular double instead of a 
: tdouble?

this is the general nature of floating point math in most languages -- 
including java...

http://stackoverflow.com/questions/322749/retain-precision-with-doubles-in-java
http://www.ibm.com/developerworks/java/library/j-math2/index.html

: 1 x "weight_td":{"set":0.0}
: 3 x "weight_td":{"inc":99.4}

public final class Temp {
    public static double val = 0.0D;

    public static void main(String[] args) {
        for (int i = 0; i < 5; i++) {
            System.out.println(i + ") " + val);
            val += 99.4;
        }
    }
}
// OUTPUT...
// 0) 0.0
// 1) 99.4
// 2) 198.8
// 3) 298.25
// 4) 397.6
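
if you need the accumulation to stay exact, the usual workaround is to do 
the math in BigDecimal (or integer cents) on the client and "set" the final 
value, rather than relying on "inc" -- a sketch:

import java.math.BigDecimal;

public final class ExactTemp {
    public static void main(String[] args) {
        BigDecimal val = BigDecimal.ZERO;
        BigDecimal step = new BigDecimal("99.4"); // construct from String!
        for (int i = 0; i < 5; i++) {
            System.out.println(i + ") " + val);
            val = val.add(step);
        }
    }
}
// OUTPUT...
// 0) 0
// 1) 99.4
// 2) 198.8
// 3) 298.2   <- exact, unlike the double version
// 4) 397.6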





-Hoss


Re: Solr 4.3 fails in startup when dataimporthandler declaration is included in solrconfig.xml

2013-05-08 Thread Jan Høydahl
Why did you place solr.war in tomcat/lib?

Can you detail the specific errors you get when you place your DIH jars in 
solr-home/lib or instanceDir/lib?

--
Jan Høydahl, search solution architect
Cominvent AS - www.cominvent.com

8. mai 2013 kl. 21:15 skrev William Pierce evalsi...@hotmail.com:

 Thanks, Alex.  I have tried placing the jars in a folder under solrhome/lib 
 or under the instanceDir/lib with appropriate declarations in the 
 solrconfig.xml.  I can see the jars being loaded in the logs.  But neither 
 configuration seems to work.
 
 Bill
 
 -Original Message- From: Alexandre Rafalovitch
 Sent: Wednesday, May 08, 2013 11:12 AM
 To: solr-user@lucene.apache.org
 Subject: Re: Solr 4.3 fails in startup when dataimporthandler declaration is 
 included in solrconfig.xml
 
  Could be a classloader issue, e.g. the jars in tomcat/lib not being visible
  to whatever is trying to load DIH. Have you tried putting those jars
  somewhere else and using a <lib> directive in solrconfig.xml instead to
  point to them?
 
 Regards,
  Alex.
 On Wed, May 8, 2013 at 2:07 PM, William Pierce evalsi...@hotmail.com wrote:
 I have gotten solr 4.3 up and running on tomcat7/windows7.  I have added the 
 two dataimport handler jars (found in the dist folder of my solr 4.3 
 download) to the tomcat/lib folder (where I also placed the solr.war).
 
  Then I added the following lines to my solrconfig.xml:
  
  <requestHandler name="/dataimport" 
  class="org.apache.solr.handler.dataimport.DataImportHandler">
     <lst name="defaults">
       <str name="config">dih-config.xml</str>
     </lst>
  </requestHandler>
  
  When I start tomcat, I get the stack trace shown below (commenting out the 
  above lines causes tomcat & solr to start up just fine).
 
 
 
 Personal blog: http://blog.outerthoughts.com/
 LinkedIn: http://www.linkedin.com/in/alexandrerafalovitch
 - Time is the quality of nature that keeps events from happening all
 at once. Lately, it doesn't seem to be working.  (Anonymous  - via GTD
 book) 



disabled omitNorms in index but still see default value

2013-05-08 Thread srinir
Hi 

We have a huge index with more than 50 million documents. In the beginning
we disabled norms for some fields by setting omitNorms=true. Recently we
decided to add norms to a few other fields, and we removed omitNorms=true
from the schema.

I read in the solr forum that if one document in any segment has
omitNorms=true, it is copied to all documents during the next merge. I am
confused by this design. What is the purpose behind this behavior?

Looks like we need to wipe the index and start fresh if we have to re-enable
norms for a field.

Thanks
Srini



--
View this message in context: 
http://lucene.472066.n3.nabble.com/disabled-omitNorms-in-index-but-still-see-default-value-tp4061724.html
Sent from the Solr - User mailing list archive at Nabble.com.


Re: Solr 4.3 fails in startup when dataimporthandler declaration is included in solrconfig.xml

2013-05-08 Thread William Pierce
The reason I placed the solr.war in tomcat/lib was -- I guess -- because 
that's the way I had always done it since the 1.3 days.  Our tomcat 
instance(s) run nothing other than solr - so that seemed as good a place as any.


The DIH jars that I placed in the tomcat/lib are: 
solr-dataimporthandler-4.3.0.jar and 
solr-dataimporthandler-extras-4.3.0.jar.  Are there any dependent jars that 
also need to be added that I am unaware of?


On the specific errors - I get a stack trace noted in the first email that 
began this thread but repeated here for convenience:


ERROR - 2013-05-08 10:43:48.185; org.apache.solr.core.CoreContainer; Unable 
to create core: collection1
org.apache.solr.common.SolrException: 
org/apache/solr/util/plugin/SolrCoreAware

   at org.apache.solr.core.SolrCore.<init>(SolrCore.java:821)
   at org.apache.solr.core.SolrCore.<init>(SolrCore.java:618)
   at 
org.apache.solr.core.CoreContainer.createFromLocal(CoreContainer.java:949)

   at org.apache.solr.core.CoreContainer.create(CoreContainer.java:984)
   at org.apache.solr.core.CoreContainer$2.call(CoreContainer.java:597)
   at org.apache.solr.core.CoreContainer$2.call(CoreContainer.java:592)
   at java.util.concurrent.FutureTask$Sync.innerRun(Unknown Source)
   at java.util.concurrent.FutureTask.run(Unknown Source)
   at java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source)
   at java.util.concurrent.FutureTask$Sync.innerRun(Unknown Source)
   at java.util.concurrent.FutureTask.run(Unknown Source)
   at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
   at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
   at java.lang.Thread.run(Unknown Source)
Caused by: java.lang.NoClassDefFoundError: 
org/apache/solr/util/plugin/SolrCoreAware

   at java.lang.ClassLoader.defineClass1(Native Method)
   at java.lang.ClassLoader.defineClass(Unknown Source)
   at java.security.SecureClassLoader.defineClass(Unknown Source)
   at java.net.URLClassLoader.defineClass(Unknown Source)
   at java.net.URLClassLoader.access$100(Unknown Source)
   at java.net.URLClassLoader$1.run(Unknown Source)
   at java.net.URLClassLoader$1.run(Unknown Source)
   at java.security.AccessController.doPrivileged(Native Method)
   at java.net.URLClassLoader.findClass(Unknown Source)
   at java.lang.ClassLoader.loadClass(Unknown Source)
   at java.lang.ClassLoader.loadClass(Unknown Source)
   at java.lang.Class.forName0(Native Method)
   at java.lang.Class.forName(Unknown Source)
   at 
org.apache.catalina.loader.WebappClassLoader.loadClass(WebappClassLoader.java:1700)

   at java.lang.ClassLoader.loadClass(Unknown Source)
   at java.net.FactoryURLClassLoader.loadClass(Unknown Source)
   at java.lang.ClassLoader.loadClass(Unknown Source)
   at java.net.FactoryURLClassLoader.loadClass(Unknown Source)
   at java.lang.ClassLoader.loadClass(Unknown Source)
   at java.lang.Class.forName0(Native Method)
   at java.lang.Class.forName(Unknown Source)
   at 
org.apache.solr.core.SolrResourceLoader.findClass(SolrResourceLoader.java:448)
   at 
org.apache.solr.core.SolrResourceLoader.findClass(SolrResourceLoader.java:396)

   at org.apache.solr.core.SolrCore.createInstance(SolrCore.java:518)
   at org.apache.solr.core.SolrCore.createRequestHandler(SolrCore.java:592)
   at 
org.apache.solr.core.RequestHandlers.initHandlersFromConfig(RequestHandlers.java:154)

   at org.apache.solr.core.SolrCore.<init>(SolrCore.java:758)
   ... 13 more
Caused by: java.lang.ClassNotFoundException: 
org.apache.solr.util.plugin.SolrCoreAware

   at java.net.URLClassLoader$1.run(Unknown Source)
   at java.net.URLClassLoader$1.run(Unknown Source)
   at java.security.AccessController.doPrivileged(Native Method)
   at java.net.URLClassLoader.findClass(Unknown Source)
   at java.lang.ClassLoader.loadClass(Unknown Source)
   at java.lang.ClassLoader.loadClass(Unknown Source)
   ... 40 more
ERROR - 2013-05-08 10:43:48.189; org.apache.solr.common.SolrException; 
null:org.apache.solr.common.SolrException: Unable to create core: 
collection1
   at 
org.apache.solr.core.CoreContainer.recordAndThrow(CoreContainer.java:1450)

   at org.apache.solr.core.CoreContainer.create(CoreContainer.java:993)
   at org.apache.solr.core.CoreContainer$2.call(CoreContainer.java:597)
   at org.apache.solr.core.CoreContainer$2.call(CoreContainer.java:592)
   at java.util.concurrent.FutureTask$Sync.innerRun(Unknown Source)
   at java.util.concurrent.FutureTask.run(Unknown Source)
   at java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source)
   at java.util.concurrent.FutureTask$Sync.innerRun(Unknown Source)
   at java.util.concurrent.FutureTask.run(Unknown Source)
   at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
   at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
   at java.lang.Thread.run(Unknown Source)
Caused by: org.apache.solr.common.SolrException: 
org/apache/solr/util/plugin/SolrCoreAware

   at 

Solr 4.3.0 Error when sending a IsWithin Polygon query

2013-05-08 Thread solrnewbie
Hi,

I need help figuring out why I keep getting the error below.  I am running
the example store core using Solr 4.3.0 on CentOS.  When I use the Solr web
app (http://localhost:8983/solr) to issue the following query against the
example docs:

In the q edit box:
*:*

In the fq edit box:
store:"IsWithin(POLYGON((149.4023 -34.6072, 149.4023 -34.8690, 149.9022
-34.8690, 149.9022 -34.6072, 149.4023 -34.6072)))"

I get the following error when I click on Execute Query, which is also the
same error if I were to send the query as:

http://localhost:8983/solr/collection1/select?q=*:*&fq=store:%22IsWithin(POLYGON((149.4023%20-34.6072,%20149.4023%20-34.8690,%20149.9022%20-34.8690,%20149.9022%20-34.6072,%20149.4023%20-34.6072)))%22




<?xml version="1.0" encoding="UTF-8"?>
<response>

<lst name="responseHeader">
  <int name="status">500</int>
  <int name="QTime">4</int>
  <lst name="params">
    <str name="indent">true</str>
    <str name="q">*:*</str>
    <str name="_">1368060041286</str>
    <str name="wt">xml</str>
    <str name="fq">store:"IsWithin(POLYGON((149.4023 -34.6072, 149.4023
-34.8690, 149.9022 -34.8690, 149.9022 -34.6072, 149.4023 -34.6072)))"</str>
  </lst>
</lst>
<lst name="error">
  <str name="msg">For input string: "IsWithin(POLYGON((149.4023
-34.6072"</str>
  <str name="trace">java.lang.NumberFormatException: For input string:
"IsWithin(POLYGON((149.4023 -34.6072"
at 
sun.misc.FloatingDecimal.readJavaFormatString(FloatingDecimal.java:1242)
at java.lang.Double.parseDouble(Double.java:527)
at 
org.apache.solr.schema.TrieField.readableToIndexed(TrieField.java:396)
at org.apache.solr.schema.FieldType.getFieldQuery(FieldType.java:697)
at org.apache.solr.schema.TrieField.getFieldQuery(TrieField.java:353)
at org.apache.solr.schema.LatLonType.getFieldQuery(LatLonType.java:138)
at
org.apache.solr.parser.SolrQueryParserBase.getFieldQuery(SolrQueryParserBase.java:961)
at
org.apache.solr.parser.SolrQueryParserBase.getFieldQuery(SolrQueryParserBase.java:574)
at
org.apache.solr.parser.SolrQueryParserBase.handleQuotedTerm(SolrQueryParserBase.java:779)
at org.apache.solr.parser.QueryParser.Term(QueryParser.java:404)
at org.apache.solr.parser.QueryParser.Clause(QueryParser.java:186)
at org.apache.solr.parser.QueryParser.Query(QueryParser.java:108)
at org.apache.solr.parser.QueryParser.TopLevelQuery(QueryParser.java:97)
at
org.apache.solr.parser.SolrQueryParserBase.parse(SolrQueryParserBase.java:160)
at 
org.apache.solr.search.LuceneQParser.parse(LuceneQParserPlugin.java:72)
at org.apache.solr.search.QParser.getQuery(QParser.java:142)
at
org.apache.solr.handler.component.QueryComponent.prepare(QueryComponent.java:136)
at
org.apache.solr.handler.component.SearchHandler.handleRequestBody(SearchHandler.java:187)
at
org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:135)
at org.apache.solr.core.SolrCore.execute(SolrCore.java:1816)
at
org.apache.solr.servlet.SolrDispatchFilter.execute(SolrDispatchFilter.java:656)
at
org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:359)
at
org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:155)
at
org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1307)
at
org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:453)
at
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:137)
at
org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:560)
at
org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:231)
at
org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1072)
at
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:382)
at
org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:193)
at
org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1006)
at
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:135)
at
org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:255)
at
org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:154)
at
org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:116)
at org.eclipse.jetty.server.Server.handle(Server.java:365)
at
org.eclipse.jetty.server.AbstractHttpConnection.handleRequest(AbstractHttpConnection.java:485)
at
org.eclipse.jetty.server.BlockingHttpConnection.handleRequest(BlockingHttpConnection.java:53)
at
org.eclipse.jetty.server.AbstractHttpConnection.headerComplete(AbstractHttpConnection.java:926)
at
org.eclipse.jetty.server.AbstractHttpConnection$RequestHandler.headerComplete(AbstractHttpConnection.java:988)
at 
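
For reference, the trace shows LatLonType handing the whole IsWithin(...)
string to TrieField to parse as a coordinate; LatLonType only understands
plain "lat,lon" points, so the WKT never reaches a spatial query parser.
Polygon queries need a field based on SpatialRecursivePrefixTreeFieldType
instead. A minimal sketch for schema.xml, assuming the JTS jar is on the
classpath and using a hypothetical store_rpt field:

<!-- RPT field type; the JTS context factory is what enables WKT polygons -->
<fieldType name="location_rpt" class="solr.SpatialRecursivePrefixTreeFieldType"
    spatialContextFactory="com.spatial4j.core.context.jts.JtsSpatialContextFactory"
    geo="true" distErrPct="0.025" maxDistErr="0.000009" units="degrees"/>
<field name="store_rpt" type="location_rpt" indexed="true" stored="true"/>

Against such a field, the filter above should be accepted as written:

fq=store_rpt:"IsWithin(POLYGON((149.4023 -34.6072, 149.4023 -34.8690,
149.9022 -34.8690, 149.9022 -34.6072, 149.4023 -34.6072)))"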

Re: Indexing 4 different cores same machine

2013-05-08 Thread Shawn Heisey
On 5/8/2013 8:12 AM, marotosg wrote:
 Hi,
 
 I have 4 different cores on the same machine:
 Person core    - 3 million docs - 20 GB size
 Company core   - 1 million docs - 2 GB size
 Documents core - 5 million docs - 5 GB size
 Emails core    - 50,000 docs    - 200 MB size
 
 While I am indexing data, performance on the server is almost the same
 whether I am indexing only one core or all cores at the same time.
 
 I thought having different cores would let me run different indexing
 threads in parallel, gaining some performance. Am I right? My server
 never reaches 100% CPU use; it is always at about 50% or even less.
 I had a look at I/O and it is not a problem.

You say that I/O performance appears to be good, but I/O is still likely
the bottleneck here.  When you index the cores sequentially, each one has
access to the full I/O resources, so each one goes at top speed.  If you
index them all at the same time, they compete for I/O resources, so one
can do its thing while the others wait until the I/O scheduler can work
on their requests.

In most cases, Solr is I/O bound, and the fact that it takes the same
amount of time either way is additional support for the idea that you
are limited by I/O resources, not CPU resources.  Your I/O system is
keeping up, which is good.  If it weren't keeping up, parallel indexing
would actually take even longer.
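
One quick way to confirm that, assuming the sysstat package is installed,
is to watch "iostat -x 5" on the index volume while an import runs: a
%util column pinned near 100 with climbing await times means the disks,
not the CPUs, are the limiting factor.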

Thanks,
Shawn



Re: Oracle Timestamp in SOLR

2013-05-08 Thread Chris Hostetter

: I have a field with the type TIMESTAMP(6) in an Oracle view.
...
: What is the best way to import it?
...
: This way works but I do not know if this is the best practice:
... 
:   TO_CHAR(LAST_ACTION_TIMESTAMP, 'YYYY-MM-DD HH24:MI:SS') as LAT

Instead of having your DB convert to a string and then forcing DIH to
parse that string, try asking your DB to cast to something that JDBC will
hand back as a Date object when DIH fetches the results.

I don't know much about Oracle, but perhaps something like...

SELECT ... CAST(LAST_ACTION_TIMESTAMP AS DATE) AS LAT
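
Wired into the DIH config, that would look something like the following
(a sketch only, with a hypothetical view name and Solr field name, and
assuming the Solr side is a date type such as solr.TrieDateField):

<entity name="item"
        query="SELECT ID, CAST(LAST_ACTION_TIMESTAMP AS DATE) AS LAT
               FROM MY_VIEW">
  <field column="LAT" name="last_action" />
</entity>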


-Hoss


Re: Indexing Point Number

2013-05-08 Thread Jack Krupansky

I presume you meant to substitute the field name, pattern, and replacement
for this case:

<processor class="solr.RegexReplaceProcessorFactory">
  <str name="fieldName">price</str>
  <str name="pattern">,</str>
  <str name="replacement">.</str>
</processor>
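
For the replacement to take effect, the processor has to sit in a chain
that ends with RunUpdateProcessorFactory, and the chain has to be
referenced from the update handler. A sketch for solrconfig.xml, using a
hypothetical chain name:

<updateRequestProcessorChain name="replace-comma">
  <processor class="solr.RegexReplaceProcessorFactory">
    <str name="fieldName">price</str>
    <str name="pattern">,</str>
    <str name="replacement">.</str>
  </processor>
  <processor class="solr.LogUpdateProcessorFactory" />
  <processor class="solr.RunUpdateProcessorFactory" />
</updateRequestProcessorChain>

<requestHandler name="/update" class="solr.UpdateRequestHandler">
  <lst name="defaults">
    <str name="update.chain">replace-comma</str>
  </lst>
</requestHandler>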

-- Jack Krupansky

-Original Message- 
From: Upayavira 
Sent: Wednesday, May 08, 2013 6:32 AM 
To: solr-user@lucene.apache.org 
Subject: Re: Indexing Point Number 


You could use a RegexReplaceProcessor in an update processor chain. From
the Javadoc:

<processor class="solr.RegexReplaceProcessorFactory">
  <str name="fieldName">content</str>
  <str name="fieldName">title</str>
  <str name="pattern">\s+</str>
  <str name="replacement"> </str>
</processor>

This could replace the comma with a dot before it gets to be indexed.

Upayavira

On Wed, May 8, 2013, at 10:28 AM, Gora Mohanty wrote:

On 8 May 2013 14:48, be...@bkern.de be...@bkern.de wrote:
 I want to index, for example:
 <field name="price">19,95</field>
 <field name="price">25,45</field>

 I can only index float numbers that use dots.

I don't think that it is currently possible to change the decimal
separator. You should replace ',' with '.' during indexing and
searching, which should be fairly easy.

Regards,
Gora


Re: Search identifier fields containing blanks

2013-05-08 Thread Jack Krupansky
Geez, at this point, why not just escape the space with a backslash instead 
of all that extra cruft:


q=+location:bookshelf myFieldName:G\ 23/60\ 12

or

q=myFieldName:G\ 23/60\ 12 +location:bookshelf
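
If you're pasting either of those into a browser, the backslashes, spaces,
and "+" signs need URL escaping as well; for example, the second form
becomes:

/select?q=myFieldName%3AG%5C%2023%2F60%5C%2012%20%2Blocation%3Abookshelf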

-- Jack Krupansky

-Original Message- 
From: Upayavira

Sent: Wednesday, May 08, 2013 6:30 AM
To: solr-user@lucene.apache.org
Subject: Re: Search identifier fields containing blanks

If you're using the latest Solr, then you should be able to do it the
other way around:

q=+location:bookshelf {!term f=myFieldName}G 23/60 12

You might also find the trick I mentioned before useful:

q=+location:bookshelf {!term f=myFieldName v=$productCode}&productCode=G
23/60 12

Upayavira

On Wed, May 8, 2013, at 11:19 AM, Silvio Hermann wrote:

that worked like a charm, but what must I do if I want an additional field
to match, e.g.



Best,

Silvio



On 05/08/2013 03:07 AM, Chris Hostetter wrote:

 : I am about to index identifier fields containing blanks (shelfmarks)
 : eg. G 23/60 12
 : The field type is set to Solr.string. To get the exact matching hit
 : (the doc with shelfmark mentioned above) the user must quote the search
 : term. Is there a way to omit the quotes?

 whitespace has to be quoted when using the lucene QParser because it's a
 semantically significant character that means "end of boolean query clause"

 if you want to search for a literal string w/o needing any escaping, use
 the term QParser...

 {!term f=yourFieldName}G 23/60 12

 Of course, if you are putting this in a URL (ie: testing in a browser) it
 still needs to be URL escaped...

 /select?q={!term+f=yourFieldName}G+23/60+12


 -Hoss


--
Silvio Hermann
Friedrich-Schiller-Universität Jena
Thüringer Universitäts- und Landesbibliothek
Bibliotheksplatz 2
07743 Jena
Phone: +49 3641 940019
FAX:   +49 3641 940022

http://www.historische-bestaende.de 




Per Shard Replication Factor

2013-05-08 Thread Steven Bower
Is it currently possible to have per-shard replication factor?

A bit of background on the use case...

If you are hashing content to shards by a known factor (let's say date
ranges, 12 shards, 1 per month), it might be the case that most of your
search traffic is directed to one particular shard (e.g. the current
month's shard), and having increased query capacity on that shard would be
useful... this could be extended to many other use cases, such as data
hashed by organization, type, etc.

Thanks,

steve