Hudson build is back to normal : Lucene-trunk #1279

2010-09-05 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Lucene-trunk/1279/changes



-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



Build failed in Hudson: Solr-3.x #94

2010-09-05 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Solr-3.x/94/changes

Changes:

[koji] fix broken javadoc link to lucene

[rmuir] add super.setup/teardown

[mikemccand] LUCENE-2631: fix small perf issues with String/TermOrdValComparator

[mikemccand] LUCENE-2598: more cutover to newDirectory(Random)

[mikemccand] merge props

--
[...truncated 6149 lines...]
[junit] junit.framework.AssertionFailedError: expected:<0> but was:<5>
[junit] at 
org.apache.lucene.util.LuceneTestCaseJ4$LuceneTestCaseRunner.runChild(LuceneTestCaseJ4.java:624)
[junit] at 
org.apache.lucene.util.LuceneTestCaseJ4$LuceneTestCaseRunner.runChild(LuceneTestCaseJ4.java:619)
[junit] at 
org.apache.solr.handler.TestReplicationHandler.clearIndexWithReplication(TestReplicationHandler.java:92)
[junit] at 
org.apache.solr.handler.TestReplicationHandler.testIndexAndConfigReplication(TestReplicationHandler.java:230)
[junit] 
[junit] 
[junit] Testcase: 
testStopPoll(org.apache.solr.handler.TestReplicationHandler): FAILED
[junit] expected:<497> but was:<5>
[junit] junit.framework.AssertionFailedError: expected:<497> but was:<5>
[junit] at 
org.apache.lucene.util.LuceneTestCaseJ4$LuceneTestCaseRunner.runChild(LuceneTestCaseJ4.java:624)
[junit] at 
org.apache.lucene.util.LuceneTestCaseJ4$LuceneTestCaseRunner.runChild(LuceneTestCaseJ4.java:619)
[junit] at 
org.apache.solr.handler.TestReplicationHandler.testStopPoll(TestReplicationHandler.java:303)
[junit] 
[junit] 
[junit] Tests run: 7, Failures: 3, Errors: 0, Time elapsed: 145.865 sec
[junit] 
[junit] - Standard Output ---
[junit] NOTE: random locale of testcase 'testReplicateAfterWrite2Slave' 
was: zh_CN
[junit] NOTE: random timezone of testcase 'testReplicateAfterWrite2Slave' 
was: America/Yakutat
[junit] NOTE: random locale of testcase 'testIndexAndConfigReplication' 
was: zh_CN
[junit] NOTE: random timezone of testcase 'testIndexAndConfigReplication' 
was: America/Yakutat
[junit] NOTE: random locale of testcase 'testStopPoll' was: zh_CN
[junit] NOTE: random timezone of testcase 'testStopPoll' was: 
America/Yakutat
[junit] -  ---
[junit] TEST org.apache.solr.handler.TestReplicationHandler FAILED
[junit] Testsuite: org.apache.solr.handler.XmlUpdateRequestHandlerTest
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.813 sec
[junit] 
[junit] Testsuite: org.apache.solr.handler.admin.LukeRequestHandlerTest
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.796 sec
[junit] 
[junit] Testsuite: org.apache.solr.handler.admin.SystemInfoHandlerTest
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.005 sec
[junit] 
[junit] Testsuite: 
org.apache.solr.handler.component.DistributedSpellCheckComponentTest
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 12.141 sec
[junit] 
[junit] Testsuite: 
org.apache.solr.handler.component.DistributedTermsComponentTest
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 10.329 sec
[junit] 
[junit] Testsuite: 
org.apache.solr.handler.component.QueryElevationComponentTest
[junit] Tests run: 4, Failures: 0, Errors: 0, Time elapsed: 1.001 sec
[junit] 
[junit] Testsuite: org.apache.solr.handler.component.SearchHandlerTest
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.763 sec
[junit] 
[junit] Testsuite: org.apache.solr.handler.component.SpellCheckComponentTest
[junit] Tests run: 10, Failures: 0, Errors: 0, Time elapsed: 1.153 sec
[junit] 
[junit] Testsuite: org.apache.solr.handler.component.StatsComponentTest
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 3.227 sec
[junit] 
[junit] Testsuite: org.apache.solr.handler.component.TermVectorComponentTest
[junit] Tests run: 5, Failures: 0, Errors: 0, Time elapsed: 0.875 sec
[junit] 
[junit] Testsuite: org.apache.solr.handler.component.TermsComponentTest
[junit] Tests run: 13, Failures: 0, Errors: 0, Time elapsed: 1.012 sec
[junit] 
[junit] Testsuite: org.apache.solr.highlight.FastVectorHighlighterTest
[junit] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 0.813 sec
[junit] 
[junit] Testsuite: org.apache.solr.highlight.HighlighterConfigTest
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.748 sec
[junit] 
[junit] Testsuite: org.apache.solr.highlight.HighlighterTest
[junit] Tests run: 23, Failures: 0, Errors: 0, Time elapsed: 2.26 sec
[junit] 
[junit] Testsuite: org.apache.solr.request.JSONWriterTest
[junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 0.799 sec
[junit] 
[junit] Testsuite: org.apache.solr.request.SimpleFacetsTest
[junit] Tests run: 7, Failures: 0, Errors: 0, Time elapsed: 4.791 sec
[junit] 
[junit] Testsuite: 

Build failed in Hudson: Solr-trunk #1238

2010-09-05 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Solr-trunk/1238/changes

Changes:

[koji] fix broken javadoc link to lucene

[rmuir] add super.setup/teardown

[mikemccand] LUCENE-2631: fix small perf issues with String/TermOrdValComparator

[mikemccand] LUCENE-2598: more cutover to newDirectory(Random)

--
[...truncated 6206 lines...]
[junit] 
[junit] Testcase: 
testStopPoll(org.apache.solr.handler.TestReplicationHandler): FAILED
[junit] expected:<497> but was:<5>
[junit] junit.framework.AssertionFailedError: expected:<497> but was:<5>
[junit] at 
org.apache.lucene.util.LuceneTestCaseJ4$LuceneTestCaseRunner.runChild(LuceneTestCaseJ4.java:744)
[junit] at 
org.apache.lucene.util.LuceneTestCaseJ4$LuceneTestCaseRunner.runChild(LuceneTestCaseJ4.java:739)
[junit] at 
org.apache.solr.handler.TestReplicationHandler.testStopPoll(TestReplicationHandler.java:303)
[junit] 
[junit] 
[junit] Tests run: 7, Failures: 3, Errors: 0, Time elapsed: 141.65 sec
[junit] 
[junit] - Standard Output ---
[junit] NOTE: random codec of testcase 'testReplicateAfterWrite2Slave' was: 
MockSep
[junit] NOTE: random locale of testcase 'testReplicateAfterWrite2Slave' 
was: ms_MY
[junit] NOTE: random timezone of testcase 'testReplicateAfterWrite2Slave' 
was: Asia/Singapore
[junit] NOTE: random codec of testcase 'testIndexAndConfigReplication' was: 
MockSep
[junit] NOTE: random locale of testcase 'testIndexAndConfigReplication' 
was: ms_MY
[junit] NOTE: random timezone of testcase 'testIndexAndConfigReplication' 
was: Asia/Singapore
[junit] NOTE: random codec of testcase 'testStopPoll' was: MockSep
[junit] NOTE: random locale of testcase 'testStopPoll' was: ms_MY
[junit] NOTE: random timezone of testcase 'testStopPoll' was: Asia/Singapore
[junit] -  ---
[junit] TEST org.apache.solr.handler.TestReplicationHandler FAILED
[junit] Testsuite: org.apache.solr.handler.XmlUpdateRequestHandlerTest
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.631 sec
[junit] 
[junit] Testsuite: org.apache.solr.handler.admin.LukeRequestHandlerTest
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.694 sec
[junit] 
[junit] Testsuite: org.apache.solr.handler.admin.SystemInfoHandlerTest
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.004 sec
[junit] 
[junit] Testsuite: org.apache.solr.handler.component.DebugComponentTest
[junit] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 0.926 sec
[junit] 
[junit] Testsuite: 
org.apache.solr.handler.component.DistributedDebugComponentTest
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 9.112 sec
[junit] 
[junit] Testsuite: 
org.apache.solr.handler.component.DistributedSpellCheckComponentTest
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 10.684 sec
[junit] 
[junit] Testsuite: 
org.apache.solr.handler.component.DistributedTermsComponentTest
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 9.176 sec
[junit] 
[junit] Testsuite: 
org.apache.solr.handler.component.QueryElevationComponentTest
[junit] Tests run: 4, Failures: 0, Errors: 0, Time elapsed: 0.901 sec
[junit] 
[junit] Testsuite: org.apache.solr.handler.component.SearchHandlerTest
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.629 sec
[junit] 
[junit] Testsuite: org.apache.solr.handler.component.SpellCheckComponentTest
[junit] Tests run: 10, Failures: 0, Errors: 0, Time elapsed: 1.022 sec
[junit] 
[junit] Testsuite: org.apache.solr.handler.component.StatsComponentTest
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 2.802 sec
[junit] 
[junit] Testsuite: org.apache.solr.handler.component.TermVectorComponentTest
[junit] Tests run: 5, Failures: 0, Errors: 0, Time elapsed: 0.719 sec
[junit] 
[junit] Testsuite: org.apache.solr.handler.component.TermsComponentTest
[junit] Tests run: 13, Failures: 0, Errors: 0, Time elapsed: 0.818 sec
[junit] 
[junit] Testsuite: org.apache.solr.highlight.FastVectorHighlighterTest
[junit] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 0.7 sec
[junit] 
[junit] Testsuite: org.apache.solr.highlight.HighlighterConfigTest
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.704 sec
[junit] 
[junit] Testsuite: org.apache.solr.highlight.HighlighterTest
[junit] Tests run: 23, Failures: 0, Errors: 0, Time elapsed: 1.897 sec
[junit] 
[junit] Testsuite: org.apache.solr.request.JSONWriterTest
[junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 0.678 sec
[junit] 
[junit] Testsuite: org.apache.solr.request.SimpleFacetsTest
[junit] Tests run: 22, Failures: 0, Errors: 0, Time elapsed: 6.127 sec
[junit] 
[junit] Testsuite: 

RE: Build failed in Hudson: Solr-trunk #1238

2010-09-05 Thread Uwe Schindler
This whole test is somehow broken, as it assumes that the class files are real 
files in the file system. That may not be the case, e.g. when they sit in JAR 
files or in a web application. The test should read the files from the 
ClassLoader instead of using file system functions. We already had a problem 
with this test because of the whitespace in my username on my Windows account 
(*g*); I fixed that. But altogether I would remove the test completely and 
replace it with a better test that uses classloader functions to inspect the 
classpath.

 

As far as I know, there are classloader methods to find all classes in one 
package (which is similar to a files.list() plus endsWith(".class")).
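
A rough, untested Java sketch of that idea (it only covers the directory case; 
a complete version would also walk "jar:" URLs via JarFile entries):

import java.io.File;
import java.net.URL;
import java.util.ArrayList;
import java.util.Enumeration;
import java.util.List;

public class PackageScanner {
  // Enumerate the classes of a package via the ClassLoader instead of
  // assuming they live as plain files on disk.
  public static List<Class<?>> classesInPackage(String pckgname) throws Exception {
    List<Class<?>> classes = new ArrayList<Class<?>>();
    ClassLoader loader = Thread.currentThread().getContextClassLoader();
    Enumeration<URL> resources = loader.getResources(pckgname.replace('.', '/'));
    while (resources.hasMoreElements()) {
      URL url = resources.nextElement();
      if ("file".equals(url.getProtocol())) {
        String[] files = new File(url.toURI()).list();
        if (files == null) continue;
        for (String file : files) {
          if (file.endsWith(".class")) {
            classes.add(Class.forName(pckgname + '.' + file.substring(0, file.length() - 6)));
          }
        }
      }
      // "jar:" URLs would be handled here by iterating the JarFile entries.
    }
    return classes;
  }
}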

 

Uwe

 

-

Uwe Schindler

H.-H.-Meier-Allee 63, D-28213 Bremen

http://www.thetaphi.de

eMail: u...@thetaphi.de

 

From: Robert Muir [mailto:rcm...@gmail.com] 
Sent: Sunday, September 05, 2010 2:37 PM
To: dev@lucene.apache.org
Subject: Re: Build failed in Hudson: Solr-trunk #1238

 

I think this is a real bug; you can reproduce it with 'ant test 
-Dtests.threadspercpu=0' to force all tests to run in a single JVM.

 

The problem is some static (or similar) state in ReplicationHandler; the 
following will work around it, but I can't find the static:

 

Index: solr/src/test/org/apache/solr/SolrInfoMBeanTest.java
===================================================================
--- solr/src/test/org/apache/solr/SolrInfoMBeanTest.java   (revision 992411)
+++ solr/src/test/org/apache/solr/SolrInfoMBeanTest.java   (working copy)
@@ -97,7 +97,7 @@
     if (directory.exists()) {
       String[] files = directory.list();
       for (String file : files) {
-        if (file.endsWith(".class")) {
+        if (file.endsWith(".class") && !file.contains("ReplicationHandler")) {
          classes.add(Class.forName(pckgname + '.' + file.substring(0, file.length() - 6)));
         }
       }

 

 

On Sun, Sep 5, 2010 at 4:21 AM, Apache Hudson Server hud...@hudson.apache.org 
wrote:

See https://hudson.apache.org/hudson/job/Solr-trunk/1238/changes

Changes:

[koji] fix broken javadoc link to lucene

[rmuir] add super.setup/teardown

[mikemccand] LUCENE-2631: fix small perf issues with String/TermOrdValComparator

[mikemccand] LUCENE-2598: more cutover to newDirectory(Random)

--
[...truncated 6206 lines...]
   [junit]
   [junit] Testcase: 
testStopPoll(org.apache.solr.handler.TestReplicationHandler): FAILED
   [junit] expected:<497> but was:<5>
   [junit] junit.framework.AssertionFailedError: expected:<497> but was:<5>
   [junit] at 
org.apache.lucene.util.LuceneTestCaseJ4$LuceneTestCaseRunner.runChild(LuceneTestCaseJ4.java:744)
   [junit] at 
org.apache.lucene.util.LuceneTestCaseJ4$LuceneTestCaseRunner.runChild(LuceneTestCaseJ4.java:739)
   [junit] at 
org.apache.solr.handler.TestReplicationHandler.testStopPoll(TestReplicationHandler.java:303)
   [junit]
   [junit]
   [junit] Tests run: 7, Failures: 3, Errors: 0, Time elapsed: 141.65 sec
   [junit]
   [junit] - Standard Output ---
   [junit] NOTE: random codec of testcase 'testReplicateAfterWrite2Slave' was: 
MockSep
   [junit] NOTE: random locale of testcase 'testReplicateAfterWrite2Slave' was: 
ms_MY
   [junit] NOTE: random timezone of testcase 'testReplicateAfterWrite2Slave' 
was: Asia/Singapore
   [junit] NOTE: random codec of testcase 'testIndexAndConfigReplication' was: 
MockSep
   [junit] NOTE: random locale of testcase 'testIndexAndConfigReplication' was: 
ms_MY
   [junit] NOTE: random timezone of testcase 'testIndexAndConfigReplication' 
was: Asia/Singapore
   [junit] NOTE: random codec of testcase 'testStopPoll' was: MockSep
   [junit] NOTE: random locale of testcase 'testStopPoll' was: ms_MY
   [junit] NOTE: random timezone of testcase 'testStopPoll' was: Asia/Singapore
   [junit] -  ---
   [junit] TEST org.apache.solr.handler.TestReplicationHandler FAILED
   [junit] Testsuite: org.apache.solr.handler.XmlUpdateRequestHandlerTest
   [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.631 sec
   [junit]
   [junit] Testsuite: org.apache.solr.handler.admin.LukeRequestHandlerTest
   [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.694 sec
   [junit]
   [junit] Testsuite: org.apache.solr.handler.admin.SystemInfoHandlerTest
   [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.004 sec
   [junit]
   [junit] Testsuite: org.apache.solr.handler.component.DebugComponentTest
   [junit] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 0.926 sec
   [junit]
   [junit] Testsuite: 
org.apache.solr.handler.component.DistributedDebugComponentTest
   [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 9.112 sec
   [junit]
   [junit] Testsuite: 
org.apache.solr.handler.component.DistributedSpellCheckComponentTest
   [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 10.684 sec
   [junit]
   

[jira] Created: (SOLR-2102) JdbcDataSource convertType attribute is not working with implicit fields

2010-09-05 Thread Alexey Serba (JIRA)
JdbcDataSource convertType attribute is not working with implicit fields


 Key: SOLR-2102
 URL: https://issues.apache.org/jira/browse/SOLR-2102
 Project: Solr
  Issue Type: Bug
  Components: contrib - DataImportHandler
Affects Versions: 1.4.1
Reporter: Alexey Serba


JdbcDataSource convertType attribute does not take effect on implicit fields 
(fields that are not listed in the [field declaration 
section|http://wiki.apache.org/solr/DataImportHandler#Field_declarations]).

For example, you might have the following configuration:

{noformat:title=dataconfig}
<?xml version="1.0" encoding="UTF-8"?>
<dataConfig>
  <dataSource batchSize="-1" convertType="true" driver="com.mysql.jdbc.Driver"
    password="pass" url="jdbc:mysql://localhost/test" user="root"/>
  <document name="items">
    <entity query="SELECT title, body, tm FROM articles order by title desc">
    </entity>
  </document>
</dataConfig>
{noformat}

where
* tm is a timestamp in the mysql database
* tm is a date in schema.xml

Because field _tm_ is not explicitly listed in the field declarations, the 
_convertType_ attribute does not take effect and as a result you get the 
following exception:

{noformat:title=convertType exception}
Sep 6, 2010 2:22:09 AM org.apache.solr.handler.dataimport.SolrWriter upload
WARNING: Error creating document : SolrInputDocument[{body=body(1.0)={Apache 
Lucene is a free/open source information retrieval software library, originally 
created in Java by Doug Cutting.}, tm=tm(1.0)={2010-09-06 02:06:25.0}, 
title=title(1.0)={Lucene}}]
org.apache.solr.common.SolrException: Error while creating field 
'tm{type=date,default=NOW,properties=indexed,stored,omitNorms,sortMissingLast}' 
from value '2010-09-06 02:06:25.0'
at org.apache.solr.schema.FieldType.createField(FieldType.java:242)
at org.apache.solr.schema.SchemaField.createField(SchemaField.java:94)
at 
org.apache.solr.update.DocumentBuilder.addField(DocumentBuilder.java:204)
at 
org.apache.solr.update.DocumentBuilder.toDocument(DocumentBuilder.java:277)
at 
org.apache.solr.update.processor.RunUpdateProcessor.processAdd(RunUpdateProcessorFactory.java:60)
at 
org.apache.solr.handler.dataimport.SolrWriter.upload(SolrWriter.java:75)
at 
org.apache.solr.handler.dataimport.DataImportHandler$1.upload(DataImportHandler.java:292)
at 
org.apache.solr.handler.dataimport.DocBuilder.buildDocument(DocBuilder.java:618)
at 
org.apache.solr.handler.dataimport.DocBuilder.doFullDump(DocBuilder.java:260)
at 
org.apache.solr.handler.dataimport.DocBuilder.execute(DocBuilder.java:184)
at 
org.apache.solr.handler.dataimport.DataImporter.doFullImport(DataImporter.java:334)
at 
org.apache.solr.handler.dataimport.DataImporter.runCmd(DataImporter.java:392)
at 
org.apache.solr.handler.dataimport.DataImporter$1.run(DataImporter.java:373)
Caused by: org.apache.solr.common.SolrException: Invalid Date 
String:'2010-09-06 02:06:25.0'
at org.apache.solr.schema.DateField.parseMath(DateField.java:166)
at org.apache.solr.schema.DateField.toInternal(DateField.java:136)
at org.apache.solr.schema.FieldType.createField(FieldType.java:240)
... 12 more
{noformat}
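
A possible workaround implied by the description above (untested sketch): 
declare the field explicitly in the entity, so it is no longer implicit and 
convertType can apply to it.

{noformat:title=explicit field declaration (workaround sketch)}
<entity query="SELECT title, body, tm FROM articles order by title desc">
  <field column="tm" name="tm"/>
</entity>
{noformat}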

-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.


-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] Updated: (SOLR-2102) JdbcDataSource convertType attribute is not working with implicit fields

2010-09-05 Thread Alexey Serba (JIRA)

 [ 
https://issues.apache.org/jira/browse/SOLR-2102?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Alexey Serba updated SOLR-2102:
---

Attachment: SOLR-2102.patch

 JdbcDataSource convertType attribute is not working with implicit fields
 

 Key: SOLR-2102
 URL: https://issues.apache.org/jira/browse/SOLR-2102
 Project: Solr
  Issue Type: Bug
  Components: contrib - DataImportHandler
Affects Versions: 1.4.1
Reporter: Alexey Serba
 Attachments: SOLR-2102.patch


 JdbcDataSource convertType attribute doesn't take any effect on implicit 
 fields ( fields that are not listed in [field declaration 
 section|http://wiki.apache.org/solr/DataImportHandler#Field_declarations] ). 
 For example you might have the following configuration:
 {noformat:title=dataconfig}
 <?xml version="1.0" encoding="UTF-8"?>
 <dataConfig>
   <dataSource batchSize="-1" convertType="true" driver="com.mysql.jdbc.Driver"
     password="pass" url="jdbc:mysql://localhost/test" user="root"/>
   <document name="items">
     <entity query="SELECT title, body, tm FROM articles order by title desc">
     </entity>
   </document>
 </dataConfig>
 {noformat}
 where 
 * tm is timestamp in mysql database
 * tm is date in schema.xml
 Because field _tm_ is not explicitly stated in fields declaration 
 _convertType_ attribute doesn't take any effect and as a result you would get 
 the following exception:
 {noformat:title=convertType exception}
 Sep 6, 2010 2:22:09 AM org.apache.solr.handler.dataimport.SolrWriter upload
 WARNING: Error creating document : SolrInputDocument[{body=body(1.0)={Apache 
 Lucene is a free/open source information retrieval software library, 
 originally created in Java by Doug Cutting.}, tm=tm(1.0)={2010-09-06 
 02:06:25.0}, title=title(1.0)={Lucene}}]
 org.apache.solr.common.SolrException: Error while creating field 
 'tm{type=date,default=NOW,properties=indexed,stored,omitNorms,sortMissingLast}'
  from value '2010-09-06 02:06:25.0'
 at org.apache.solr.schema.FieldType.createField(FieldType.java:242)
 at org.apache.solr.schema.SchemaField.createField(SchemaField.java:94)
 at 
 org.apache.solr.update.DocumentBuilder.addField(DocumentBuilder.java:204)
 at 
 org.apache.solr.update.DocumentBuilder.toDocument(DocumentBuilder.java:277)
 at 
 org.apache.solr.update.processor.RunUpdateProcessor.processAdd(RunUpdateProcessorFactory.java:60)
 at 
 org.apache.solr.handler.dataimport.SolrWriter.upload(SolrWriter.java:75)
 at 
 org.apache.solr.handler.dataimport.DataImportHandler$1.upload(DataImportHandler.java:292)
 at 
 org.apache.solr.handler.dataimport.DocBuilder.buildDocument(DocBuilder.java:618)
 at 
 org.apache.solr.handler.dataimport.DocBuilder.doFullDump(DocBuilder.java:260)
 at 
 org.apache.solr.handler.dataimport.DocBuilder.execute(DocBuilder.java:184)
 at 
 org.apache.solr.handler.dataimport.DataImporter.doFullImport(DataImporter.java:334)
 at 
 org.apache.solr.handler.dataimport.DataImporter.runCmd(DataImporter.java:392)
 at 
 org.apache.solr.handler.dataimport.DataImporter$1.run(DataImporter.java:373)
 Caused by: org.apache.solr.common.SolrException: Invalid Date 
 String:'2010-09-06 02:06:25.0'
 at org.apache.solr.schema.DateField.parseMath(DateField.java:166)
 at org.apache.solr.schema.DateField.toInternal(DateField.java:136)
 at org.apache.solr.schema.FieldType.createField(FieldType.java:240)
 ... 12 more
 {noformat}

-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.


-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] Commented: (SOLR-2102) JdbcDataSource convertType attribute is not working with implicit fields

2010-09-05 Thread Lance Norskog (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-2102?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=12906405#action_12906405
 ] 

Lance Norskog commented on SOLR-2102:
-

The (equals 'int' or 'sint') etc. code dates from 1.3. You might wish to update 
these clauses to include the 1.4 'pint'/'tint' etc. versions, and also 
date/sdate/pdate/tdate.

This code does not support wildcarded field names, which is OK. 

Lance

 JdbcDataSource convertType attribute is not working with implicit fields
 

 Key: SOLR-2102
 URL: https://issues.apache.org/jira/browse/SOLR-2102
 Project: Solr
  Issue Type: Bug
  Components: contrib - DataImportHandler
Affects Versions: 1.4.1
Reporter: Alexey Serba
 Attachments: SOLR-2102.patch


 JdbcDataSource convertType attribute doesn't take any effect on implicit 
 fields ( fields that are not listed in [field declaration 
 section|http://wiki.apache.org/solr/DataImportHandler#Field_declarations] ). 
 For example you might have the following configuration:
 {noformat:title=dataconfig}
 <?xml version="1.0" encoding="UTF-8"?>
 <dataConfig>
   <dataSource batchSize="-1" convertType="true" driver="com.mysql.jdbc.Driver"
     password="pass" url="jdbc:mysql://localhost/test" user="root"/>
   <document name="items">
     <entity query="SELECT title, body, tm FROM articles order by title desc">
     </entity>
   </document>
 </dataConfig>
 {noformat}
 where 
 * tm is timestamp in mysql database
 * tm is date in schema.xml
 Because field _tm_ is not explicitly stated in fields declaration 
 _convertType_ attribute doesn't take any effect and as a result you would get 
 the following exception:
 {noformat:title=convertType exception}
 Sep 6, 2010 2:22:09 AM org.apache.solr.handler.dataimport.SolrWriter upload
 WARNING: Error creating document : SolrInputDocument[{body=body(1.0)={Apache 
 Lucene is a free/open source information retrieval software library, 
 originally created in Java by Doug Cutting.}, tm=tm(1.0)={2010-09-06 
 02:06:25.0}, title=title(1.0)={Lucene}}]
 org.apache.solr.common.SolrException: Error while creating field 
 'tm{type=date,default=NOW,properties=indexed,stored,omitNorms,sortMissingLast}'
  from value '2010-09-06 02:06:25.0'
 at org.apache.solr.schema.FieldType.createField(FieldType.java:242)
 at org.apache.solr.schema.SchemaField.createField(SchemaField.java:94)
 at 
 org.apache.solr.update.DocumentBuilder.addField(DocumentBuilder.java:204)
 at 
 org.apache.solr.update.DocumentBuilder.toDocument(DocumentBuilder.java:277)
 at 
 org.apache.solr.update.processor.RunUpdateProcessor.processAdd(RunUpdateProcessorFactory.java:60)
 at 
 org.apache.solr.handler.dataimport.SolrWriter.upload(SolrWriter.java:75)
 at 
 org.apache.solr.handler.dataimport.DataImportHandler$1.upload(DataImportHandler.java:292)
 at 
 org.apache.solr.handler.dataimport.DocBuilder.buildDocument(DocBuilder.java:618)
 at 
 org.apache.solr.handler.dataimport.DocBuilder.doFullDump(DocBuilder.java:260)
 at 
 org.apache.solr.handler.dataimport.DocBuilder.execute(DocBuilder.java:184)
 at 
 org.apache.solr.handler.dataimport.DataImporter.doFullImport(DataImporter.java:334)
 at 
 org.apache.solr.handler.dataimport.DataImporter.runCmd(DataImporter.java:392)
 at 
 org.apache.solr.handler.dataimport.DataImporter$1.run(DataImporter.java:373)
 Caused by: org.apache.solr.common.SolrException: Invalid Date 
 String:'2010-09-06 02:06:25.0'
 at org.apache.solr.schema.DateField.parseMath(DateField.java:166)
 at org.apache.solr.schema.DateField.toInternal(DateField.java:136)
 at org.apache.solr.schema.FieldType.createField(FieldType.java:240)
 ... 12 more
 {noformat}

-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.


-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] Resolved: (SOLR-2099) Add ability to throttle rsync based replication using rsync option --bwlimit

2010-09-05 Thread Koji Sekiguchi (JIRA)

 [ 
https://issues.apache.org/jira/browse/SOLR-2099?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Koji Sekiguchi resolved SOLR-2099.
--

Resolution: Fixed

trunk: Committed revision 992913.
branch_3x: Committed revision 992915.

 Add ability to throttle rsync based replication using rsync option --bwlimit
 

 Key: SOLR-2099
 URL: https://issues.apache.org/jira/browse/SOLR-2099
 Project: Solr
  Issue Type: Improvement
  Components: replication (scripts)
Affects Versions: 1.4.1
 Environment: RHEL 5.x
Reporter: Brandon Evans
Assignee: Koji Sekiguchi
Priority: Trivial
 Fix For: 3.1, 4.0

 Attachments: solr-1.4.0-rsyncd_bwlimit.patch

   Original Estimate: 24h
  Remaining Estimate: 24h

 This patch allows for the use of the new option 'rsyncd_bwlimit' in 
 scripts.conf.
 rsyncd_bwlimit adds --bwlimit=XX to the rsyncd command line. If the option is 
 not specified, bwlimit is set to 0 by default.
 --bwlimit=KBPS
   This option allows you to specify a maximum transfer rate in kilobytes per 
 second. This option is most effective when using rsync with large files 
 (several megabytes and up). Due to the nature of rsync transfers, blocks of 
 data are sent, then if rsync determines the transfer was too fast, it will 
 wait before sending the next data block. The result is an average transfer 
 rate equaling the specified limit. A value of zero specifies no limit.
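
For illustration, a hedged sketch of how the option would be used (the option 
name comes from the patch description; the value is only an example):

{noformat:title=scripts.conf (sketch)}
# cap rsyncd transfers at roughly 2 MB/s; 0 (the default) means no limit
rsyncd_bwlimit=2048
{noformat}

With this setting, the rsyncd start script would add --bwlimit=2048 to the 
rsync command line.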

-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.


-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] Commented: (SOLR-2100) Fix for saving commit points during java-based backups

2010-09-05 Thread Yonik Seeley (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-2100?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=12906408#action_12906408
 ] 

Yonik Seeley commented on SOLR-2100:


Thanks for the patch Peter!  Such a small patch... but I've been trying to 
puzzle out all of the possible ramifications (and going back to puzzle through 
some of the replication code).

saveCommitPoint() (which obviously did nothing before this) is called in the 
postCommit (and postOptimize) events.
This doesn't even seem necessary for replication, since SolrDeletionPolicy 
always saves the last commit point and the last optimized point (if configured 
to do so, or if replicate on optimize is configured).  Once replication has 
started, a reservation scheme is used rather than saving a commit point forever.

Also, if one configures replication onCommit and onOptimize, then the event 
callback code has some bugs: both refer to indexCommitPoint, and close the 
previous one.  So if we did rely on saveCommitPoint, a commit after an optimize 
would release the optimized commit point.

 Fix for saving commit points during java-based backups
 --

 Key: SOLR-2100
 URL: https://issues.apache.org/jira/browse/SOLR-2100
 Project: Solr
  Issue Type: Bug
  Components: replication (java)
Affects Versions: 1.4, 1.4.1
Reporter: Peter Sturge
Priority: Minor
 Fix For: 1.4.2

 Attachments: SOLR-2100.PATCH

   Original Estimate: 0h
  Remaining Estimate: 0h

 This patch fixes the saving of commit points during backup operations.
 This fixes the previously committed (for 1.4) SOLR-1475 patch.
 1. In IndexDeletionPolicyWrapper.java, commit points are not saved to the 
 'savedCommits' map.
 2. Also, the testing of the presence of a commit point uses the contains() 
 method instead of containsKey().
 The result is that backups for anything but toy indexes fail, because the 
 commit points are deleted (after 10s) before the full backup is completed.
 This patch addresses these 2 issues.
 Tested with 1.4.1 release trunk, but should also work fine with 1.4.
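
For clarity, a hypothetical Java sketch of the two fixes described above; the 
names come from the issue text, and the real fields and signatures in 
IndexDeletionPolicyWrapper may differ.

{noformat:title=sketch of the two fixes (hypothetical)}
private final Map<Long, Long> savedCommits = new ConcurrentHashMap<Long, Long>();

void saveCommitPoint(Long indexCommitGen) {
  // fix 1: actually record the commit point in the savedCommits map
  savedCommits.put(indexCommitGen, System.currentTimeMillis());
}

boolean isCommitPointSaved(Long indexCommitGen) {
  // fix 2: test the keys; contains() on a Hashtable-style map tests values
  return savedCommits.containsKey(indexCommitGen);
}
{noformat}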

-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.


-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] Created: (SOLR-2103) Internal fields of a compound type should not be returned to user by default

2010-09-05 Thread Lance Norskog (JIRA)
Internal fields of a compound type should not be returned to user by default


 Key: SOLR-2103
 URL: https://issues.apache.org/jira/browse/SOLR-2103
 Project: Solr
  Issue Type: Improvement
  Components: Schema and Analysis, search
Reporter: Lance Norskog


A search on records with a compound type (location in my use case) with 
'fl=*' returns the internal fields (location_0_d,location_1_d) along with 
'location'.

These internal fields are implementation details and should not be returned 
with the wildcard field set. These fields should be visible if the fl= 
parameter specifically asks for them. This syntax would make sense: 
fl=*,location_0_d,location_1_d. Meaning, return all the fields described in 
the schema, and also return the internal fields.


-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.


-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] Commented: (LUCENE-2573) Tiered flushing of DWPTs by RAM with low/high water marks

2010-09-05 Thread Jason Rutherglen (JIRA)

[ 
https://issues.apache.org/jira/browse/LUCENE-2573?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=12906427#action_12906427
 ] 

Jason Rutherglen commented on LUCENE-2573:
--

It looks like StoredFieldsWriter is reused after flushing a DWPT; however, we're 
not resetting isClosed.

 Tiered flushing of DWPTs by RAM with low/high water marks
 -

 Key: LUCENE-2573
 URL: https://issues.apache.org/jira/browse/LUCENE-2573
 Project: Lucene - Java
  Issue Type: Improvement
Reporter: Michael Busch
Assignee: Michael Busch
Priority: Minor
 Fix For: Realtime Branch


 Now that we have DocumentsWriterPerThreads we need to track total consumed 
 RAM across all DWPTs.
 A flushing strategy idea that was discussed in LUCENE-2324 was to use a 
 tiered approach:  
 - Flush the first DWPT at a low water mark (e.g. at 90% of allowed RAM)
 - Flush all DWPTs at a high water mark (e.g. at 110%)
 - Use linear steps in between high and low watermark:  E.g. when 5 DWPTs are 
 used, flush at 90%, 95%, 100%, 105% and 110%.
 Should we allow the user to configure the low and high water mark values 
 explicitly using total values (e.g. low water mark at 120MB, high water mark 
 at 140MB)?  Or shall we keep for simplicity the single setRAMBufferSizeMB() 
 config method and use something like 90% and 110% for the water marks?
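
For illustration, a minimal sketch of the linear interpolation described in the 
bullets above (a hypothetical helper, not code from the branch):

{noformat:title=tiered flush thresholds (sketch)}
// With n active DWPTs, the i-th flush trigger (0-based) is interpolated
// linearly between the low (90%) and high (110%) water marks.
static double flushTriggerMB(double ramBufferSizeMB, int numDWPTs, int rank) {
  double low  = 0.90 * ramBufferSizeMB;
  double high = 1.10 * ramBufferSizeMB;
  if (numDWPTs <= 1) {
    return low;
  }
  return low + (high - low) * rank / (numDWPTs - 1);
}
// e.g. ramBufferSizeMB = 100, numDWPTs = 5 -> triggers at 90, 95, 100, 105, 110 MB
{noformat}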

-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.


-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org