Re: jhighlight-1.0 contains LGPL-only files
You are right - both projects need to remove it, although it might be easier to work with Tika to fix that and then upgrade again.

Upayavira

On Fri, Mar 20, 2015, at 05:26 AM, Shai Erera wrote:
Sorry for the spam, just wanted to note that this dependency was added by Steve in SOLR-6130 to resolve an improper Tika 1.4-1.5 upgrade. The core issue lies with Tika IMO (they shouldn't rely on LGPL code either, I believe), but I am not sure whether it's OK that we distribute this .jar ourselves.

Shai

On Fri, Mar 20, 2015 at 7:17 AM, Shai Erera ser...@gmail.com wrote:
One update: I did find that this dependency is explicitly set in solr/contrib/extraction/ivy.xml, under the Tika dependencies section:

<!-- Tika dependencies - see http://tika.apache.org/1.3/gettingstarted.html#Using_Tika_as_a_Maven_dependency -->
<!-- When upgrading Tika, upgrade dependencies versions and add any new ones
     (except slf4j-api, commons-codec, commons-logging, commons-httpclient,
     geronimo-stax-api_1.0_spec, jcip-annotations, xml-apis, asm)
     WARNING: Don't add netcdf / unidataCommon (partially LGPL code) -->
...
<dependency org="com.uwyn" name="jhighlight" rev="${/com.uwyn/jhighlight}" conf="compile"/>

So it does seem to be needed by Tika only, and I guess it's a runtime dependency, so if we don't want to release this LGPL library, we can omit it and put a section in the NOTICE file?

Shai

On Fri, Mar 20, 2015 at 7:11 AM, Shai Erera ser...@gmail.com wrote:
Hi

Solr's contrib/extraction contains jhighlight-1.0.jar, which declares itself as dual CDDL or LGPL licensed. However, some of its classes are distributed only under LGPL, e.g. in com.uwyn.jhighlight.highlighter:

CppHighlighter.java
GroovyHighlighter.java
JavaHighlighter.java
XmlHighlighter.java

I downloaded the sources from Maven (http://search.maven.org/remotecontent?filepath=com/uwyn/jhighlight/1.0/jhighlight-1.0-sources.jar) to confirm that, and also found this SVN repo: http://svn.rifers.org/jhighlight/tags/release-1.0, though the project's website no longer seems to exist (https://jhighlight.dev.java.net/). I didn't find any direct usage of it in our code, so I guess it's probably needed by a 3rd-party dependency, such as Tika. Therefore if we e.g. omit it, things will compile, but may fail at runtime.

Is it OK that we distribute this .jar?

Shai

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org
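If we go the route of omitting the jar, one possible sketch of how that could look in solr/contrib/extraction/ivy.xml is a dependency-level exclude. Note this is illustrative only: the tika-parsers coordinates and property name below are assumptions, not copied from the actual file.

```xml
<!-- Sketch only: pull in Tika's parsers while keeping the LGPL-only
     jhighlight jar out of the resolved classpath. -->
<dependency org="org.apache.tika" name="tika-parsers" rev="${/org.apache.tika/tika-parsers}" conf="compile">
  <exclude org="com.uwyn" module="jhighlight"/>
</dependency>
```

Ivy supports per-dependency `<exclude>` elements, so the transitive jhighlight artifact would simply not be retrieved; Tika's highlighting-dependent code paths would then fail only if exercised at runtime, matching the behavior described above.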
[JENKINS] Lucene-Solr-5.x-Windows (64bit/jdk1.7.0_76) - Build # 4452 - Failure!
Build: http://jenkins.thetaphi.de/job/Lucene-Solr-5.x-Windows/4452/
Java: 64bit/jdk1.7.0_76 -XX:+UseCompressedOops -XX:+UseConcMarkSweepGC

1 tests failed.
FAILED: org.apache.solr.core.TestArbitraryIndexDir.testLoadNewIndexDir

Error Message:
Exception during query

Stack Trace:
java.lang.RuntimeException: Exception during query
        at __randomizedtesting.SeedInfo.seed([558F9FCDB1250F7B:BCD524F52FBC9FD3]:0)
        at org.apache.solr.SolrTestCaseJ4.assertQ(SolrTestCaseJ4.java:794)
        at org.apache.solr.core.TestArbitraryIndexDir.testLoadNewIndexDir(TestArbitraryIndexDir.java:128)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1627)
        at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:836)
        at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:872)
        at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:886)
        at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:53)
        at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:50)
        at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
        at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:49)
        at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
        at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
        at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
        at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
        at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:798)
        at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:458)
        at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:845)
        at com.carrotsearch.randomizedtesting.RandomizedRunner$3.evaluate(RandomizedRunner.java:747)
        at com.carrotsearch.randomizedtesting.RandomizedRunner$4.evaluate(RandomizedRunner.java:781)
        at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:792)
        at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
        at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:53)
        at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
        at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
        at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:42)
        at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
        at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
        at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
        at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
        at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
        at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:54)
        at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
        at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
        at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:55)
        at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
        at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
        at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.RuntimeException: REQUEST FAILED: xpath=*[count(//doc)=1]
        xml response was: <?xml version="1.0" encoding="UTF-8"?>
<response>
<lst name="responseHeader"><int name="status">0</int><int name="QTime">1</int></lst><result name="response" numFound="0" start="0"></result>
</response>

        request
Re: jhighlight-1.0 contains LGPL-only files
I have created a ticket: TIKA-1581. ManifoldCF also has a Tika dependency, so thank you for noting the problem.

Karl

On Fri, Mar 20, 2015 at 4:03 AM, Upayavira u...@odoko.co.uk wrote:
You are right - both projects need to remove it, although it might be easier to work with Tika to fix that and then upgrade again.

Upayavira
[jira] [Commented] (LUCENE-6226) Add interval iterators to Scorer
[ https://issues.apache.org/jira/browse/LUCENE-6226?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14371038#comment-14371038 ]

Alan Woodward commented on LUCENE-6226:
---------------------------------------

At the moment, IntervalScorer uses the same algorithm as SpanScorer - calculate a sloppy freq using the width of each top-level interval, sum them up, and pass that to SimScorer.score(). It would be interesting to look at other possibilities, but I think that should be done in a separate issue?

> Add interval iterators to Scorer
> --------------------------------
>                 Key: LUCENE-6226
>                 URL: https://issues.apache.org/jira/browse/LUCENE-6226
>             Project: Lucene - Core
>          Issue Type: Improvement
>            Reporter: Alan Woodward
>            Assignee: Alan Woodward
>             Fix For: Trunk, 5.1
>         Attachments: LUCENE-6226.patch, LUCENE-6226.patch, LUCENE-6226.patch, LUCENE-6226.patch, LUCENE-6226.patch, LUCENE-6226.patch, LUCENE-6226.patch
>
> This change will allow Scorers to expose which positions within a document they have matched, via a new IntervalIterator interface. Consumers get the iterator by calling intervals() on the Scorer, then call reset(docId) whenever the scorer has advanced and nextInterval() to iterate through positions. Once all matching intervals on the current document have been exhausted, nextInterval() returns false.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
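The consumption pattern described in the issue (get an iterator, reset(docId) after the scorer advances, loop on nextInterval()) can be sketched with a toy stand-in. The IntervalIterator interface below is written for illustration from the description above, not copied from the attached patches, so the real names and signatures may differ:

```java
import java.util.Arrays;
import java.util.List;

// Illustrative stand-in for the proposed API: reset(docId) positions the
// iterator on a document, nextInterval() steps through matches and returns
// false once the document's intervals are exhausted.
interface IntervalIterator {
    void reset(int docId);
    boolean nextInterval();
    int start();
    int end();
}

// Toy implementation backed by a fixed list of [start, end] pairs.
class ListIntervalIterator implements IntervalIterator {
    private final List<int[]> intervals;
    private int upto;

    ListIntervalIterator(List<int[]> intervals) { this.intervals = intervals; }

    @Override public void reset(int docId) { upto = -1; }
    @Override public boolean nextInterval() { return ++upto < intervals.size(); }
    @Override public int start() { return intervals.get(upto)[0]; }
    @Override public int end() { return intervals.get(upto)[1]; }
}

public class IntervalDemo {
    // Consumer loop: after the scorer has advanced to docId, reset and iterate.
    static int countIntervals(IntervalIterator it, int docId) {
        it.reset(docId);
        int count = 0;
        while (it.nextInterval()) {
            count++;
        }
        return count;
    }

    public static void main(String[] args) {
        IntervalIterator it = new ListIntervalIterator(
            Arrays.asList(new int[]{0, 2}, new int[]{5, 7}));
        System.out.println(countIntervals(it, 42)); // prints 2
    }
}
```

A sloppy-freq scorer in the SpanScorer style would accumulate a weight per interval inside that loop (e.g. based on end() - start()) instead of just counting.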
[jira] [Commented] (SOLR-7256) Multiple data dirs
[ https://issues.apache.org/jira/browse/SOLR-7256?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14371178#comment-14371178 ]

Hari Sekhon commented on SOLR-7256:
-----------------------------------

Btw Elasticsearch has multiple data dirs, so I replaced my SolrCloud deployment with Elasticsearch yesterday as it solved this data distribution issue and other issues around scaling.

> Multiple data dirs
> ------------------
>                 Key: SOLR-7256
>                 URL: https://issues.apache.org/jira/browse/SOLR-7256
>             Project: Solr
>          Issue Type: New Feature
>    Affects Versions: 4.10.3
>         Environment: HDP 2.2 / HDP Search
>            Reporter: Hari Sekhon
>
> Request to support multiple dataDirs, as indexing a large collection fills up only one of many disks in modern servers (think colocating on Hadoop servers with many disks). While HDFS is another alternative, it results in poor performance and index corruption under high online indexing loads (SOLR-7255). While it should be possible to use multiple cores with different dataDirs, that could be very difficult to manage and would not scale well for humans, so I think Solr should support use of multiple dataDirs natively.
> Regards,
> Hari Sekhon
> http://www.linkedin.com/in/harisekhon

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
[JENKINS] Lucene-Solr-5.x-Linux (64bit/jdk1.8.0_40) - Build # 11859 - Failure!
Build: http://jenkins.thetaphi.de/job/Lucene-Solr-5.x-Linux/11859/
Java: 64bit/jdk1.8.0_40 -XX:+UseCompressedOops -XX:+UseParallelGC

1 tests failed.
FAILED: org.apache.solr.cloud.FullSolrCloudDistribCmdsTest.test

Error Message:
Error from server at http://127.0.0.1:56868/_tvx/cs/compositeid_collection_with_routerfield_shard1_replica1: no servers hosting shard:

Stack Trace:
org.apache.solr.client.solrj.impl.HttpSolrClient$RemoteSolrException: Error from server at http://127.0.0.1:56868/_tvx/cs/compositeid_collection_with_routerfield_shard1_replica1: no servers hosting shard:
        at __randomizedtesting.SeedInfo.seed([59DCBBDF2EFCDA62:D18884058000B79A]:0)
        at org.apache.solr.client.solrj.impl.HttpSolrClient.executeMethod(HttpSolrClient.java:584)
        at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:236)
        at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:228)
        at org.apache.solr.client.solrj.SolrRequest.process(SolrRequest.java:135)
        at org.apache.solr.client.solrj.SolrClient.query(SolrClient.java:943)
        at org.apache.solr.client.solrj.SolrClient.query(SolrClient.java:958)
        at org.apache.solr.cloud.FullSolrCloudDistribCmdsTest.testDeleteByIdCompositeRouterWithRouterField(FullSolrCloudDistribCmdsTest.java:357)
        at org.apache.solr.cloud.FullSolrCloudDistribCmdsTest.test(FullSolrCloudDistribCmdsTest.java:146)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:497)
        at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1627)
        at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:836)
        at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:872)
        at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:886)
        at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsFixedStatement.callStatement(BaseDistributedSearchTestCase.java:958)
        at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsStatement.evaluate(BaseDistributedSearchTestCase.java:933)
        at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:53)
        at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:50)
        at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
        at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:49)
        at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
        at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
        at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
        at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
        at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:798)
        at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:458)
        at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:845)
        at com.carrotsearch.randomizedtesting.RandomizedRunner$3.evaluate(RandomizedRunner.java:747)
        at com.carrotsearch.randomizedtesting.RandomizedRunner$4.evaluate(RandomizedRunner.java:781)
        at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:792)
        at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
        at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:53)
        at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
        at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
        at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:42)
        at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
        at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
        at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
        at
Re: [JENKINS] Lucene-Solr-5.x-Linux (32bit/jdk1.9.0-ea-b54) - Build # 11848 - Failure!
Thanks Uwe, we will update you with the bug id.

Rgds, Rory

On 19/03/2015 16:28, Uwe Schindler wrote:
Hi,

I opened Review ID: JI-9019884 "Java 9 b54 breaks compiling code with source/target 1.7 and diamond operator"

Uwe

-----
Uwe Schindler
H.-H.-Meier-Allee 63, D-28213 Bremen
http://www.thetaphi.de
eMail: u...@thetaphi.de

-----Original Message-----
From: Rory O'Donnell [mailto:rory.odonn...@oracle.com]
Sent: Thursday, March 19, 2015 4:12 PM
To: Uwe Schindler; dev@lucene.apache.org
Cc: rory.odonn...@oracle.com; Dalibor Topic; Balchandra Vaidya
Subject: Re: [JENKINS] Lucene-Solr-5.x-Linux (32bit/jdk1.9.0-ea-b54) - Build # 11848 - Failure!

On 19/03/2015 14:30, Uwe Schindler wrote:
Hi,

this seems to be a bug (or feature?) in the most recent Java 9 build 54:

compile-core:
    [mkdir] Created dir: /home/jenkins/workspace/Lucene-Solr-5.x-Linux/lucene/build/analysis/common/classes/java
    [javac] Compiling 461 source files to /home/jenkins/workspace/Lucene-Solr-5.x-Linux/lucene/build/analysis/common/classes/java
    [javac] /home/jenkins/workspace/Lucene-Solr-5.x-Linux/lucene/analysis/common/src/java/org/apache/lucene/analysis/util/CharArrayMap.java:568: error: incompatible types: CharArrayMap<CAP#1> cannot be converted to CharArrayMap<V>
    [javac]     return new CharArrayMap<>(map, false);
    [javac]            ^
    [javac]   where V is a type-variable:
    [javac]     V extends Object declared in method <V>copy(Map<?,? extends V>)
    [javac]   where CAP#1 is a fresh type-variable:
    [javac]     CAP#1 extends V from capture of ? extends V

This is the code:

@SuppressWarnings("unchecked")
public static <V> CharArrayMap<V> copy(final Map<?,? extends V> map) {
  if (map == EMPTY_MAP)
    return emptyMap();
  if (map instanceof CharArrayMap) {
    CharArrayMap<V> m = (CharArrayMap<V>) map;
    // use fast path instead of iterating all values
    // this is even on very small sets ~10 times faster than iterating
    final char[][] keys = new char[m.keys.length][];
    System.arraycopy(m.keys, 0, keys, 0, keys.length);
    final V[] values = (V[]) new Object[m.values.length];
    System.arraycopy(m.values, 0, values, 0, values.length);
    m = new CharArrayMap<>(m);
    m.keys = keys;
    m.values = values;
    return m;
  }
  return new CharArrayMap<>(map, false);
}

At least this breaks compiling existing code. Rory, should I open a bug report with an example code?

Hi Uwe,

Please do log a bug.

Rgds, Rory

Uwe

-----
Uwe Schindler
H.-H.-Meier-Allee 63, D-28213 Bremen
http://www.thetaphi.de
eMail: u...@thetaphi.de

-----Original Message-----
From: Policeman Jenkins Server [mailto:jenk...@thetaphi.de]
Sent: Thursday, March 19, 2015 1:15 PM
To: dev@lucene.apache.org
Subject: [JENKINS] Lucene-Solr-5.x-Linux (32bit/jdk1.9.0-ea-b54) - Build # 11848 - Failure!

Build: http://jenkins.thetaphi.de/job/Lucene-Solr-5.x-Linux/11848/
Java: 32bit/jdk1.9.0-ea-b54 -server -XX:+UseConcMarkSweepGC

All tests passed

Build Log:
[...truncated 1899 lines...]
    [javac] Compiling 461 source files to /home/jenkins/workspace/Lucene-Solr-5.x-Linux/lucene/build/analysis/common/classes/java
    [javac] /home/jenkins/workspace/Lucene-Solr-5.x-Linux/lucene/analysis/common/src/java/org/apache/lucene/analysis/util/CharArrayMap.java:568: error: incompatible types: CharArrayMap<CAP#1> cannot be converted to CharArrayMap<V>
    [javac]     return new CharArrayMap<>(map, false);
    [javac]            ^
    [javac]   where V is a type-variable:
    [javac]     V extends Object declared in method <V>copy(Map<?,? extends V>)
    [javac]   where CAP#1 is a fresh type-variable:
    [javac]     CAP#1 extends V from capture of ? extends V
    [javac] /home/jenkins/workspace/Lucene-Solr-5.x-Linux/lucene/analysis/common/src/java/org/apache/lucene/analysis/hunspell/Stemmer.java:270: warning: [rawtypes] found raw type: Arc
    [javac]     final FST.Arc<IntsRef> prefixArcs[] = new FST.Arc[3];
    [javac]                                           ^
    [javac]   missing type arguments for generic class Arc<T>
    [javac]   where T is a type-variable:
    [javac]     T extends Object declared in class Arc
    [javac] /home/jenkins/workspace/Lucene-Solr-5.x-Linux/lucene/analysis/common/src/java/org/apache/lucene/analysis/hunspell/Stemmer.java:274: warning: [rawtypes] found raw type: Arc
    [javac]     final FST.Arc<IntsRef> suffixArcs[] = new FST.Arc[3];
    [javac]                                           ^
    [javac]   missing type arguments for generic class Arc<T>
    [javac]   where T is a type-variable:
    [javac]     T extends Object declared in class Arc
    [javac] /home/jenkins/workspace/Lucene-Solr-5.x-Linux/lucene/analysis/common/src/java/org/tartarus/snowball/Among.java:46: warning: [rawtypes] found raw type: Class
    [javac]     private static final Class<?>[] EMPTY_PARAMS = new Class[0];
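As a minimal example for the bug report Uwe mentions, the failing shape can be reduced to a generic factory method that returns a diamond-constructed instance from a wildcard-typed argument. Box and its members below are hypothetical names standing in for CharArrayMap and its copy() method; JDK 7/8 javac accepts this, while the report above says 9 b54 rejected it with a capture-conversion error:

```java
import java.util.HashMap;
import java.util.Map;

// Minimal sketch of the CharArrayMap.copy() pattern: a generic factory that
// takes a wildcard map and constructs the result with the diamond operator.
class Box<V> {
    final Map<?, ? extends V> backing;

    Box(Map<?, ? extends V> map) {
        this.backing = map;
    }

    // Mirrors the shape of <V> CharArrayMap<V> copy(Map<?,? extends V>):
    // the diamond must be inferred as Box<V>, not as a capture variable CAP#1.
    static <V> Box<V> copy(Map<?, ? extends V> map) {
        return new Box<>(map);
    }
}

public class DiamondCaptureDemo {
    public static void main(String[] args) {
        Map<String, Integer> m = new HashMap<>();
        m.put("a", 1);
        Box<Integer> box = Box.copy(m);
        System.out.println(box.backing.size()); // prints 1
    }
}
```

Replacing the diamond with an explicit type argument (new Box<V>(map)) sidesteps the inference question entirely, which is one workaround if the javac behavior turns out to be intentional.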
[JENKINS] Lucene-Solr-5.x-MacOSX (64bit/jdk1.8.0) - Build # 2028 - Failure!
Build: http://jenkins.thetaphi.de/job/Lucene-Solr-5.x-MacOSX/2028/
Java: 64bit/jdk1.8.0 -XX:-UseCompressedOops -XX:+UseG1GC

1 tests failed.
FAILED: org.apache.solr.search.function.TestFunctionQuery.testPseudoFieldFunctions

Error Message:
unexpected map keys [e] @ response/docs/[0]

Stack Trace:
java.lang.RuntimeException: unexpected map keys [e] @ response/docs/[0]
        at __randomizedtesting.SeedInfo.seed([2BA203EDCC552176:7598E190F0C3DD3A]:0)
        at org.apache.solr.SolrTestCaseJ4.assertJQ(SolrTestCaseJ4.java:882)
        at org.apache.solr.SolrTestCaseJ4.assertJQ(SolrTestCaseJ4.java:829)
        at org.apache.solr.search.function.TestFunctionQuery.testPseudoFieldFunctions(TestFunctionQuery.java:739)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:497)
        at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1627)
        at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:836)
        at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:872)
        at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:886)
        at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:53)
        at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:50)
        at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
        at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:49)
        at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
        at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
        at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
        at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
        at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:798)
        at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:458)
        at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:845)
        at com.carrotsearch.randomizedtesting.RandomizedRunner$3.evaluate(RandomizedRunner.java:747)
        at com.carrotsearch.randomizedtesting.RandomizedRunner$4.evaluate(RandomizedRunner.java:781)
        at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:792)
        at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
        at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:53)
        at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
        at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
        at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:42)
        at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
        at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
        at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
        at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
        at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
        at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:54)
        at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
        at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
        at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:55)
        at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
        at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
        at java.lang.Thread.run(Thread.java:745)

Build Log:
[...truncated 20124 lines...]
   [junit4] Suite: org.apache.solr.search.function.TestFunctionQuery
   [junit4]   2> Creating dataDir:
[JENKINS] Lucene-Solr-5.x-Linux (64bit/jdk1.9.0-ea-b54) - Build # 11865 - Still Failing!
Build: http://jenkins.thetaphi.de/job/Lucene-Solr-5.x-Linux/11865/
Java: 64bit/jdk1.9.0-ea-b54 -XX:-UseCompressedOops -XX:+UseSerialGC

1 tests failed.
FAILED: org.apache.solr.search.function.TestFunctionQuery.testPseudoFieldFunctions

Error Message:
unexpected map keys [e] @ response/docs/[0]

Stack Trace:
java.lang.RuntimeException: unexpected map keys [e] @ response/docs/[0]
        at __randomizedtesting.SeedInfo.seed([F76E6565B053A8EA:A95487188CC554A6]:0)
        at org.apache.solr.SolrTestCaseJ4.assertJQ(SolrTestCaseJ4.java:882)
        at org.apache.solr.SolrTestCaseJ4.assertJQ(SolrTestCaseJ4.java:829)
        at org.apache.solr.search.function.TestFunctionQuery.testPseudoFieldFunctions(TestFunctionQuery.java:739)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:502)
        at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1627)
        at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:836)
        at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:872)
        at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:886)
        at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:53)
        at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:50)
        at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
        at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:49)
        at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
        at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
        at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
        at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
        at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:798)
        at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:458)
        at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:845)
        at com.carrotsearch.randomizedtesting.RandomizedRunner$3.evaluate(RandomizedRunner.java:747)
        at com.carrotsearch.randomizedtesting.RandomizedRunner$4.evaluate(RandomizedRunner.java:781)
        at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:792)
        at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
        at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:53)
        at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
        at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
        at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:42)
        at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
        at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
        at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
        at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
        at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
        at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:54)
        at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
        at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
        at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:55)
        at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
        at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
        at java.lang.Thread.run(Thread.java:745)

Build Log:
[...truncated 9908 lines...]
   [junit4] Suite: org.apache.solr.search.function.TestFunctionQuery
   [junit4]   2> Creating dataDir:
[JENKINS] Lucene-Solr-trunk-Linux (64bit/jdk1.8.0_60-ea-b06) - Build # 12029 - Still Failing!
Build: http://jenkins.thetaphi.de/job/Lucene-Solr-trunk-Linux/12029/
Java: 64bit/jdk1.8.0_60-ea-b06 -XX:+UseCompressedOops -XX:+UseSerialGC

1 tests failed.
FAILED:  org.apache.solr.search.function.TestFunctionQuery.testPseudoFieldFunctions

Error Message:
unexpected map keys [e] @ response/docs/[0]

Stack Trace:
java.lang.RuntimeException: unexpected map keys [e] @ response/docs/[0]
  at __randomizedtesting.SeedInfo.seed([E920AC7E46B8C7CB:B71A4E037A2E3B87]:0)
  at org.apache.solr.SolrTestCaseJ4.assertJQ(SolrTestCaseJ4.java:882)
  at org.apache.solr.SolrTestCaseJ4.assertJQ(SolrTestCaseJ4.java:829)
  at org.apache.solr.search.function.TestFunctionQuery.testPseudoFieldFunctions(TestFunctionQuery.java:739)
  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  at java.lang.reflect.Method.invoke(Method.java:497)
  at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1627)
  at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:836)
  at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:872)
  at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:886)
  at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:53)
  at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:50)
  at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
  at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:49)
  at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
  at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
  at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
  at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
  at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:798)
  at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:458)
  at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:845)
  at com.carrotsearch.randomizedtesting.RandomizedRunner$3.evaluate(RandomizedRunner.java:747)
  at com.carrotsearch.randomizedtesting.RandomizedRunner$4.evaluate(RandomizedRunner.java:781)
  at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:792)
  at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
  at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:53)
  at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
  at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
  at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:42)
  at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
  at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
  at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
  at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
  at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
  at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:54)
  at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
  at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
  at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:55)
  at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
  at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
  at java.lang.Thread.run(Thread.java:745)

Build Log:
[...truncated 9506 lines...]
   [junit4] Suite: org.apache.solr.search.function.TestFunctionQuery
   [junit4]   2> Creating dataDir:
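Every one of these testPseudoFieldFunctions failures reports the same mismatch: assertJQ found a key "e" in the first returned document that the expected JSON did not list. As a rough, hypothetical illustration of that kind of check, and not the actual SolrTestCaseJ4 implementation, one can diff the key sets of the expected and actual documents:

```java
import java.util.*;

public class JsonKeyCheck {
    // Return keys present in 'actual' but absent from 'expected' -- a loose
    // analogue of the "unexpected map keys" comparison; illustrative only.
    static Set<String> unexpectedKeys(Map<String, Object> expected, Map<String, Object> actual) {
        Set<String> extra = new TreeSet<>(actual.keySet());
        extra.removeAll(expected.keySet());
        return extra;
    }

    public static void main(String[] args) {
        Map<String, Object> expected = new HashMap<>();
        expected.put("id", "1");
        Map<String, Object> actual = new HashMap<>(expected);
        actual.put("e", 2.718); // an extra pseudo-field, like the [e] key in the failures above
        System.out.println("unexpected map keys " + unexpectedKeys(expected, actual));
    }
}
```

In the real failure the extra key is a pseudo-field named "e" in response/docs/[0], which suggests a recent change to pseudo-field function handling rather than an environment problem, since the same seed-independent error shows up on Linux, Windows, and MacOSX.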
[JENKINS] Lucene-Solr-5.x-Windows (64bit/jdk1.8.0_40) - Build # 4455 - Failure!
Build: http://jenkins.thetaphi.de/job/Lucene-Solr-5.x-Windows/4455/
Java: 64bit/jdk1.8.0_40 -XX:+UseCompressedOops -XX:+UseG1GC

2 tests failed.
FAILED:  junit.framework.TestSuite.org.apache.solr.core.TestSolrConfigHandler

Error Message:
Could not remove the following files (in the order of attempts):
   C:\Users\JenkinsSlave\workspace\Lucene-Solr-5.x-Windows\solr\build\solr-core\test\J0\temp\solr.core.TestSolrConfigHandler 3184DDE6E629C6E6-001\tempDir-010\collection1\conf\params.json: java.nio.file.FileSystemException: C:\Users\JenkinsSlave\workspace\Lucene-Solr-5.x-Windows\solr\build\solr-core\test\J0\temp\solr.core.TestSolrConfigHandler 3184DDE6E629C6E6-001\tempDir-010\collection1\conf\params.json: The process cannot access the file because it is being used by another process.
   C:\Users\JenkinsSlave\workspace\Lucene-Solr-5.x-Windows\solr\build\solr-core\test\J0\temp\solr.core.TestSolrConfigHandler 3184DDE6E629C6E6-001\tempDir-010\collection1\conf: java.nio.file.DirectoryNotEmptyException: C:\Users\JenkinsSlave\workspace\Lucene-Solr-5.x-Windows\solr\build\solr-core\test\J0\temp\solr.core.TestSolrConfigHandler 3184DDE6E629C6E6-001\tempDir-010\collection1\conf
   C:\Users\JenkinsSlave\workspace\Lucene-Solr-5.x-Windows\solr\build\solr-core\test\J0\temp\solr.core.TestSolrConfigHandler 3184DDE6E629C6E6-001\tempDir-010\collection1: java.nio.file.DirectoryNotEmptyException: C:\Users\JenkinsSlave\workspace\Lucene-Solr-5.x-Windows\solr\build\solr-core\test\J0\temp\solr.core.TestSolrConfigHandler 3184DDE6E629C6E6-001\tempDir-010\collection1
   C:\Users\JenkinsSlave\workspace\Lucene-Solr-5.x-Windows\solr\build\solr-core\test\J0\temp\solr.core.TestSolrConfigHandler 3184DDE6E629C6E6-001\tempDir-010: java.nio.file.DirectoryNotEmptyException: C:\Users\JenkinsSlave\workspace\Lucene-Solr-5.x-Windows\solr\build\solr-core\test\J0\temp\solr.core.TestSolrConfigHandler 3184DDE6E629C6E6-001\tempDir-010
   C:\Users\JenkinsSlave\workspace\Lucene-Solr-5.x-Windows\solr\build\solr-core\test\J0\temp\solr.core.TestSolrConfigHandler 3184DDE6E629C6E6-001: java.nio.file.DirectoryNotEmptyException: C:\Users\JenkinsSlave\workspace\Lucene-Solr-5.x-Windows\solr\build\solr-core\test\J0\temp\solr.core.TestSolrConfigHandler 3184DDE6E629C6E6-001

Stack Trace:
java.io.IOException: Could not remove the following files (in the order of attempts):
   C:\Users\JenkinsSlave\workspace\Lucene-Solr-5.x-Windows\solr\build\solr-core\test\J0\temp\solr.core.TestSolrConfigHandler 3184DDE6E629C6E6-001\tempDir-010\collection1\conf\params.json: java.nio.file.FileSystemException: C:\Users\JenkinsSlave\workspace\Lucene-Solr-5.x-Windows\solr\build\solr-core\test\J0\temp\solr.core.TestSolrConfigHandler 3184DDE6E629C6E6-001\tempDir-010\collection1\conf\params.json: The process cannot access the file because it is being used by another process.
   C:\Users\JenkinsSlave\workspace\Lucene-Solr-5.x-Windows\solr\build\solr-core\test\J0\temp\solr.core.TestSolrConfigHandler 3184DDE6E629C6E6-001\tempDir-010\collection1\conf: java.nio.file.DirectoryNotEmptyException: C:\Users\JenkinsSlave\workspace\Lucene-Solr-5.x-Windows\solr\build\solr-core\test\J0\temp\solr.core.TestSolrConfigHandler 3184DDE6E629C6E6-001\tempDir-010\collection1\conf
   C:\Users\JenkinsSlave\workspace\Lucene-Solr-5.x-Windows\solr\build\solr-core\test\J0\temp\solr.core.TestSolrConfigHandler 3184DDE6E629C6E6-001\tempDir-010\collection1: java.nio.file.DirectoryNotEmptyException: C:\Users\JenkinsSlave\workspace\Lucene-Solr-5.x-Windows\solr\build\solr-core\test\J0\temp\solr.core.TestSolrConfigHandler 3184DDE6E629C6E6-001\tempDir-010\collection1
   C:\Users\JenkinsSlave\workspace\Lucene-Solr-5.x-Windows\solr\build\solr-core\test\J0\temp\solr.core.TestSolrConfigHandler 3184DDE6E629C6E6-001\tempDir-010: java.nio.file.DirectoryNotEmptyException: C:\Users\JenkinsSlave\workspace\Lucene-Solr-5.x-Windows\solr\build\solr-core\test\J0\temp\solr.core.TestSolrConfigHandler 3184DDE6E629C6E6-001\tempDir-010
   C:\Users\JenkinsSlave\workspace\Lucene-Solr-5.x-Windows\solr\build\solr-core\test\J0\temp\solr.core.TestSolrConfigHandler 3184DDE6E629C6E6-001: java.nio.file.DirectoryNotEmptyException: C:\Users\JenkinsSlave\workspace\Lucene-Solr-5.x-Windows\solr\build\solr-core\test\J0\temp\solr.core.TestSolrConfigHandler 3184DDE6E629C6E6-001
  at __randomizedtesting.SeedInfo.seed([3184DDE6E629C6E6]:0)
  at org.apache.lucene.util.IOUtils.rm(IOUtils.java:294)
  at org.apache.lucene.util.TestRuleTemporaryFilesCleanup.afterAlways(TestRuleTemporaryFilesCleanup.java:200)
  at com.carrotsearch.randomizedtesting.rules.TestRuleAdapter$1.afterAlways(TestRuleAdapter.java:31)
  at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:43)
  at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
  at
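The Windows failure above is a cleanup problem rather than a test-logic one: a file under the temp dir (params.json) is still held open by another process, so its delete fails, and every ancestor directory then fails in turn with DirectoryNotEmptyException. A minimal sketch of a best-effort recursive delete that records each failure, similar in spirit to what IOUtils.rm does here (all names in this snippet are illustrative, not Lucene's actual code):

```java
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.*;
import java.util.*;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class BestEffortRm {
    // Delete a tree depth-first, collecting every failure instead of stopping
    // at the first one (rough analogue of Lucene's IOUtils.rm; illustrative only).
    static Map<Path, IOException> rm(Path root) throws IOException {
        Map<Path, IOException> failures = new LinkedHashMap<>();
        try (Stream<Path> walk = Files.walk(root)) {
            // reverse lexicographic order puts children before their parents
            List<Path> paths = walk.sorted(Comparator.reverseOrder()).collect(Collectors.toList());
            for (Path p : paths) {
                try {
                    Files.deleteIfExists(p);
                } catch (IOException e) {
                    // on Windows this is where FileSystemException ("in use by
                    // another process") and DirectoryNotEmptyException land
                    failures.put(p, e);
                }
            }
        }
        return failures;
    }

    public static void main(String[] args) throws IOException {
        Path dir = Files.createTempDirectory("rm-demo");
        Files.write(dir.resolve("params.json"), "{}".getBytes(StandardCharsets.UTF_8));
        Map<Path, IOException> failures = rm(dir);
        System.out.println("failures=" + failures.size() + " removed=" + !Files.exists(dir));
    }
}
```

On Linux an open file can still be unlinked, which is presumably why this cleanup failure surfaces only on the Windows jobs; the real fix is closing whatever handle keeps params.json open.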
[JENKINS] Lucene-Solr-5.x-Linux (32bit/jdk1.7.0_76) - Build # 11866 - Still Failing!
Build: http://jenkins.thetaphi.de/job/Lucene-Solr-5.x-Linux/11866/
Java: 32bit/jdk1.7.0_76 -client -XX:+UseParallelGC

1 tests failed.
FAILED:  org.apache.solr.search.function.TestFunctionQuery.testPseudoFieldFunctions

Error Message:
unexpected map keys [e] @ response/docs/[0]

Stack Trace:
java.lang.RuntimeException: unexpected map keys [e] @ response/docs/[0]
  at __randomizedtesting.SeedInfo.seed([EE346A9408D44397:B00E88E93442BFDB]:0)
  at org.apache.solr.SolrTestCaseJ4.assertJQ(SolrTestCaseJ4.java:882)
  at org.apache.solr.SolrTestCaseJ4.assertJQ(SolrTestCaseJ4.java:829)
  at org.apache.solr.search.function.TestFunctionQuery.testPseudoFieldFunctions(TestFunctionQuery.java:739)
  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  at java.lang.reflect.Method.invoke(Method.java:606)
  at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1627)
  at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:836)
  at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:872)
  at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:886)
  at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:53)
  at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:50)
  at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
  at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:49)
  at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
  at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
  at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
  at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
  at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:798)
  at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:458)
  at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:845)
  at com.carrotsearch.randomizedtesting.RandomizedRunner$3.evaluate(RandomizedRunner.java:747)
  at com.carrotsearch.randomizedtesting.RandomizedRunner$4.evaluate(RandomizedRunner.java:781)
  at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:792)
  at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
  at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:53)
  at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
  at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
  at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:42)
  at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
  at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
  at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
  at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
  at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
  at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:54)
  at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
  at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
  at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:55)
  at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
  at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
  at java.lang.Thread.run(Thread.java:745)

Build Log:
[...truncated 10029 lines...]
   [junit4] Suite: org.apache.solr.search.function.TestFunctionQuery
   [junit4]   2> Creating dataDir:
[JENKINS] Lucene-Solr-trunk-MacOSX (64bit/jdk1.8.0) - Build # 2072 - Failure!
Build: http://jenkins.thetaphi.de/job/Lucene-Solr-trunk-MacOSX/2072/
Java: 64bit/jdk1.8.0 -XX:-UseCompressedOops -XX:+UseSerialGC

1 tests failed.
FAILED:  org.apache.solr.search.function.TestFunctionQuery.testPseudoFieldFunctions

Error Message:
unexpected map keys [e] @ response/docs/[0]

Stack Trace:
java.lang.RuntimeException: unexpected map keys [e] @ response/docs/[0]
  at __randomizedtesting.SeedInfo.seed([B672401FF0EC5C1A:E848A262CC7AA056]:0)
  at org.apache.solr.SolrTestCaseJ4.assertJQ(SolrTestCaseJ4.java:882)
  at org.apache.solr.SolrTestCaseJ4.assertJQ(SolrTestCaseJ4.java:829)
  at org.apache.solr.search.function.TestFunctionQuery.testPseudoFieldFunctions(TestFunctionQuery.java:739)
  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  at java.lang.reflect.Method.invoke(Method.java:497)
  at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1627)
  at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:836)
  at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:872)
  at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:886)
  at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:53)
  at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:50)
  at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
  at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:49)
  at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
  at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
  at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
  at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
  at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:798)
  at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:458)
  at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:845)
  at com.carrotsearch.randomizedtesting.RandomizedRunner$3.evaluate(RandomizedRunner.java:747)
  at com.carrotsearch.randomizedtesting.RandomizedRunner$4.evaluate(RandomizedRunner.java:781)
  at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:792)
  at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
  at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:53)
  at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
  at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
  at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:42)
  at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
  at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
  at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
  at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
  at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
  at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:54)
  at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
  at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
  at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:55)
  at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
  at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
  at java.lang.Thread.run(Thread.java:745)

Build Log:
[...truncated 20025 lines...]
   [junit4] Suite: org.apache.solr.search.function.TestFunctionQuery
   [junit4]   2> Creating dataDir:
[JENKINS] Lucene-Solr-Tests-5.x-Java7 - Build # 2807 - Still Failing
Build: https://builds.apache.org/job/Lucene-Solr-Tests-5.x-Java7/2807/

4 tests failed.
REGRESSION:  org.apache.solr.search.function.TestFunctionQuery.testPseudoFieldFunctions

Error Message:
unexpected map keys [e] @ response/docs/[0]

Stack Trace:
java.lang.RuntimeException: unexpected map keys [e] @ response/docs/[0]
  at __randomizedtesting.SeedInfo.seed([FD655FAB80F513F4:A35FBDD6BC63EFB8]:0)
  at org.apache.solr.SolrTestCaseJ4.assertJQ(SolrTestCaseJ4.java:882)
  at org.apache.solr.SolrTestCaseJ4.assertJQ(SolrTestCaseJ4.java:829)
  at org.apache.solr.search.function.TestFunctionQuery.testPseudoFieldFunctions(TestFunctionQuery.java:739)
  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  at java.lang.reflect.Method.invoke(Method.java:606)
  at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1627)
  at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:836)
  at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:872)
  at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:886)
  at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:53)
  at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:50)
  at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
  at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:49)
  at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
  at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
  at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
  at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
  at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:798)
  at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:458)
  at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:845)
  at com.carrotsearch.randomizedtesting.RandomizedRunner$3.evaluate(RandomizedRunner.java:747)
  at com.carrotsearch.randomizedtesting.RandomizedRunner$4.evaluate(RandomizedRunner.java:781)
  at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:792)
  at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
  at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:53)
  at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
  at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
  at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:42)
  at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
  at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
  at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
  at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
  at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
  at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:54)
  at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
  at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
  at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:55)
  at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
  at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
  at java.lang.Thread.run(Thread.java:745)

FAILED:  org.apache.solr.cloud.LeaderInitiatedRecoveryOnCommitTest.test

Error Message:
IOException occured when talking to server at: http://127.0.0.1:45323/c8n_1x3_commits_shard1_replica2

Stack Trace:
[JENKINS] Lucene-Solr-trunk-Linux (32bit/jdk1.8.0_40) - Build # 12030 - Still Failing!
Build: http://jenkins.thetaphi.de/job/Lucene-Solr-trunk-Linux/12030/
Java: 32bit/jdk1.8.0_40 -server -XX:+UseSerialGC

1 tests failed.
FAILED:  org.apache.solr.search.function.TestFunctionQuery.testPseudoFieldFunctions

Error Message:
unexpected map keys [e] @ response/docs/[0]

Stack Trace:
java.lang.RuntimeException: unexpected map keys [e] @ response/docs/[0]
  at __randomizedtesting.SeedInfo.seed([8555035729BFEC82:DB6FE12A152910CE]:0)
  at org.apache.solr.SolrTestCaseJ4.assertJQ(SolrTestCaseJ4.java:882)
  at org.apache.solr.SolrTestCaseJ4.assertJQ(SolrTestCaseJ4.java:829)
  at org.apache.solr.search.function.TestFunctionQuery.testPseudoFieldFunctions(TestFunctionQuery.java:739)
  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  at java.lang.reflect.Method.invoke(Method.java:497)
  at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1627)
  at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:836)
  at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:872)
  at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:886)
  at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:53)
  at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:50)
  at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
  at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:49)
  at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
  at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
  at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
  at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
  at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:798)
  at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:458)
  at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:845)
  at com.carrotsearch.randomizedtesting.RandomizedRunner$3.evaluate(RandomizedRunner.java:747)
  at com.carrotsearch.randomizedtesting.RandomizedRunner$4.evaluate(RandomizedRunner.java:781)
  at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:792)
  at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
  at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:53)
  at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
  at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
  at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:42)
  at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
  at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
  at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
  at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
  at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
  at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:54)
  at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
  at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
  at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:55)
  at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
  at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
  at java.lang.Thread.run(Thread.java:745)

Build Log:
[...truncated 9210 lines...]
   [junit4] Suite: org.apache.solr.search.function.TestFunctionQuery
   [junit4]   2> Creating dataDir:
[JENKINS] Lucene-Solr-SmokeRelease-5.x - Build # 246 - Failure
Build: https://builds.apache.org/job/Lucene-Solr-SmokeRelease-5.x/246/

No tests ran.

Build Log:
[...truncated 52204 lines...]
prepare-release-no-sign:
    [mkdir] Created dir: /usr/home/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-5.x/lucene/build/smokeTestRelease/dist
     [copy] Copying 446 files to /usr/home/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-5.x/lucene/build/smokeTestRelease/dist/lucene
     [copy] Copying 245 files to /usr/home/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-5.x/lucene/build/smokeTestRelease/dist/solr
   [smoker] Java 1.7 JAVA_HOME=/home/jenkins/tools/java/latest1.7
   [smoker] NOTE: output encoding is US-ASCII
   [smoker]
   [smoker] Load release URL file:/usr/home/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-5.x/lucene/build/smokeTestRelease/dist/...
   [smoker]
   [smoker] Test Lucene...
   [smoker]   test basics...
   [smoker]   get KEYS
   [smoker]     0.1 MB in 0.01 sec (9.6 MB/sec)
   [smoker]   check changes HTML...
   [smoker]   download lucene-5.1.0-src.tgz...
   [smoker]     28.1 MB in 0.04 sec (675.9 MB/sec)
   [smoker]     verify md5/sha1 digests
   [smoker]   download lucene-5.1.0.tgz...
   [smoker]     64.5 MB in 0.09 sec (691.3 MB/sec)
   [smoker]     verify md5/sha1 digests
   [smoker]   download lucene-5.1.0.zip...
   [smoker]     74.1 MB in 0.14 sec (512.4 MB/sec)
   [smoker]     verify md5/sha1 digests
   [smoker]   unpack lucene-5.1.0.tgz...
   [smoker]     verify JAR metadata/identity/no javax.* or java.* classes...
   [smoker]     test demo with 1.7...
   [smoker]       got 5693 hits for query lucene
   [smoker]     checkindex with 1.7...
   [smoker]     check Lucene's javadoc JAR
   [smoker]   unpack lucene-5.1.0.zip...
   [smoker]     verify JAR metadata/identity/no javax.* or java.* classes...
   [smoker]     test demo with 1.7...
   [smoker]       got 5693 hits for query lucene
   [smoker]     checkindex with 1.7...
   [smoker]     check Lucene's javadoc JAR
   [smoker]   unpack lucene-5.1.0-src.tgz...
   [smoker]     make sure no JARs/WARs in src dist...
   [smoker]     run ant validate
   [smoker]     run tests w/ Java 7 and testArgs='-Dtests.jettyConnector=Socket -Dtests.multiplier=1 -Dtests.slow=false'...
   [smoker]     test demo with 1.7...
   [smoker]       got 209 hits for query lucene
   [smoker]     checkindex with 1.7...
   [smoker]     generate javadocs w/ Java 7...
   [smoker]
   [smoker]   Crawl/parse...
   [smoker]
   [smoker]   Verify...
   [smoker]   confirm all releases have coverage in TestBackwardsCompatibility
   [smoker]     find all past Lucene releases...
   [smoker]     run TestBackwardsCompatibility..
   [smoker]   success!
   [smoker]
   [smoker] Test Solr...
   [smoker]   test basics...
   [smoker]   get KEYS
   [smoker]     0.1 MB in 0.01 sec (10.7 MB/sec)
   [smoker]   check changes HTML...
   [smoker]   download solr-5.1.0-src.tgz...
   [smoker]     35.8 MB in 0.09 sec (383.9 MB/sec)
   [smoker]     verify md5/sha1 digests
   [smoker]   download solr-5.1.0.tgz...
   [smoker]     123.3 MB in 0.27 sec (452.2 MB/sec)
   [smoker]     verify md5/sha1 digests
   [smoker]   download solr-5.1.0.zip...
   [smoker]     129.7 MB in 0.55 sec (234.5 MB/sec)
   [smoker]     verify md5/sha1 digests
   [smoker]   unpack solr-5.1.0.tgz...
   [smoker]     verify JAR metadata/identity/no javax.* or java.* classes...
   [smoker]     unpack lucene-5.1.0.tgz...
   [smoker]     **WARNING**: skipping check of /usr/home/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-5.x/lucene/build/smokeTestRelease/tmp/unpack/solr-5.1.0/contrib/dataimporthandler-extras/lib/javax.mail-1.5.1.jar: it has javax.* classes
   [smoker]     **WARNING**: skipping check of /usr/home/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-5.x/lucene/build/smokeTestRelease/tmp/unpack/solr-5.1.0/contrib/dataimporthandler-extras/lib/activation-1.1.1.jar: it has javax.* classes
   [smoker]     verify WAR metadata/contained JAR identity/no javax.* or java.* classes...
   [smoker]     unpack lucene-5.1.0.tgz...
   [smoker]     copying unpacked distribution for Java 7 ...
   [smoker]     test solr example w/ Java 7...
   [smoker]       start Solr instance (log=/usr/home/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-5.x/lucene/build/smokeTestRelease/tmp/unpack/solr-5.1.0-java7/solr-example.log)...
   [smoker] No process found for Solr node running on port 8983
   [smoker] starting Solr on port 8983 from /usr/home/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-5.x/lucene/build/smokeTestRelease/tmp/unpack/solr-5.1.0-java7
   [smoker] startup done
   [smoker]
   [smoker] Setup new core instance directory:
   [smoker] /usr/home/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-5.x/lucene/build/smokeTestRelease/tmp/unpack/solr-5.1.0-java7/server/solr/techproducts
   [smoker]
   [smoker] Creating new core 'techproducts' using command:
   [smoker]
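Much of the smoker run above is integrity checking: each downloaded artifact is re-hashed and compared against the published digests ("verify md5/sha1 digests"), and the release fails if they differ. A hedged sketch of that kind of digest comparison using only the JDK (the artifact bytes and variable names here are made up, not the smoker's actual code):

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;

public class DigestCheck {
    // hex-encode a digest, as in a .sha1 file alongside a release artifact
    static String hex(byte[] b) {
        StringBuilder sb = new StringBuilder();
        for (byte x : b) sb.append(String.format("%02x", x));
        return sb.toString();
    }

    public static void main(String[] args) throws Exception {
        byte[] artifact = "fake release bytes".getBytes(StandardCharsets.UTF_8);
        // the digest published next to the artifact at release time
        String published = hex(MessageDigest.getInstance("SHA-1").digest(artifact));
        // the digest recomputed over the downloaded bytes; a mismatch fails the check
        String recomputed = hex(MessageDigest.getInstance("SHA-1").digest(artifact));
        System.out.println("sha1 ok=" + published.equals(recomputed) + " len=" + published.length());
    }
}
```

SHA-1 always yields 20 bytes (40 hex characters), which is why the comparison can be a plain string equality over the hex form.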
[JENKINS] Lucene-Solr-5.x-Linux (32bit/jdk1.9.0-ea-b54) - Build # 11867 - Still Failing!
Build: http://jenkins.thetaphi.de/job/Lucene-Solr-5.x-Linux/11867/
Java: 32bit/jdk1.9.0-ea-b54 -client -XX:+UseParallelGC

1 tests failed.
FAILED:  org.apache.solr.search.function.TestFunctionQuery.testPseudoFieldFunctions

Error Message:
unexpected map keys [e] @ response/docs/[0]

Stack Trace:
java.lang.RuntimeException: unexpected map keys [e] @ response/docs/[0]
  at __randomizedtesting.SeedInfo.seed([766D9D0E7A176081:28577F7346819CCD]:0)
  at org.apache.solr.SolrTestCaseJ4.assertJQ(SolrTestCaseJ4.java:882)
  at org.apache.solr.SolrTestCaseJ4.assertJQ(SolrTestCaseJ4.java:829)
  at org.apache.solr.search.function.TestFunctionQuery.testPseudoFieldFunctions(TestFunctionQuery.java:739)
  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  at java.lang.reflect.Method.invoke(Method.java:502)
  at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1627)
  at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:836)
  at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:872)
  at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:886)
  at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:53)
  at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:50)
  at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
  at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:49)
  at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
  at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
  at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
  at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
  at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:798)
  at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:458)
  at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:845)
  at com.carrotsearch.randomizedtesting.RandomizedRunner$3.evaluate(RandomizedRunner.java:747)
  at com.carrotsearch.randomizedtesting.RandomizedRunner$4.evaluate(RandomizedRunner.java:781)
  at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:792)
  at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
  at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:53)
  at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
  at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
  at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:42)
  at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
  at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
  at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
  at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
  at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
  at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:54)
  at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
  at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
  at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:55)
  at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
  at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
  at java.lang.Thread.run(Thread.java:745)

Build Log:
[...truncated 10124 lines...]
   [junit4] Suite: org.apache.solr.search.function.TestFunctionQuery
   [junit4]   2> Creating dataDir:
[JENKINS] Lucene-Solr-5.x-Linux (32bit/jdk1.8.0_60-ea-b06) - Build # 11860 - Still Failing!
Build: http://jenkins.thetaphi.de/job/Lucene-Solr-5.x-Linux/11860/ Java: 32bit/jdk1.8.0_60-ea-b06 -server -XX:+UseG1GC 1 tests failed. FAILED: org.apache.solr.handler.TestSolrConfigHandlerCloud.test Error Message: Could not get expected value 'CY val modified' for path 'response/params/y/c' full output: { responseHeader:{ status:0, QTime:0}, response:{ znodeVersion:1, params:{ x:{ a:A val, b:B val, :{v:0}}, y:{ c:CY val, b:BY val, i:20, d:[ val 1, val 2], :{v:0} Stack Trace: java.lang.AssertionError: Could not get expected value 'CY val modified' for path 'response/params/y/c' full output: { responseHeader:{ status:0, QTime:0}, response:{ znodeVersion:1, params:{ x:{ a:A val, b:B val, :{v:0}}, y:{ c:CY val, b:BY val, i:20, d:[ val 1, val 2], :{v:0} at __randomizedtesting.SeedInfo.seed([136FA654EC50EDC8:9B3B998E42AC8030]:0) at org.junit.Assert.fail(Assert.java:93) at org.junit.Assert.assertTrue(Assert.java:43) at org.apache.solr.core.TestSolrConfigHandler.testForResponseElement(TestSolrConfigHandler.java:399) at org.apache.solr.handler.TestSolrConfigHandlerCloud.testReqParams(TestSolrConfigHandlerCloud.java:224) at org.apache.solr.handler.TestSolrConfigHandlerCloud.test(TestSolrConfigHandlerCloud.java:78) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:497) at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1627) at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:836) at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:872) at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:886) at 
org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsFixedStatement.callStatement(BaseDistributedSearchTestCase.java:958) at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsStatement.evaluate(BaseDistributedSearchTestCase.java:933) at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:53) at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:50) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46) at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:49) at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65) at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365) at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:798) at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:458) at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:845) at com.carrotsearch.randomizedtesting.RandomizedRunner$3.evaluate(RandomizedRunner.java:747) at com.carrotsearch.randomizedtesting.RandomizedRunner$4.evaluate(RandomizedRunner.java:781) at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:792) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:53) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46) at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:42) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
[jira] [Updated] (SOLR-7162) Remove unused SolrSortField interface
[ https://issues.apache.org/jira/browse/SOLR-7162?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Connor Warrington updated SOLR-7162: Attachment: SOLR-7162.patch Remove unused SolrSortField interface - Key: SOLR-7162 URL: https://issues.apache.org/jira/browse/SOLR-7162 Project: Solr Issue Type: Task Reporter: Shalin Shekhar Mangar Priority: Trivial Fix For: Trunk, 5.1 Attachments: SOLR-7162.patch SolrSortField is an unused interface. I can't find any uses in our project. It is also marked as lucene.experimental. Let's nuke it. -- This message was sent by Atlassian JIRA (v6.3.4#6332) - To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org For additional commands, e-mail: dev-h...@lucene.apache.org
Re: jhighlight-1.0 contains LGPL-only files
I suspect that the classes in question are in fact *not* used by Tika in any capacity, but they are in the jar nonetheless. So one solution would be to simply repackage the jar. I'd like to see what the Tika team says. Karl On Fri, Mar 20, 2015 at 7:47 AM, Shai Erera ser...@gmail.com wrote: although it might be easier to work with Tika to fix that and then upgrade again. If jhighlight was brought into the Solr distribution as a transitive dependency then you're right, but since we pull it in explicitly (even if for runtime purposes only), I think we should remove it, whether Tika corrects the problem or not. We can put a note in our NOTICE file for users to download the jar themselves until Tika fixes the problem. If people agree, I will remove it from our code. Shai On Fri, Mar 20, 2015 at 10:40 AM, Karl Wright daddy...@gmail.com wrote: I have created a ticket: TIKA-1581. ManifoldCF also has a Tika dependency, so thank you for noting the problem. Karl On Fri, Mar 20, 2015 at 4:03 AM, Upayavira u...@odoko.co.uk wrote: You are right - both projects need to remove it, although it might be easier to work with Tika to fix that and then upgrade again. Upayavira On Fri, Mar 20, 2015, at 05:26 AM, Shai Erera wrote: Sorry for the spam, just wanted to note that this dependency was added by Steve in SOLR-6130 to resolve an improper Tika 1.4-1.5 upgrade. The core issue lies with Tika IMO (they shouldn't rely on LGPL code too I believe), but I am not sure if it's OK that we distribute this .jar ourselves.
Shai On Fri, Mar 20, 2015 at 7:17 AM, Shai Erera ser...@gmail.com wrote: One update, I did find this dependency is explicitly set in solr/contrib/extraction/ivy.xml, under the Tika dependencies section: <!-- Tika dependencies - see http://tika.apache.org/1.3/gettingstarted.html#Using_Tika_as_a_Maven_dependency --> <!-- When upgrading Tika, upgrade dependencies versions and add any new ones (except slf4j-api, commons-codec, commons-logging, commons-httpclient, geronimo-stax-api_1.0_spec, jcip-annotations, xml-apis, asm) WARNING: Don't add netcdf / unidataCommon (partially LGPL code) --> ... <dependency org="com.uwyn" name="jhighlight" rev="${/com.uwyn/jhighlight}" conf="compile"/> So it does seem to be needed by Tika only, and I guess it's a runtime dependency, so if we don't want to release this LGPL library, we can omit it and put a section in the NOTICE file? Shai On Fri, Mar 20, 2015 at 7:11 AM, Shai Erera ser...@gmail.com wrote: Hi, Solr's contrib/extraction contains jhighlight-1.0.jar, which declares itself as dual CDDL or LGPL license. However, some of its classes are distributed only under LGPL, e.g. under com.uwyn.jhighlight.highlighter: CppHighlighter.java, GroovyHighlighter.java, JavaHighlighter.java, XmlHighlighter.java. I downloaded the sources from Maven ( http://search.maven.org/remotecontent?filepath=com/uwyn/jhighlight/1.0/jhighlight-1.0-sources.jar ) to confirm that, and also found this SVN repo: http://svn.rifers.org/jhighlight/tags/release-1.0, though the project's website seems to not exist anymore ( https://jhighlight.dev.java.net/ ). I didn't find any direct usage of it in our code, so I guess it's probably needed by a 3rd-party dependency, such as Tika. Therefore if we e.g. omit it, things will compile, but may fail at runtime. Is it OK that we distribute this .jar? Shai - To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org For additional commands, e-mail: dev-h...@lucene.apache.org
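If the dependency were removed as Shai suggests, the Ivy side of the change might look like the sketch below (hypothetical, not a patch from the thread; Option B uses Ivy's standard exclude element as insurance in case the module later comes back transitively through Tika):

```xml
<!-- Option A: drop the explicit dependency line from solr/contrib/extraction/ivy.xml -->
<!-- <dependency org="com.uwyn" name="jhighlight" rev="${/com.uwyn/jhighlight}" conf="compile"/> -->

<!-- Option B: additionally exclude the module inside <dependencies>, so a future
     Tika upgrade cannot silently pull it back in as a transitive dependency -->
<exclude org="com.uwyn" module="jhighlight"/>
```

Either way, extraction code paths that reach the jhighlight classes at runtime would then fail, so the proposed NOTICE-file note telling users how to obtain the jar themselves would still be needed.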
[jira] [Commented] (SOLR-7245) Temporary ZK election or connection loss should not stall indexing due to LIR
[ https://issues.apache.org/jira/browse/SOLR-7245?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14371337#comment-14371337 ] Mark Miller commented on SOLR-7245: --- The ChaosMonkey tests could hit such a case, but there is no real guarantee - those tests need love to make sure they hit everything we hope they hit. [~tim.potter], any chance you can take a gander at this? Temporary ZK election or connection loss should not stall indexing due to LIR - Key: SOLR-7245 URL: https://issues.apache.org/jira/browse/SOLR-7245 Project: Solr Issue Type: Improvement Components: SolrCloud Reporter: Ramkumar Aiyengar Assignee: Ramkumar Aiyengar Priority: Minor Attachments: SOLR-7245.patch, SOLR-7245.patch If there's a ZK election or connection loss, and the leader is unable to reach a replica, it currently would stall till the ZK connection is established, due to the LIR process. This shouldn't happen, and in some way regresses the work done in SOLR-5577. I will try get to this, but if someone races me to it, feel free to.. -- This message was sent by Atlassian JIRA (v6.3.4#6332) - To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org For additional commands, e-mail: dev-h...@lucene.apache.org
[JENKINS] Lucene-Solr-Tests-5.x-Java7 - Build # 2804 - Still Failing
Build: https://builds.apache.org/job/Lucene-Solr-Tests-5.x-Java7/2804/ 3 tests failed. FAILED: org.apache.solr.cloud.LeaderInitiatedRecoveryOnCommitTest.test Error Message: IOException occured when talking to server at: http://127.0.0.1:48376/kg_zqy/dy/c8n_1x3_commits_shard1_replica1 Stack Trace: org.apache.solr.client.solrj.SolrServerException: IOException occured when talking to server at: http://127.0.0.1:48376/kg_zqy/dy/c8n_1x3_commits_shard1_replica1 at __randomizedtesting.SeedInfo.seed([4A0FAE0585022144:C25B91DF2BFE4CBC]:0) at org.apache.solr.client.solrj.impl.HttpSolrClient.executeMethod(HttpSolrClient.java:598) at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:236) at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:228) at org.apache.solr.client.solrj.SolrRequest.process(SolrRequest.java:135) at org.apache.solr.client.solrj.SolrClient.commit(SolrClient.java:483) at org.apache.solr.client.solrj.SolrClient.commit(SolrClient.java:464) at org.apache.solr.cloud.LeaderInitiatedRecoveryOnCommitTest.oneShardTest(LeaderInitiatedRecoveryOnCommitTest.java:130) at org.apache.solr.cloud.LeaderInitiatedRecoveryOnCommitTest.test(LeaderInitiatedRecoveryOnCommitTest.java:62) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:606) at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1627) at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:836) at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:872) at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:886) at 
org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsFixedStatement.callStatement(BaseDistributedSearchTestCase.java:958) at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsStatement.evaluate(BaseDistributedSearchTestCase.java:933) at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:53) at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:50) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46) at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:49) at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65) at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365) at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:798) at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:458) at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:845) at com.carrotsearch.randomizedtesting.RandomizedRunner$3.evaluate(RandomizedRunner.java:747) at com.carrotsearch.randomizedtesting.RandomizedRunner$4.evaluate(RandomizedRunner.java:781) at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:792) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:53) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46) at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:42) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at
[jira] [Updated] (SOLR-7276) Add a Boolean Post Filter QParserPlugin
[ https://issues.apache.org/jira/browse/SOLR-7276?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Ted Sullivan updated SOLR-7276: --- Description: This plugin enables existing post filter implementations to be combined in using Boolean logic. It works by building a parse tree of referenced Post Filters. When a document is sent to the collect( ) method of the BooleanPostFilter, it is sent to all of the delegates that point to a local Collector that sets a flag if the DelegatingCollector calls its collect method. After all of the delegates have been polled, the parse tree output determines if the document should be ultimately collected. The syntax for the post filter is like this: fq={!bool expr=(($foo OR $bar) NOT $baz)}&foo={!foo ...}&bar={!bar ... }&baz={!baz ...} Where foo, bar and baz are all post filters. was: This plugin enables existing post filter implementations to be combined in using Boolean logic. It works by building a parse tree of referenced Post Filters. When a document is sent to the collect( ) method of the BooleanPostFilter, it is sent to all of the delegates that point to a local Collector that sets a flag if the DelegatingCollector calls its collect method. After all of the delegates have been polled, the parse tree output determines if the document should be ultimately collected. The syntax for the post filter is like this: <pre> fq={!bool expr=(($foo OR $bar) NOT $baz)}&foo={!foo ...}&bar={!bar ... }&baz={!baz ...} </pre> Where foo, bar and baz are all post filters. Add a Boolean Post Filter QParserPlugin --- Key: SOLR-7276 URL: https://issues.apache.org/jira/browse/SOLR-7276 Project: Solr Issue Type: New Feature Reporter: Ted Sullivan This plugin enables existing post filter implementations to be combined in using Boolean logic. It works by building a parse tree of referenced Post Filters.
When a document is sent to the collect( ) method of the BooleanPostFilter, it is sent to all of the delegates that point to a local Collector that sets a flag if the DelegatingCollector calls its collect method. After all of the delegates have been polled, the parse tree output determines if the document should be ultimately collected. The syntax for the post filter is like this: fq={!bool expr=(($foo OR $bar) NOT $baz)}&foo={!foo ...}&bar={!bar ... }&baz={!baz ...} Where foo, bar and baz are all post filters. -- This message was sent by Atlassian JIRA (v6.3.4#6332) - To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org For additional commands, e-mail: dev-h...@lucene.apache.org
[jira] [Updated] (SOLR-7276) Add a Boolean Post Filter QParserPlugin
[ https://issues.apache.org/jira/browse/SOLR-7276?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Ted Sullivan updated SOLR-7276: --- Description: This plugin enables existing post filter implementations to be combined in using Boolean logic. It works by building a parse tree of referenced Post Filters. When a document is sent to the collect( ) method of the BooleanPostFilter, it is sent to all of the delegates that point to a local Collector that sets a flag if the DelegatingCollector calls its collect method. After all of the delegates have been polled, the parse tree output determines if the document should be ultimately collected. The syntax for the post filter is like this: {noformat} fq={!bool expr=(($foo OR $bar) NOT $baz)}foo={!foo ...}bar={!bar ... }baz={!baz ...} {noformat} Where foo, bar and baz are all post filters. was: This plugin enables existing post filter implementations to be combined in using Boolean logic. It works by building a parse tree of referenced Post Filters. When a document is sent to the collect( ) method of the BooleanPostFilter, it is sent to all of the delegates that point to a local Collector that sets a flag if the DelegatingCollector calls its collect method. After all of the delegates have been polled, the parse tree output determines if the document should be ultimately collected. The syntax for the post filter is like this: {noformat} fq={!bool expr=(($foo OR $bar) NOT $baz)}&foo={!foo ...}&bar={!bar ... }&baz={!baz ...} {noformat} Where foo, bar and baz are all post filters. Add a Boolean Post Filter QParserPlugin --- Key: SOLR-7276 URL: https://issues.apache.org/jira/browse/SOLR-7276 Project: Solr Issue Type: New Feature Reporter: Ted Sullivan This plugin enables existing post filter implementations to be combined in using Boolean logic. It works by building a parse tree of referenced Post Filters.
When a document is sent to the collect( ) method of the BooleanPostFilter, it is sent to all of the delegates that point to a local Collector that sets a flag if the DelegatingCollector calls its collect method. After all of the delegates have been polled, the parse tree output determines if the document should be ultimately collected. The syntax for the post filter is like this: {noformat} fq={!bool expr=(($foo OR $bar) NOT $baz)}foo={!foo ...}bar={!bar ... }baz={!baz ...} {noformat} Where foo, bar and baz are all post filters. -- This message was sent by Atlassian JIRA (v6.3.4#6332) - To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org For additional commands, e-mail: dev-h...@lucene.apache.org
[jira] [Updated] (SOLR-7276) Add a Boolean Post Filter QParserPlugin
[ https://issues.apache.org/jira/browse/SOLR-7276?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Ted Sullivan updated SOLR-7276: --- Description: This plugin enables existing post filter implementations to be combined using Boolean logic. It works by building a parse tree of referenced Post Filters. When a document is sent to the collect( ) method of the BooleanPostFilter, it is sent to all of the delegates that point to a local Collector that sets a flag if the DelegatingCollector calls its collect method. After all of the delegates have been polled, the parse tree output determines if the document should be ultimately collected. The syntax for the post filter is like this: {noformat} fq={!bool expr=(($foo OR $bar) NOT $baz)}foo={!foo ...}bar={!bar ... }baz={!baz ...} {noformat} Where foo, bar and baz are all post filters. was: This plugin enables existing post filter implementations to be combined in using Boolean logic. It works by building a parse tree of referenced Post Filters. When a document is sent to the collect( ) method of the BooleanPostFilter, it is sent to all of the delegates that point to a local Collector that sets a flag if the DelegatingCollector calls its collect method. After all of the delegates have been polled, the parse tree output determines if the document should be ultimately collected. The syntax for the post filter is like this: {noformat} fq={!bool expr=(($foo OR $bar) NOT $baz)}foo={!foo ...}bar={!bar ... }baz={!baz ...} {noformat} Where foo, bar and baz are all post filters. Add a Boolean Post Filter QParserPlugin --- Key: SOLR-7276 URL: https://issues.apache.org/jira/browse/SOLR-7276 Project: Solr Issue Type: New Feature Reporter: Ted Sullivan Attachments: SOLR-7276.patch This plugin enables existing post filter implementations to be combined using Boolean logic. It works by building a parse tree of referenced Post Filters. 
When a document is sent to the collect( ) method of the BooleanPostFilter, it is sent to all of the delegates that point to a local Collector that sets a flag if the DelegatingCollector calls its collect method. After all of the delegates have been polled, the parse tree output determines if the document should be ultimately collected. The syntax for the post filter is like this: {noformat} fq={!bool expr=(($foo OR $bar) NOT $baz)}foo={!foo ...}bar={!bar ... }baz={!baz ...} {noformat} Where foo, bar and baz are all post filters. -- This message was sent by Atlassian JIRA (v6.3.4#6332) - To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org For additional commands, e-mail: dev-h...@lucene.apache.org
[jira] [Updated] (SOLR-7276) Add a Boolean Post Filter QParserPlugin
[ https://issues.apache.org/jira/browse/SOLR-7276?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Ted Sullivan updated SOLR-7276: --- Description: This plugin enables existing post filter implementations to be combined in using Boolean logic. It works by building a parse tree of referenced Post Filters. When a document is sent to the collect( ) method of the BooleanPostFilter, it is sent to all of the delegates that point to a local Collector that sets a flag if the DelegatingCollector calls its collect method. After all of the delegates have been polled, the parse tree output determines if the document should be ultimately collected. The syntax for the post filter is like this: fq={!bool expr=(($foo OR $bar) NOT $baz)}&foo={!foo ...}&bar={!bar ... }&baz={!baz ...} Where foo, bar and baz are all post filters. was: This plugin enables existing post filter implementations to be combined in using Boolean logic. It works by building a parse tree of referenced Post Filters. When a document is sent to the collect( ) method of the BooleanPostFilter, it is sent to all of the delegates that point to a local Collector that sets a flag if the DelegatingCollector calls its collect method. After all of the delegates have been polled, the parse tree output determines if the document should be ultimately collected. The syntax for the post filter is like this: fq={!bool expr=(($foo OR $bar) NOT $baz)}<foo={!foo ...}<bar={!bar ... }<baz={!baz ...} Where foo, bar and baz are all post filters. Add a Boolean Post Filter QParserPlugin --- Key: SOLR-7276 URL: https://issues.apache.org/jira/browse/SOLR-7276 Project: Solr Issue Type: New Feature Reporter: Ted Sullivan This plugin enables existing post filter implementations to be combined in using Boolean logic. It works by building a parse tree of referenced Post Filters.
When a document is sent to the collect( ) method of the BooleanPostFilter, it is sent to all of the delegates that point to a local Collector that sets a flag if the DelegatingCollector calls its collect method. After all of the delegates have been polled, the parse tree output determines if the document should be ultimately collected. The syntax for the post filter is like this: fq={!bool expr=(($foo OR $bar) NOT $baz)}&foo={!foo ...}&bar={!bar ... }&baz={!baz ...} Where foo, bar and baz are all post filters. -- This message was sent by Atlassian JIRA (v6.3.4#6332) - To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org For additional commands, e-mail: dev-h...@lucene.apache.org
[jira] [Updated] (SOLR-7276) Add a Boolean Post Filter QParserPlugin
[ https://issues.apache.org/jira/browse/SOLR-7276?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Ted Sullivan updated SOLR-7276: --- Description: This plugin enables existing post filter implementations to be combined in using Boolean logic. It works by building a parse tree of referenced Post Filters. When a document is sent to the collect( ) method of the BooleanPostFilter, it is sent to all of the delegates that point to a local Collector that sets a flag if the DelegatingCollector calls its collect method. After all of the delegates have been polled, the parse tree output determines if the document should be ultimately collected. The syntax for the post filter is like this: fq={!bool expr=(($foo OR $bar) NOT $baz)}<foo={!foo ...}<bar={!bar ... }<baz={!baz ...} Where foo, bar and baz are all post filters. was: This plugin enables existing post filter implementations to be combined in using Boolean logic. It works by building a parse tree of referenced Post Filters. When a document is sent to the collect( ) method of the BooleanPostFilter, it is sent to all of the delegates that point to a local Collector that sets a flag if the DelegatingCollector calls its collect method. After all of the delegates have been polled, the parse tree output determines if the document should be ultimately collected. The syntax for the post filter is like this: fq={!bool expr=(($foo OR $bar) NOT $baz)}foo={!foo ...}bar={!bar ... }baz={!baz ...} Where foo, bar and baz are all post filters. Add a Boolean Post Filter QParserPlugin --- Key: SOLR-7276 URL: https://issues.apache.org/jira/browse/SOLR-7276 Project: Solr Issue Type: New Feature Reporter: Ted Sullivan This plugin enables existing post filter implementations to be combined in using Boolean logic. It works by building a parse tree of referenced Post Filters.
When a document is sent to the collect( ) method of the BooleanPostFilter, it is sent to all of the delegates that point to a local Collector that sets a flag if the DelegatingCollector calls its collect method. After all of the delegates have been polled, the parse tree output determines if the document should be ultimately collected. The syntax for the post filter is like this: fq={!bool expr=(($foo OR $bar) NOT $baz)}<foo={!foo ...}<bar={!bar ... }<baz={!baz ...} Where foo, bar and baz are all post filters. -- This message was sent by Atlassian JIRA (v6.3.4#6332) - To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org For additional commands, e-mail: dev-h...@lucene.apache.org
[jira] [Created] (SOLR-7276) Add a Boolean Post Filter QParserPlugin
Ted Sullivan created SOLR-7276: -- Summary: Add a Boolean Post Filter QParserPlugin Key: SOLR-7276 URL: https://issues.apache.org/jira/browse/SOLR-7276 Project: Solr Issue Type: New Feature Reporter: Ted Sullivan This plugin enables existing post filter implementations to be combined in using Boolean logic. It works by building a parse tree of referenced Post Filters. When a document is sent to the collect( ) method of the BooleanPostFilter, it is sent to all of the delegates that point to a local Collector that sets a flag if the DelegatingCollector calls its collect method. After all of the delegates have been polled, the parse tree output determines if the document should be ultimately collected. The syntax for the post filter is like this: fq={!bool expr=(($foo OR $bar) NOT $baz)}foo={!foo ...}bar={!bar ... }baz={!baz ...} Where foo, bar and baz are all post filters. -- This message was sent by Atlassian JIRA (v6.3.4#6332) - To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org For additional commands, e-mail: dev-h...@lucene.apache.org
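The collect()/flag mechanism described in the issue can be sketched in plain Java. This is a standalone toy, not Solr code: the Marker class and the three sample predicates are hypothetical stand-ins for real post filters and their DelegatingCollector chains. It shows how per-document flags feed the parse tree for (($foo OR $bar) NOT $baz):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Predicate;

// Each referenced post filter gets a "marker" collector that records whether
// the filter's delegate chain passed the current document through.
class Marker {
    boolean matched = false;
    void collect() { matched = true; }  // called when the chain reaches it
    void reset()   { matched = false; } // cleared before each document
}

public class BooleanPostFilterSketch {
    public static void main(String[] args) {
        Marker foo = new Marker(), bar = new Marker(), baz = new Marker();
        // Hypothetical filter predicates standing in for real post filters.
        Predicate<Integer> fooF = d -> d % 2 == 0;  // "foo" matches even doc ids
        Predicate<Integer> barF = d -> d > 5;       // "bar" matches ids > 5
        Predicate<Integer> bazF = d -> d == 8;      // "baz" matches id 8

        List<Integer> collected = new ArrayList<>();
        for (int doc = 0; doc < 10; doc++) {
            foo.reset(); bar.reset(); baz.reset();
            // "Poll" each delegate: a match sets that filter's flag.
            if (fooF.test(doc)) foo.collect();
            if (barF.test(doc)) bar.collect();
            if (bazF.test(doc)) baz.collect();
            // Evaluate the parse tree for (($foo OR $bar) NOT $baz).
            boolean keep = (foo.matched || bar.matched) && !baz.matched;
            if (keep) collected.add(doc);
        }
        System.out.println(collected);  // prints [0, 2, 4, 6, 7, 9]
    }
}
```

Over doc ids 0..9 this keeps every document matched by foo or bar, minus doc 8, which baz vetoes; the real plugin would do the same flag evaluation per document inside its DelegatingCollector.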
[jira] [Commented] (SOLR-7214) JSON Facet API
[ https://issues.apache.org/jira/browse/SOLR-7214?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14371443#comment-14371443 ] Grant Ingersoll commented on SOLR-7214: --- I should add, however, that I think the hanging-off approach brings some interesting things to the table in terms of slicing and dicing things, but I admittedly haven't looked deeply at this new stuff. My main concern here isn't the implementation or any one approach, it's that we now have 2 approaches. That's not going to make for a good user experience. I would prefer we resolve the user experience before we commit and release this. JSON Facet API -- Key: SOLR-7214 URL: https://issues.apache.org/jira/browse/SOLR-7214 Project: Solr Issue Type: New Feature Reporter: Yonik Seeley Attachments: SOLR-7214.patch Overview is here: http://yonik.com/json-facet-api/ The structured nature of nested sub-facets is more naturally expressed in a nested structure like JSON rather than the flat structure that normal query parameters provide. Goals: - First class JSON support - Easier programmatic construction of complex nested facet commands - Support a much more canonical response format that is easier for clients to parse - First class analytics support - Support a cleaner way to do distributed faceting - Support better integration with other search features -- This message was sent by Atlassian JIRA (v6.3.4#6332) - To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org For additional commands, e-mail: dev-h...@lucene.apache.org
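The nesting point in the issue description can be made concrete with a request in the style of the JSON Facet API overview linked above (a sketch only; the cat and price field names are invented examples, not from the thread):

```json
{
  "query": "*:*",
  "facet": {
    "categories": {
      "type": "terms",
      "field": "cat",
      "facet": {
        "avg_price": "avg(price)"
      }
    }
  }
}
```

The aggregation (avg_price) nests directly under the terms facet that scopes it, which is exactly the kind of structure that is awkward to express through flat facet.* query parameters.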
[jira] [Commented] (LUCENE-5879) Add auto-prefix terms to block tree terms dict
[ https://issues.apache.org/jira/browse/LUCENE-5879?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14371401#comment-14371401 ] Michael McCandless commented on LUCENE-5879: bq. I don't understand why these need to be tied to FixedBitSet I'll cut over to BitSet. bq. Alternatively, code could stay where it is I'll leave it as is (dark addition to BlockTree). bq. in codecs/ we could have AutoPrefixPF that exposes it and make it experimental or something? Good idea! I'll do this... this way Lucene50PF is unchanged. bq. I can't parse this. if you use a scoring rewrite, it still works right? It does work (you get the right hits) ... TestPrefixQuery/TestTermRangeQuery randomly use SCORING_BOOLEAN_REWRITE and CONSTANT_SCORE_BOOLEAN_REWRITE. bq. It's just that the generated term queries will contain pseudo-terms, but their statistics etc. are all correct? Right: they will use the auto-prefix terms, which have correct stats (i.e. docFreq is the number of docs containing any term with this prefix). Is this too weird? It means you get different scores than you get today... We could maybe turn off auto-prefix if you use these rewrite methods? But this would need an API change to Terms, e.g. a new boolean allowAutoPrefix on Terms.intersect. bq. I definitely understand the RANGE case, it's difficult to make the equivalent automaton. It's not so bad; I added Operations.makeBinaryInterval in the patch for this. It's like the decimal ranges that Automata.makeInterval already does. bq. Why not just make PrefixQuery subclass AutomatonQuery? I explored this, but it turns out to be tricky for those PFs that don't have auto-prefix terms (i.e., don't use block tree)... With the patch as it is now, PFs like SimpleText will use a PrefixTermsEnum for PrefixQuery, but if I fix PrefixQuery to subclass AutomatonQuery (and remove AUTOMATON_TYPE.PREFIX) then SimpleText would use AutomatonTermsEnum (on a prefix automaton), which I think will be somewhat less efficient? 
Maybe it's not so bad in practice? ATE would realize it's in a linear part of the automaton... Maybe we can somehow simplify things here ... I agree both PrefixQuery and TermRangeQuery should ideally just subclass AutomatonQuery. Add auto-prefix terms to block tree terms dict -- Key: LUCENE-5879 URL: https://issues.apache.org/jira/browse/LUCENE-5879 Project: Lucene - Core Issue Type: New Feature Components: core/codecs Reporter: Michael McCandless Assignee: Michael McCandless Fix For: 5.0, Trunk Attachments: LUCENE-5879.patch, LUCENE-5879.patch, LUCENE-5879.patch, LUCENE-5879.patch, LUCENE-5879.patch, LUCENE-5879.patch, LUCENE-5879.patch, LUCENE-5879.patch, LUCENE-5879.patch, LUCENE-5879.patch, LUCENE-5879.patch This cool idea to generalize numeric/trie fields came from Adrien: Today, when we index a numeric field (LongField, etc.) we pre-compute (via NumericTokenStream), outside of the indexer/codec, which prefix terms should be indexed. But this can be inefficient: you set a static precisionStep and always add those prefix terms regardless of how the terms in the field are actually distributed. Yet typically in real-world applications the terms have a non-random distribution. So it should be better if instead the terms dict decides where it makes sense to insert prefix terms, based on how dense the terms are in each region of term space. This way we can speed up query time for both term (e.g. infix suggester) and numeric ranges, and it should let us use less index space and get faster range queries. This would also mean that min/maxTerm for a numeric field would now be correct, vs today where the externally computed prefix terms are placed after the full-precision terms, causing hairy code like NumericUtils.getMaxInt/Long. So optimizations like LUCENE-5860 become feasible. The terms dict can also do tricks not possible if you must live on top of its APIs, e.g. 
to handle the adversary/over-constrained case when a given prefix has too many terms following it but finer prefixes have too few (what block tree calls floor term blocks).
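To make the binary-interval idea concrete, here is a small, self-contained sketch of covering an integer range with binary prefixes. This is not Lucene's Operations.makeBinaryInterval, just an illustration of the same principle under assumed naming: a range query over k-bit terms can be answered by matching a handful of prefixes instead of every individual term.

```java
// Hedged sketch: cover the integer range [lo, hi] in a k-bit term space
// with binary prefixes ("01*" matches every 4-bit term starting 01).
// Greedy: at each step take the largest power-of-two block aligned at lo
// that still fits inside the range.
import java.util.ArrayList;
import java.util.List;

public class PrefixCover {
    public static List<String> cover(int lo, int hi, int k) {
        List<String> out = new ArrayList<>();
        long l = lo;
        while (l <= hi) {
            // Largest aligned block starting at l (whole space if l == 0):
            long size = (l == 0) ? (1L << k) : Long.lowestOneBit(l);
            while (l + size - 1 > hi) size >>= 1;  // shrink until it fits
            int prefixBits = k - Long.numberOfTrailingZeros(size);
            // k-bit binary string of l, via a leading sentinel bit:
            String bits = Long.toBinaryString((1L << k) | l).substring(1);
            out.add(bits.substring(0, prefixBits) + "*");
            l += size;
        }
        return out;
    }

    public static void main(String[] args) {
        // Range 3..10 in a 4-bit space collapses to four prefixes.
        System.out.println(cover(3, 10, 4));
    }
}
```

For 3..10 over 4 bits this yields 0011*, 01*, 100*, 1010*: the exact term 3, the block 4-7, the block 8-9, and the exact term 10. Auto-prefix terms let the terms dict choose where such prefix terms pay off based on actual term density, rather than a fixed precisionStep.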
[jira] [Commented] (SOLR-7214) JSON Facet API
[ https://issues.apache.org/jira/browse/SOLR-7214?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14371421#comment-14371421 ] Grant Ingersoll commented on SOLR-7214: --- Yeah, I'm not a big fan of local params, so I'm all for a new approach to the API. We should work to consolidate and deprecate, while leveraging what we can under the hood. JSON Facet API -- Key: SOLR-7214 URL: https://issues.apache.org/jira/browse/SOLR-7214 Project: Solr Issue Type: New Feature Reporter: Yonik Seeley Attachments: SOLR-7214.patch Overview is here: http://yonik.com/json-facet-api/ The structured nature of nested sub-facets is more naturally expressed in a nested structure like JSON than in the flat structure that normal query parameters provide. Goals: - First class JSON support - Easier programmatic construction of complex nested facet commands - Support a much more canonical response format that is easier for clients to parse - First class analytics support - Support a cleaner way to do distributed faceting - Support better integration with other search features
[jira] [Updated] (SOLR-7276) Add a Boolean Post Filter QParserPlugin
[ https://issues.apache.org/jira/browse/SOLR-7276?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Ted Sullivan updated SOLR-7276: --- Description: This plugin enables existing post filter implementations to be combined using Boolean logic. It works by building a parse tree of referenced Post Filters. When a document is sent to the collect() method of the BooleanPostFilter, it is sent to all of the delegates, each of which points to a local Collector that sets a flag if the DelegatingCollector calls its collect method. After all of the delegates have been polled, the parse tree output determines whether the document should ultimately be collected. The syntax for the post filter is like this: <pre> fq={!bool expr=(($foo OR $bar) NOT $baz)}&foo={!foo ...}&bar={!bar ...}&baz={!baz ...} </pre> Where foo, bar and baz are all post filters. was: This plugin enables existing post filter implementations to be combined using Boolean logic. It works by building a parse tree of referenced Post Filters. When a document is sent to the collect() method of the BooleanPostFilter, it is sent to all of the delegates, each of which points to a local Collector that sets a flag if the DelegatingCollector calls its collect method. After all of the delegates have been polled, the parse tree output determines whether the document should ultimately be collected. The syntax for the post filter is like this: fq={!bool expr=(($foo OR $bar) NOT $baz)}&foo={!foo ...}&bar={!bar ...}&baz={!baz ...} Where foo, bar and baz are all post filters. Add a Boolean Post Filter QParserPlugin --- Key: SOLR-7276 URL: https://issues.apache.org/jira/browse/SOLR-7276 Project: Solr Issue Type: New Feature Reporter: Ted Sullivan This plugin enables existing post filter implementations to be combined using Boolean logic. It works by building a parse tree of referenced Post Filters. 
When a document is sent to the collect() method of the BooleanPostFilter, it is sent to all of the delegates, each of which points to a local Collector that sets a flag if the DelegatingCollector calls its collect method. After all of the delegates have been polled, the parse tree output determines whether the document should ultimately be collected. The syntax for the post filter is like this: <pre> fq={!bool expr=(($foo OR $bar) NOT $baz)}&foo={!foo ...}&bar={!bar ...}&baz={!baz ...} </pre> Where foo, bar and baz are all post filters.
[jira] [Updated] (SOLR-7276) Add a Boolean Post Filter QParserPlugin
[ https://issues.apache.org/jira/browse/SOLR-7276?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Ted Sullivan updated SOLR-7276: --- Description: This plugin enables existing post filter implementations to be combined using Boolean logic. It works by building a parse tree of referenced Post Filters. When a document is sent to the collect() method of the BooleanPostFilter, it is sent to all of the delegates, each of which points to a local Collector that sets a flag if the DelegatingCollector calls its collect method. After all of the delegates have been polled, the parse tree output determines whether the document should ultimately be collected. The syntax for the post filter is like this: {noformat} fq={!bool expr=(($foo OR $bar) NOT $baz)}&foo={!foo ...}&bar={!bar ...}&baz={!baz ...} {noformat} Where foo, bar and baz are all post filters. was: This plugin enables existing post filter implementations to be combined using Boolean logic. It works by building a parse tree of referenced Post Filters. When a document is sent to the collect() method of the BooleanPostFilter, it is sent to all of the delegates, each of which points to a local Collector that sets a flag if the DelegatingCollector calls its collect method. After all of the delegates have been polled, the parse tree output determines whether the document should ultimately be collected. The syntax for the post filter is like this: fq={!bool expr=(($foo OR $bar) NOT $baz)}&foo={!foo ...}&bar={!bar ...}&baz={!baz ...} Where foo, bar and baz are all post filters. Add a Boolean Post Filter QParserPlugin --- Key: SOLR-7276 URL: https://issues.apache.org/jira/browse/SOLR-7276 Project: Solr Issue Type: New Feature Reporter: Ted Sullivan This plugin enables existing post filter implementations to be combined using Boolean logic. It works by building a parse tree of referenced Post Filters. 
When a document is sent to the collect() method of the BooleanPostFilter, it is sent to all of the delegates, each of which points to a local Collector that sets a flag if the DelegatingCollector calls its collect method. After all of the delegates have been polled, the parse tree output determines whether the document should ultimately be collected. The syntax for the post filter is like this: {noformat} fq={!bool expr=(($foo OR $bar) NOT $baz)}&foo={!foo ...}&bar={!bar ...}&baz={!baz ...} {noformat} Where foo, bar and baz are all post filters.
[JENKINS] Lucene-Solr-5.x-Linux (64bit/jdk1.9.0-ea-b54) - Build # 11869 - Still Failing!
Build: http://jenkins.thetaphi.de/job/Lucene-Solr-5.x-Linux/11869/ Java: 64bit/jdk1.9.0-ea-b54 -XX:+UseCompressedOops -XX:+UseSerialGC 1 tests failed. FAILED: org.apache.solr.search.function.TestFunctionQuery.testPseudoFieldFunctions Error Message: unexpected map keys [e] @ response/docs/[0] Stack Trace: java.lang.RuntimeException: unexpected map keys [e] @ response/docs/[0] at __randomizedtesting.SeedInfo.seed([54F40F8ECB60566C:ACEEDF3F7F6AA20]:0) at org.apache.solr.SolrTestCaseJ4.assertJQ(SolrTestCaseJ4.java:882) at org.apache.solr.SolrTestCaseJ4.assertJQ(SolrTestCaseJ4.java:829) at org.apache.solr.search.function.TestFunctionQuery.testPseudoFieldFunctions(TestFunctionQuery.java:739) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:502) at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1627) at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:836) at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:872) at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:886) at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:53) at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:50) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46) at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:49) at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65) at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48) at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365) at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:798) at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:458) at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:845) at com.carrotsearch.randomizedtesting.RandomizedRunner$3.evaluate(RandomizedRunner.java:747) at com.carrotsearch.randomizedtesting.RandomizedRunner$4.evaluate(RandomizedRunner.java:781) at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:792) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:53) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:42) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:54) at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48) at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65) at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:55) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365) at java.lang.Thread.run(Thread.java:745) Build Log: [...truncated 9685 lines...] [junit4] Suite: org.apache.solr.search.function.TestFunctionQuery [junit4] 2 Creating dataDir:
[JENKINS] Lucene-Solr-5.x-Linux (32bit/jdk1.9.0-ea-b54) - Build # 11868 - Still Failing!
Build: http://jenkins.thetaphi.de/job/Lucene-Solr-5.x-Linux/11868/ Java: 32bit/jdk1.9.0-ea-b54 -server -XX:+UseG1GC 1 tests failed. FAILED: org.apache.solr.search.function.TestFunctionQuery.testPseudoFieldFunctions Error Message: unexpected map keys [e] @ response/docs/[0] Stack Trace: java.lang.RuntimeException: unexpected map keys [e] @ response/docs/[0] at __randomizedtesting.SeedInfo.seed([CF309057B9CF373D:910A722A8559CB71]:0) at org.apache.solr.SolrTestCaseJ4.assertJQ(SolrTestCaseJ4.java:882) at org.apache.solr.SolrTestCaseJ4.assertJQ(SolrTestCaseJ4.java:829) at org.apache.solr.search.function.TestFunctionQuery.testPseudoFieldFunctions(TestFunctionQuery.java:739) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:502) at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1627) at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:836) at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:872) at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:886) at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:53) at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:50) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46) at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:49) at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65) at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48) at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365) at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:798) at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:458) at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:845) at com.carrotsearch.randomizedtesting.RandomizedRunner$3.evaluate(RandomizedRunner.java:747) at com.carrotsearch.randomizedtesting.RandomizedRunner$4.evaluate(RandomizedRunner.java:781) at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:792) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:53) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:42) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:54) at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48) at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65) at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:55) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365) at java.lang.Thread.run(Thread.java:745) Build Log: [...truncated 10099 lines...] [junit4] Suite: org.apache.solr.search.function.TestFunctionQuery [junit4] 2 Creating dataDir:
[JENKINS] Lucene-Solr-trunk-Linux (64bit/jdk1.9.0-ea-b54) - Build # 12033 - Still Failing!
Build: http://jenkins.thetaphi.de/job/Lucene-Solr-trunk-Linux/12033/ Java: 64bit/jdk1.9.0-ea-b54 -XX:-UseCompressedOops -XX:+UseConcMarkSweepGC 1 tests failed. FAILED: org.apache.solr.search.function.TestFunctionQuery.testPseudoFieldFunctions Error Message: unexpected map keys [e] @ response/docs/[0] Stack Trace: java.lang.RuntimeException: unexpected map keys [e] @ response/docs/[0] at __randomizedtesting.SeedInfo.seed([178897549DDEEF09:49B27529A1481345]:0) at org.apache.solr.SolrTestCaseJ4.assertJQ(SolrTestCaseJ4.java:882) at org.apache.solr.SolrTestCaseJ4.assertJQ(SolrTestCaseJ4.java:829) at org.apache.solr.search.function.TestFunctionQuery.testPseudoFieldFunctions(TestFunctionQuery.java:739) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:502) at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1627) at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:836) at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:872) at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:886) at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:53) at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:50) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46) at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:49) at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65) at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48) at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365) at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:798) at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:458) at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:845) at com.carrotsearch.randomizedtesting.RandomizedRunner$3.evaluate(RandomizedRunner.java:747) at com.carrotsearch.randomizedtesting.RandomizedRunner$4.evaluate(RandomizedRunner.java:781) at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:792) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:53) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:42) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:54) at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48) at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65) at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:55) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365) at java.lang.Thread.run(Thread.java:745) Build Log: [...truncated 9390 lines...] [junit4] Suite: org.apache.solr.search.function.TestFunctionQuery [junit4] 2 Creating dataDir:
[JENKINS] Lucene-Solr-trunk-Linux (64bit/jdk1.8.0_40) - Build # 12032 - Still Failing!
Build: http://jenkins.thetaphi.de/job/Lucene-Solr-trunk-Linux/12032/ Java: 64bit/jdk1.8.0_40 -XX:-UseCompressedOops -XX:+UseParallelGC 1 tests failed. FAILED: org.apache.lucene.analysis.uima.UIMABaseAnalyzerTest.testRandomStrings Error Message: some thread(s) failed Stack Trace: java.lang.RuntimeException: some thread(s) failed at org.apache.lucene.analysis.BaseTokenStreamTestCase.checkRandomData(BaseTokenStreamTestCase.java:531) at org.apache.lucene.analysis.BaseTokenStreamTestCase.checkRandomData(BaseTokenStreamTestCase.java:428) at org.apache.lucene.analysis.uima.UIMABaseAnalyzerTest.testRandomStrings(UIMABaseAnalyzerTest.java:125) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:497) at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1627) at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:836) at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:872) at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:886) at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:50) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46) at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:49) at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65) at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365) at 
com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:798) at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:458) at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:845) at com.carrotsearch.randomizedtesting.RandomizedRunner$3.evaluate(RandomizedRunner.java:747) at com.carrotsearch.randomizedtesting.RandomizedRunner$4.evaluate(RandomizedRunner.java:781) at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:792) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:42) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:54) at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48) at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65) at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:55) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365) at java.lang.Thread.run(Thread.java:745) Build Log: [...truncated 3891 lines...] [junit4] Suite: org.apache.lucene.analysis.uima.UIMABaseAnalyzerTest [junit4] 2 Mar 21, 2015 5:17:08 AM WhitespaceTokenizer initialize [junit4] 2 INFO: Whitespace tokenizer successfully initialized [junit4] 2 Mar 21, 2015 5:17:09 AM WhitespaceTokenizer typeSystemInit [junit4] 2 INFO: Whitespace tokenizer typesystem initialized [junit4] 2 Mar 21, 2015 5:17:09 AM WhitespaceTokenizer process [junit4] 2 INFO: Whitespace tokenizer starts processing [junit4] 2 Mar 21, 2015 5:17:09 AM WhitespaceTokenizer process [junit4] 2
[jira] [Commented] (SOLR-7278) Make ValueSourceAugmenter easier to extend
[ https://issues.apache.org/jira/browse/SOLR-7278?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14372522#comment-14372522 ] ASF subversion and git services commented on SOLR-7278: --- Commit 1668197 from [~ryantxu] in branch 'dev/trunk' [ https://svn.apache.org/r1668197 ] Merged revision(s) 1668195 from lucene/dev/branches/branch_5x: SOLR-7278: oh my... sorry Make ValueSourceAugmenter easier to extend -- Key: SOLR-7278 URL: https://issues.apache.org/jira/browse/SOLR-7278 Project: Solr Issue Type: Improvement Reporter: Ryan McKinley Assignee: Ryan McKinley Priority: Trivial Fix For: Trunk, 5.1 Attachments: SOLR-7278-ValueSourceAugmenter.patch Right now the ValueSourceAugmenter does some hairy work to get the value and then applies the change to the SolrDocument inline. Let's move modifying the document to a protected function so subclasses can do something different
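The refactor described above is a common pattern: pull the "write the value into the document" step out into a protected hook so subclasses only override that. A minimal, dependency-free sketch (class and method names here are illustrative, not the actual Solr API):

```java
// Hedged sketch of the SOLR-7278 idea: value retrieval stays in the base
// class, and applying the value to the document becomes an overridable hook.
import java.util.HashMap;

class Document extends HashMap<String, Object> {}

class ValueAugmenter {
    protected final String destField;
    ValueAugmenter(String destField) { this.destField = destField; }

    // Framework entry point: after computing the value, delegate the write.
    public void transform(Document doc, Object value) {
        setValue(doc, value);
    }

    // The extension point: subclasses decide how the value lands in the doc.
    protected void setValue(Document doc, Object value) {
        doc.put(destField, value);
    }
}

class ScalingAugmenter extends ValueAugmenter {
    ScalingAugmenter(String f) { super(f); }
    @Override
    protected void setValue(Document doc, Object value) {
        // "Do something different": store a derived value instead.
        doc.put(destField, ((Number) value).doubleValue() * 2.0);
    }
}
```

Before the change, a subclass wanting different write behavior would have had to copy the whole transform logic; after it, overriding the single protected method suffices.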
[jira] [Commented] (SOLR-7278) Make ValueSourceAugmenter easier to extend
[ https://issues.apache.org/jira/browse/SOLR-7278?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14372521#comment-14372521 ] ASF subversion and git services commented on SOLR-7278: --- Commit 1668195 from [~ryantxu] in branch 'dev/branches/branch_5x' [ https://svn.apache.org/r1668195 ] SOLR-7278: oh my... sorry Make ValueSourceAugmenter easier to extend -- Key: SOLR-7278 URL: https://issues.apache.org/jira/browse/SOLR-7278 Project: Solr Issue Type: Improvement Reporter: Ryan McKinley Assignee: Ryan McKinley Priority: Trivial Fix For: Trunk, 5.1 Attachments: SOLR-7278-ValueSourceAugmenter.patch Right now the ValueSourceAugmenter does some hairy work to get the value and then applies the change to the SolrDocument inline. Let's move modifying the document to a protected function so subclasses can do something different
[JENKINS] Lucene-Solr-trunk-Linux (32bit/jdk1.9.0-ea-b54) - Build # 12031 - Still Failing!
Build: http://jenkins.thetaphi.de/job/Lucene-Solr-trunk-Linux/12031/
Java: 32bit/jdk1.9.0-ea-b54 -server -XX:+UseG1GC

1 tests failed.

FAILED: org.apache.solr.search.function.TestFunctionQuery.testPseudoFieldFunctions

Error Message:
unexpected map keys [e] @ response/docs/[0]

Stack Trace:
java.lang.RuntimeException: unexpected map keys [e] @ response/docs/[0]
    at __randomizedtesting.SeedInfo.seed([881611D4DD76A4CE:D62CF3A9E1E05882]:0)
    at org.apache.solr.SolrTestCaseJ4.assertJQ(SolrTestCaseJ4.java:882)
    at org.apache.solr.SolrTestCaseJ4.assertJQ(SolrTestCaseJ4.java:829)
    at org.apache.solr.search.function.TestFunctionQuery.testPseudoFieldFunctions(TestFunctionQuery.java:739)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:502)
    at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1627)
    at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:836)
    at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:872)
    at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:886)
    at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:53)
    at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:50)
    at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
    at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:49)
    at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
    at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
    at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
    at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
    at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:798)
    at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:458)
    at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:845)
    at com.carrotsearch.randomizedtesting.RandomizedRunner$3.evaluate(RandomizedRunner.java:747)
    at com.carrotsearch.randomizedtesting.RandomizedRunner$4.evaluate(RandomizedRunner.java:781)
    at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:792)
    at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
    at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:53)
    at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
    at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
    at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:42)
    at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
    at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
    at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
    at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
    at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
    at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:54)
    at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
    at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
    at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:55)
    at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
    at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
    at java.lang.Thread.run(Thread.java:745)

Build Log:
[...truncated 9798 lines...]
[junit4] Suite: org.apache.solr.search.function.TestFunctionQuery
[junit4] 2 Creating dataDir:
[JENKINS] Lucene-Solr-Tests-5.x-Java7 - Build # 2808 - Still Failing
Build: https://builds.apache.org/job/Lucene-Solr-Tests-5.x-Java7/2808/

6 tests failed.

REGRESSION: org.apache.solr.cloud.FullSolrCloudDistribCmdsTest.test

Error Message:
expected:<0> but was:<1>

Stack Trace:
java.lang.AssertionError: expected:<0> but was:<1>
    at __randomizedtesting.SeedInfo.seed([AD9F41C8938F04AF:25CB7E123D736957]:0)
    at org.junit.Assert.fail(Assert.java:93)
    at org.junit.Assert.failNotEquals(Assert.java:647)
    at org.junit.Assert.assertEquals(Assert.java:128)
    at org.junit.Assert.assertEquals(Assert.java:472)
    at org.junit.Assert.assertEquals(Assert.java:456)
    at org.apache.solr.cloud.FullSolrCloudDistribCmdsTest.testDeleteByIdCompositeRouterWithRouterField(FullSolrCloudDistribCmdsTest.java:403)
    at org.apache.solr.cloud.FullSolrCloudDistribCmdsTest.test(FullSolrCloudDistribCmdsTest.java:146)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1627)
    at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:836)
    at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:872)
    at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:886)
    at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsFixedStatement.callStatement(BaseDistributedSearchTestCase.java:958)
    at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsStatement.evaluate(BaseDistributedSearchTestCase.java:933)
    at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:53)
    at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:50)
    at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
    at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:49)
    at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
    at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
    at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
    at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
    at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:798)
    at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:458)
    at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:845)
    at com.carrotsearch.randomizedtesting.RandomizedRunner$3.evaluate(RandomizedRunner.java:747)
    at com.carrotsearch.randomizedtesting.RandomizedRunner$4.evaluate(RandomizedRunner.java:781)
    at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:792)
    at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
    at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:53)
    at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
    at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
    at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:42)
    at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
    at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
    at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
    at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
    at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
    at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:54)
    at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
    at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
    at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:55)
    at
Re: [JENKINS] Lucene-Solr-trunk-Linux (64bit/jdk1.8.0_40) - Build # 12032 - Still Failing!
This looks like something hitting a system assertion?

On Sat, Mar 21, 2015 at 12:17 AM, Policeman Jenkins Server jenk...@thetaphi.de wrote:

Build: http://jenkins.thetaphi.de/job/Lucene-Solr-trunk-Linux/12032/
Java: 64bit/jdk1.8.0_40 -XX:-UseCompressedOops -XX:+UseParallelGC

1 tests failed.

FAILED: org.apache.lucene.analysis.uima.UIMABaseAnalyzerTest.testRandomStrings

Error Message:
some thread(s) failed

Stack Trace:
java.lang.RuntimeException: some thread(s) failed
    at org.apache.lucene.analysis.BaseTokenStreamTestCase.checkRandomData(BaseTokenStreamTestCase.java:531)
    at org.apache.lucene.analysis.BaseTokenStreamTestCase.checkRandomData(BaseTokenStreamTestCase.java:428)
    at org.apache.lucene.analysis.uima.UIMABaseAnalyzerTest.testRandomStrings(UIMABaseAnalyzerTest.java:125)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1627)
    at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:836)
    at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:872)
    at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:886)
    at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:50)
    at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
    at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:49)
    at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
    at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
    at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
    at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
    at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:798)
    at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:458)
    at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:845)
    at com.carrotsearch.randomizedtesting.RandomizedRunner$3.evaluate(RandomizedRunner.java:747)
    at com.carrotsearch.randomizedtesting.RandomizedRunner$4.evaluate(RandomizedRunner.java:781)
    at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:792)
    at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
    at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
    at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:42)
    at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
    at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
    at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
    at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
    at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
    at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:54)
    at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
    at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
    at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:55)
    at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
    at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
    at java.lang.Thread.run(Thread.java:745)

Build Log:
[...truncated 3891 lines...]
[junit4] Suite: org.apache.lucene.analysis.uima.UIMABaseAnalyzerTest
[junit4] 2 мар 21, 2015 5:17:08 AM WhitespaceTokenizer initialize
[junit4] 2 INFO: Whitespace tokenizer successfully initialized
[junit4] 2 мар 21, 2015 5:17:09 AM WhitespaceTokenizer typeSystemInit
[junit4] 2 INFO: Whitespace tokenizer
Re: jhighlight-1.0 contains LGPL-only files
> although it might be easier to work with Tika to fix that and then upgrade again.

If jhighlight were brought into the Solr distribution as a transitive dependency you'd be right, but since we pull it in explicitly (even if only for runtime purposes), I think we should remove it whether or not Tika corrects the problem. We can put a note in our NOTICE file for users to download the jar themselves until Tika fixes the problem. If people agree, I will remove it from our code.

Shai

On Fri, Mar 20, 2015 at 10:40 AM, Karl Wright daddy...@gmail.com wrote:
I have created a ticket: TIKA-1581. ManifoldCF also has a Tika dependency, so thank you for noting the problem.
Karl

On Fri, Mar 20, 2015 at 4:03 AM, Upayavira u...@odoko.co.uk wrote:
You are right - both projects need to remove it, although it might be easier to work with Tika to fix that and then upgrade again.
Upayavira

On Fri, Mar 20, 2015, at 05:26 AM, Shai Erera wrote:
Sorry for the spam, just wanted to note that this dependency was added by Steve in SOLR-6130 to resolve an improper Tika 1.4-1.5 upgrade. The core issue lies with Tika IMO (I believe they shouldn't rely on LGPL code either), but I am not sure whether it's OK that we distribute this .jar ourselves.
Shai

On Fri, Mar 20, 2015 at 7:17 AM, Shai Erera ser...@gmail.com wrote:
One update: I found that this dependency is explicitly set in solr/contrib/extraction/ivy.xml, under the Tika dependencies section:

<!-- Tika dependencies - see http://tika.apache.org/1.3/gettingstarted.html#Using_Tika_as_a_Maven_dependency -->
<!-- When upgrading Tika, upgrade dependency versions and add any new ones (except slf4j-api, commons-codec, commons-logging, commons-httpclient, geronimo-stax-api_1.0_spec, jcip-annotations, xml-apis, asm)
     WARNING: Don't add netcdf / unidataCommon (partially LGPL code) -->
...
<dependency org="com.uwyn" name="jhighlight" rev="${/com.uwyn/jhighlight}" conf="compile"/>

So it does seem to be needed by Tika only, and I guess it's a runtime dependency, so if we don't want to release this LGPL library, we can omit it and put a section in the NOTICE file?

Shai

On Fri, Mar 20, 2015 at 7:11 AM, Shai Erera ser...@gmail.com wrote:
Hi,
Solr's contrib/extraction contains jhighlight-1.0.jar, which declares itself as dual CDDL-or-LGPL licensed. However, some of its classes are distributed only under LGPL, e.g. in com.uwyn.jhighlight.highlighter: CppHighlighter.java, GroovyHighlighter.java, JavaHighlighter.java, XmlHighlighter.java. I downloaded the sources from Maven (http://search.maven.org/remotecontent?filepath=com/uwyn/jhighlight/1.0/jhighlight-1.0-sources.jar) to confirm that, and also found this SVN repo: http://svn.rifers.org/jhighlight/tags/release-1.0, though the project's website no longer seems to exist (https://jhighlight.dev.java.net/). I didn't find any direct usage of it in our code, so I guess it's probably needed by a 3rd-party dependency such as Tika. Therefore if we omit it, things will compile but may fail at runtime. Is it OK that we distribute this .jar?
Shai

- To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org For additional commands, e-mail: dev-h...@lucene.apache.org
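[Editor's note] For reference, removing it would amount to deleting (or commenting out) that single line in solr/contrib/extraction/ivy.xml. A sketch of what the fragment could look like afterwards (attribute quoting reconstructed from the flattened text above; the comment wording is illustrative, not the actual patch):

```xml
<!-- Tika dependencies - see
     http://tika.apache.org/1.3/gettingstarted.html#Using_Tika_as_a_Maven_dependency -->
<!-- WARNING: Don't add netcdf / unidataCommon (partially LGPL code) -->
...
<!-- jhighlight omitted pending TIKA-1581: some of its sources are LGPL-only.
     See the NOTICE file for how to obtain the jar separately.
<dependency org="com.uwyn" name="jhighlight" rev="${/com.uwyn/jhighlight}" conf="compile"/>
-->
```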
[JENKINS] Lucene-Solr-5.x-Windows (32bit/jdk1.7.0_76) - Build # 4453 - Still Failing!
Build: http://jenkins.thetaphi.de/job/Lucene-Solr-5.x-Windows/4453/
Java: 32bit/jdk1.7.0_76 -server -XX:+UseG1GC

1 tests failed.

FAILED: junit.framework.TestSuite.org.apache.solr.core.TestSolrConfigHandler

Error Message:
Could not remove the following files (in the order of attempts):
C:\Users\JenkinsSlave\workspace\Lucene-Solr-5.x-Windows\solr\build\solr-core\test\J0\temp\solr.core.TestSolrConfigHandler 92DB10590F763AAD-001\tempDir-010\collection1\conf\configoverlay.json: java.nio.file.FileSystemException: C:\Users\JenkinsSlave\workspace\Lucene-Solr-5.x-Windows\solr\build\solr-core\test\J0\temp\solr.core.TestSolrConfigHandler 92DB10590F763AAD-001\tempDir-010\collection1\conf\configoverlay.json: The process cannot access the file because it is being used by another process.
C:\Users\JenkinsSlave\workspace\Lucene-Solr-5.x-Windows\solr\build\solr-core\test\J0\temp\solr.core.TestSolrConfigHandler 92DB10590F763AAD-001\tempDir-010\collection1\conf: java.nio.file.DirectoryNotEmptyException: C:\Users\JenkinsSlave\workspace\Lucene-Solr-5.x-Windows\solr\build\solr-core\test\J0\temp\solr.core.TestSolrConfigHandler 92DB10590F763AAD-001\tempDir-010\collection1\conf
C:\Users\JenkinsSlave\workspace\Lucene-Solr-5.x-Windows\solr\build\solr-core\test\J0\temp\solr.core.TestSolrConfigHandler 92DB10590F763AAD-001\tempDir-010\collection1: java.nio.file.DirectoryNotEmptyException: C:\Users\JenkinsSlave\workspace\Lucene-Solr-5.x-Windows\solr\build\solr-core\test\J0\temp\solr.core.TestSolrConfigHandler 92DB10590F763AAD-001\tempDir-010\collection1
C:\Users\JenkinsSlave\workspace\Lucene-Solr-5.x-Windows\solr\build\solr-core\test\J0\temp\solr.core.TestSolrConfigHandler 92DB10590F763AAD-001\tempDir-010: java.nio.file.DirectoryNotEmptyException: C:\Users\JenkinsSlave\workspace\Lucene-Solr-5.x-Windows\solr\build\solr-core\test\J0\temp\solr.core.TestSolrConfigHandler 92DB10590F763AAD-001\tempDir-010
C:\Users\JenkinsSlave\workspace\Lucene-Solr-5.x-Windows\solr\build\solr-core\test\J0\temp\solr.core.TestSolrConfigHandler 92DB10590F763AAD-001\tempDir-010: java.nio.file.DirectoryNotEmptyException: C:\Users\JenkinsSlave\workspace\Lucene-Solr-5.x-Windows\solr\build\solr-core\test\J0\temp\solr.core.TestSolrConfigHandler 92DB10590F763AAD-001\tempDir-010
C:\Users\JenkinsSlave\workspace\Lucene-Solr-5.x-Windows\solr\build\solr-core\test\J0\temp\solr.core.TestSolrConfigHandler 92DB10590F763AAD-001: java.nio.file.DirectoryNotEmptyException: C:\Users\JenkinsSlave\workspace\Lucene-Solr-5.x-Windows\solr\build\solr-core\test\J0\temp\solr.core.TestSolrConfigHandler 92DB10590F763AAD-001

Stack Trace:
java.io.IOException: Could not remove the following files (in the order of attempts):
C:\Users\JenkinsSlave\workspace\Lucene-Solr-5.x-Windows\solr\build\solr-core\test\J0\temp\solr.core.TestSolrConfigHandler 92DB10590F763AAD-001\tempDir-010\collection1\conf\configoverlay.json: java.nio.file.FileSystemException: C:\Users\JenkinsSlave\workspace\Lucene-Solr-5.x-Windows\solr\build\solr-core\test\J0\temp\solr.core.TestSolrConfigHandler 92DB10590F763AAD-001\tempDir-010\collection1\conf\configoverlay.json: The process cannot access the file because it is being used by another process.
C:\Users\JenkinsSlave\workspace\Lucene-Solr-5.x-Windows\solr\build\solr-core\test\J0\temp\solr.core.TestSolrConfigHandler 92DB10590F763AAD-001\tempDir-010\collection1\conf: java.nio.file.DirectoryNotEmptyException: C:\Users\JenkinsSlave\workspace\Lucene-Solr-5.x-Windows\solr\build\solr-core\test\J0\temp\solr.core.TestSolrConfigHandler 92DB10590F763AAD-001\tempDir-010\collection1\conf
C:\Users\JenkinsSlave\workspace\Lucene-Solr-5.x-Windows\solr\build\solr-core\test\J0\temp\solr.core.TestSolrConfigHandler 92DB10590F763AAD-001\tempDir-010\collection1: java.nio.file.DirectoryNotEmptyException: C:\Users\JenkinsSlave\workspace\Lucene-Solr-5.x-Windows\solr\build\solr-core\test\J0\temp\solr.core.TestSolrConfigHandler 92DB10590F763AAD-001\tempDir-010\collection1
C:\Users\JenkinsSlave\workspace\Lucene-Solr-5.x-Windows\solr\build\solr-core\test\J0\temp\solr.core.TestSolrConfigHandler 92DB10590F763AAD-001\tempDir-010: java.nio.file.DirectoryNotEmptyException: C:\Users\JenkinsSlave\workspace\Lucene-Solr-5.x-Windows\solr\build\solr-core\test\J0\temp\solr.core.TestSolrConfigHandler 92DB10590F763AAD-001\tempDir-010
C:\Users\JenkinsSlave\workspace\Lucene-Solr-5.x-Windows\solr\build\solr-core\test\J0\temp\solr.core.TestSolrConfigHandler 92DB10590F763AAD-001\tempDir-010: java.nio.file.DirectoryNotEmptyException: C:\Users\JenkinsSlave\workspace\Lucene-Solr-5.x-Windows\solr\build\solr-core\test\J0\temp\solr.core.TestSolrConfigHandler 92DB10590F763AAD-001\tempDir-010
C:\Users\JenkinsSlave\workspace\Lucene-Solr-5.x-Windows\solr\build\solr-core\test\J0\temp\solr.core.TestSolrConfigHandler 92DB10590F763AAD-001: java.nio.file.DirectoryNotEmptyException:
[jira] [Commented] (SOLR-7214) JSON Facet API
[ https://issues.apache.org/jira/browse/SOLR-7214?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14371242#comment-14371242 ]

Steve Molloy commented on SOLR-7214:

I think the underlying implementations should be shared. While I agree that SimpleFacets got to the point of being anything but simple... I don't think having completely separate implementations will help at all. I haven't looked at the new code yet, but how hard would it be to roll pivot stats (and everything under SOLR-6350) into this implementation? I'm thinking that while this new way of passing parameters to faceting is good, we'll still need to support the old way to avoid any pain for users currently relying on it. And this should be perfectly fine, since after all we're talking about how to pass parameters, not what to do with them. So whatever underlying implementation is more solid and easier to maintain and evolve, we should use that and have all functionality work with it. If this new implementation supports all of Solr's needs, then let's simply have a layer that can parse the old parameters into the JSON format it consumes. If it's the other way around, let's parse the JSON into parameters for the existing facet processing. Either way, we should decouple the way parameters are provided from the actual processing, and we should have a single way of performing that processing for facets...

JSON Facet API
--
Key: SOLR-7214
URL: https://issues.apache.org/jira/browse/SOLR-7214
Project: Solr
Issue Type: New Feature
Reporter: Yonik Seeley
Attachments: SOLR-7214.patch

Overview is here: http://yonik.com/json-facet-api/
The structured nature of nested sub-facets is more naturally expressed in a nested structure like JSON rather than the flat structure that normal query parameters provide.
Goals:
- First class JSON support
- Easier programmatic construction of complex nested facet commands
- Support a much more canonical response format that is easier for clients to parse
- First class analytics support
- Support a cleaner way to do distributed faceting
- Support better integration with other search features

-- This message was sent by Atlassian JIRA (v6.3.4#6332) - To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org For additional commands, e-mail: dev-h...@lucene.apache.org
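[Editor's note] To make the nested structure concrete, here is a hedged sketch of a request body in the spirit of the linked overview (the field names `genre`, `price`, and `author` are made up; check the overview page for the exact accepted syntax):

```json
{
  "top_genres": {
    "type": "terms",
    "field": "genre",
    "limit": 5,
    "facet": {
      "avg_price": "avg(price)",
      "top_authors": { "type": "terms", "field": "author", "limit": 3 }
    }
  }
}
```

Sent as the `json.facet` request parameter, sub-facets like `top_authors` nest arbitrarily deep under their parent bucket, which is exactly what flat `facet.*` query parameters struggle to express.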
[jira] [Commented] (SOLR-7162) Remove unused SolrSortField interface
[ https://issues.apache.org/jira/browse/SOLR-7162?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14371290#comment-14371290 ]

Connor Warrington commented on SOLR-7162:

I was confused, as the title of the task says SolrSortField yet the description says SortSortField, and I couldn't find any SortSortField. I've attached a patch to remove SolrSortField.

Remove unused SolrSortField interface
-
Key: SOLR-7162
URL: https://issues.apache.org/jira/browse/SOLR-7162
Project: Solr
Issue Type: Task
Reporter: Shalin Shekhar Mangar
Priority: Trivial
Fix For: Trunk, 5.1
Attachments: SOLR-7162.patch

SortSortField is an unused interface. I can't find any uses in our project. It is also marked as lucene.experimental. Let's nuke it.

-- This message was sent by Atlassian JIRA (v6.3.4#6332) - To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org For additional commands, e-mail: dev-h...@lucene.apache.org
[jira] [Commented] (LUCENE-6161) Applying deletes is sometimes dog slow
[ https://issues.apache.org/jira/browse/LUCENE-6161?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14372081#comment-14372081 ]

Michael McCandless commented on LUCENE-6161:

bq. what would you prefer?

In this context, just something easily human readable :) E.g. a float showing seconds...

Applying deletes is sometimes dog slow
--
Key: LUCENE-6161
URL: https://issues.apache.org/jira/browse/LUCENE-6161
Project: Lucene - Core
Issue Type: Bug
Reporter: Michael McCandless
Assignee: Michael McCandless
Fix For: Trunk, 5.1
Attachments: LUCENE-6161.patch, LUCENE-6161.patch, LUCENE-6161.patch, LUCENE-6161.patch, LUCENE-6161.patch

I hit this while testing various use cases for LUCENE-6119 (adding auto-throttle to ConcurrentMergeScheduler). When I tested "always call updateDocument" (each add buffers a delete term), with many indexing threads, opening an NRT reader once per second (forcing all deleted terms to be applied), I see that BufferedUpdatesStream.applyDeletes sometimes seems to take a long time, e.g.:

{noformat}
BD 0 [2015-01-04 09:31:12.597; Lucene Merge Thread #69]: applyDeletes took 339 msec for 10 segments, 117 deleted docs, 607333 visited terms
BD 0 [2015-01-04 09:31:18.148; Thread-4]: applyDeletes took 5533 msec for 62 segments, 10989 deleted docs, 8517225 visited terms
BD 0 [2015-01-04 09:31:21.463; Lucene Merge Thread #71]: applyDeletes took 1065 msec for 10 segments, 470 deleted docs, 1825649 visited terms
BD 0 [2015-01-04 09:31:26.301; Thread-5]: applyDeletes took 4835 msec for 61 segments, 14676 deleted docs, 9649860 visited terms
BD 0 [2015-01-04 09:31:35.572; Thread-11]: applyDeletes took 6073 msec for 72 segments, 13835 deleted docs, 11865319 visited terms
BD 0 [2015-01-04 09:31:37.604; Lucene Merge Thread #75]: applyDeletes took 251 msec for 10 segments, 58 deleted docs, 240721 visited terms
BD 0 [2015-01-04 09:31:44.641; Thread-11]: applyDeletes took 5956 msec for 64 segments, 15109 deleted docs, 10599034 visited terms
BD 0 [2015-01-04 09:31:47.814; Lucene Merge Thread #77]: applyDeletes took 396 msec for 10 segments, 137 deleted docs, 719914 visited terms
{noformat}

What this means is that even though I want an NRT reader every second, often I don't get one for up to ~7 or more seconds. This is on an SSD; the machine has 48 GB RAM, heap size is only 2 GB, with 12 indexing threads. As hideously complex as this code is, I think there are some inefficiencies, but fixing them could be hard / make the code even hairier... Also, this code is mega-locked: it holds IW's lock and BD's lock, blocking things like merges kicking off or finishing... E.g., we pull the MergedIterator many times on the same set of sub-iterators. Maybe we can create the sorted terms up front and reuse that? Maybe we should go term stride (one term visits all N segments), not segment stride (visit each segment, iterating all deleted terms for it). Just iterating the terms to be deleted takes a sizable part of the time, and we now do that once for every segment in the index. Also, the isUnique bit in LUCENE-6005 should help here: if we know the field is unique, we can stop seekExact once we find a segment that has the deleted term, and we can maybe pass false for removeDuplicates to MergedIterator...

-- This message was sent by Atlassian JIRA (v6.3.4#6332) - To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org For additional commands, e-mail: dev-h...@lucene.apache.org
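[Editor's note] The segment-stride vs. term-stride idea above can be pictured with a toy sketch (plain Java sets stand in for segment term dictionaries and `seekExact` probes; this is not Lucene's actual code). Segment-stride re-walks the whole buffered-deletes set once per segment, while term-stride walks the sorted terms once and, for a unique field, can stop at the first segment containing the term:

```java
import java.util.List;
import java.util.Set;
import java.util.SortedSet;

public class DeleteStrides {
    // Segment-stride: visit each segment, iterating every buffered delete term for it.
    static int segmentStride(List<Set<String>> segments, SortedSet<String> deletes) {
        int probes = 0;
        for (Set<String> segment : segments) {
            for (String term : deletes) {
                probes++;                 // one dictionary probe per (segment, term) pair
                segment.contains(term);   // a hit would mark matching docs as deleted
            }
        }
        return probes;
    }

    // Term-stride over a unique field: one pass over the sorted terms,
    // stopping at the first segment that contains the current term.
    static int termStrideUnique(List<Set<String>> segments, SortedSet<String> deletes) {
        int probes = 0;
        for (String term : deletes) {
            for (Set<String> segment : segments) {
                probes++;
                if (segment.contains(term)) {
                    break; // unique field: no other segment can hold this term
                }
            }
        }
        return probes;
    }
}
```

With three single-term segments and three buffered terms, `segmentStride` makes 9 probes while `termStrideUnique` makes 6, and the gap widens as segment counts grow, which is consistent with the many-segments log lines being the slow ones.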
[jira] [Updated] (SOLR-7263) Add files tab support to AngularJS Admin UI
[ https://issues.apache.org/jira/browse/SOLR-7263?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Upayavira updated SOLR-7263:
Attachment: SOLR-7263.patch

Completed the 'files' tab.

Add files tab support to AngularJS Admin UI
-
Key: SOLR-7263
URL: https://issues.apache.org/jira/browse/SOLR-7263
Project: Solr
Issue Type: Improvement
Components: web gui
Reporter: Upayavira
Priority: Minor
Attachments: SOLR-7263.patch

-- This message was sent by Atlassian JIRA (v6.3.4#6332) - To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org For additional commands, e-mail: dev-h...@lucene.apache.org
[jira] [Created] (SOLR-7278) Make ValueSourceAugmenter easier to extend
Ryan McKinley created SOLR-7278:
---
Summary: Make ValueSourceAugmenter easier to extend
Key: SOLR-7278
URL: https://issues.apache.org/jira/browse/SOLR-7278
Project: Solr
Issue Type: Improvement
Reporter: Ryan McKinley
Assignee: Ryan McKinley
Priority: Trivial
Fix For: Trunk, 5.1

Right now the ValueSourceAugmenter does some hairy work to get the value and then applies the change to the SolrDocument inline. Let's move modifying the document to a protected function so subclasses can do something different.

-- This message was sent by Atlassian JIRA (v6.3.4#6332) - To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org For additional commands, e-mail: dev-h...@lucene.apache.org
[jira] [Updated] (SOLR-7278) Make ValueSourceAugmenter easier to extend
[ https://issues.apache.org/jira/browse/SOLR-7278?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Ryan McKinley updated SOLR-7278:
Attachment: SOLR-7278-ValueSourceAugmenter.patch

very simple patch

Make ValueSourceAugmenter easier to extend
--
Key: SOLR-7278
URL: https://issues.apache.org/jira/browse/SOLR-7278
Project: Solr
Issue Type: Improvement
Reporter: Ryan McKinley
Assignee: Ryan McKinley
Priority: Trivial
Fix For: Trunk, 5.1
Attachments: SOLR-7278-ValueSourceAugmenter.patch

Right now the ValueSourceAugmenter does some hairy work to get the value and then applies the change to the SolrDocument inline. Let's move modifying the document to a protected function so subclasses can do something different.

-- This message was sent by Atlassian JIRA (v6.3.4#6332) - To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org For additional commands, e-mail: dev-h...@lucene.apache.org
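[Editor's note] The shape of the proposed refactor is easy to picture with a stand-in sketch (a plain Map plays the role of SolrDocument here; the class and method names are illustrative, not the actual Solr code): the computed value is no longer written inline, but through a protected hook a subclass can override.

```java
import java.util.Map;

// Stand-in for ValueSourceAugmenter: the document mutation is isolated in a
// protected hook instead of being applied inline.
public class AugmenterSketch {
    // Default behavior: write the computed value straight into the document.
    protected void setValue(Map<String, Object> doc, String name, Object value) {
        doc.put(name, value);
    }

    // Driver: value-source evaluation would happen before this call in the real class.
    public void transform(Map<String, Object> doc, String name, Object computedValue) {
        setValue(doc, name, computedValue); // subclasses can rename, wrap, or drop the value
    }
}
```

A subclass could then, for example, override `setValue` to skip nulls or wrap repeated values in a list, without re-implementing any of the value-source plumbing.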
[jira] [Created] (SOLR-7279) Add plugins/stats tab support to Angular Admin UI
Upayavira created SOLR-7279:
---
Summary: Add plugins/stats tab support to Angular Admin UI
Key: SOLR-7279
URL: https://issues.apache.org/jira/browse/SOLR-7279
Project: Solr
Issue Type: Bug
Components: web gui
Reporter: Upayavira
Priority: Minor

-- This message was sent by Atlassian JIRA (v6.3.4#6332) - To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org For additional commands, e-mail: dev-h...@lucene.apache.org
[jira] [Commented] (SOLR-7278) Make ValueSourceAugmenter easier to extend
[ https://issues.apache.org/jira/browse/SOLR-7278?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14372121#comment-14372121 ]

Yonik Seeley commented on SOLR-7278:

+1, looks fine.

Make ValueSourceAugmenter easier to extend
--
Key: SOLR-7278
URL: https://issues.apache.org/jira/browse/SOLR-7278
Project: Solr
Issue Type: Improvement
Reporter: Ryan McKinley
Assignee: Ryan McKinley
Priority: Trivial
Fix For: Trunk, 5.1
Attachments: SOLR-7278-ValueSourceAugmenter.patch

Right now the ValueSourceAugmenter does some hairy work to get the value and then applies the change to the SolrDocument inline. Let's move modifying the document to a protected function so subclasses can do something different.

-- This message was sent by Atlassian JIRA (v6.3.4#6332) - To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org For additional commands, e-mail: dev-h...@lucene.apache.org
[jira] [Commented] (SOLR-7278) Make ValueSourceAugmenter easier to extend
[ https://issues.apache.org/jira/browse/SOLR-7278?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14372128#comment-14372128 ]

ASF subversion and git services commented on SOLR-7278:
---
Commit 1668149 from [~ryantxu] in branch 'dev/branches/branch_5x' [ https://svn.apache.org/r1668149 ]
SOLR-7278: Make ValueSourceAugmenter easier to extend

Make ValueSourceAugmenter easier to extend
--
Key: SOLR-7278
URL: https://issues.apache.org/jira/browse/SOLR-7278
Project: Solr
Issue Type: Improvement
Reporter: Ryan McKinley
Assignee: Ryan McKinley
Priority: Trivial
Fix For: Trunk, 5.1
Attachments: SOLR-7278-ValueSourceAugmenter.patch

Right now the ValueSourceAugmenter does some hairy work to get the value and then applies the change to the SolrDocument inline. Let's move modifying the document to a protected function so subclasses can do something different.

-- This message was sent by Atlassian JIRA (v6.3.4#6332) - To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org For additional commands, e-mail: dev-h...@lucene.apache.org
[jira] [Commented] (SOLR-7278) Make ValueSourceAugmenter easier to extend
[ https://issues.apache.org/jira/browse/SOLR-7278?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14372131#comment-14372131 ]

ASF subversion and git services commented on SOLR-7278:
---
Commit 1668151 from [~ryantxu] in branch 'dev/trunk' [ https://svn.apache.org/r1668151 ]
Merged revision(s) 1668149 from lucene/dev/branches/branch_5x: SOLR-7278: Make ValueSourceAugmenter easier to extend

Make ValueSourceAugmenter easier to extend
--
Key: SOLR-7278
URL: https://issues.apache.org/jira/browse/SOLR-7278
Project: Solr
Issue Type: Improvement
Reporter: Ryan McKinley
Assignee: Ryan McKinley
Priority: Trivial
Fix For: Trunk, 5.1
Attachments: SOLR-7278-ValueSourceAugmenter.patch

Right now the ValueSourceAugmenter does some hairy work to get the value and then applies the change to the SolrDocument inline. Let's move modifying the document to a protected function so subclasses can do something different.

-- This message was sent by Atlassian JIRA (v6.3.4#6332) - To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org For additional commands, e-mail: dev-h...@lucene.apache.org
[jira] [Updated] (SOLR-7276) Add a Boolean Post Filter QParserPlugin
[ https://issues.apache.org/jira/browse/SOLR-7276?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Ted Sullivan updated SOLR-7276: --- Attachment: SOLR-7276.patch Add a Boolean Post Filter QParserPlugin --- Key: SOLR-7276 URL: https://issues.apache.org/jira/browse/SOLR-7276 Project: Solr Issue Type: New Feature Reporter: Ted Sullivan Attachments: SOLR-7276.patch This plugin enables existing post filter implementations to be combined using Boolean logic. It works by building a parse tree of referenced Post Filters. When a document is sent to the collect() method of the BooleanPostFilter, it is sent to all of the delegates, which point to a local Collector that sets a flag if the DelegatingCollector calls its collect method. After all of the delegates have been polled, the parse tree output determines whether the document should ultimately be collected. The syntax for the post filter is like this: {noformat} fq={!bool expr=(($foo OR $bar) NOT $baz)}foo={!foo ...}bar={!bar ... }baz={!baz ...} {noformat} Where foo, bar and baz are all post filters. -- This message was sent by Atlassian JIRA (v6.3.4#6332) - To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org For additional commands, e-mail: dev-h...@lucene.apache.org
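The collect-time logic described above, where each delegate post filter raises a flag when it collects a document and a parse tree over those flags makes the final decision, can be sketched in a self-contained way. The names below (FlagCollector, BoolNode) are hypothetical stand-ins for illustration, not the classes in the attached patch:

```java
// Illustrative sketch of the SOLR-7276 collect() flow: delegates record a
// per-document flag, then a boolean parse tree over the flags decides
// collection. Stand-in names; not the actual patch.
import java.util.Map;

public class BoolPostFilterSketch {
    /** Local collector that just records whether collect() was called. */
    static class FlagCollector {
        boolean collected;
        void collect(int doc) { collected = true; }
        void reset() { collected = false; }
    }

    /** Minimal parse-tree nodes, enough for (($foo OR $bar) NOT $baz). */
    interface BoolNode { boolean eval(Map<String, FlagCollector> flags); }
    static BoolNode ref(String name) { return f -> f.get(name).collected; }
    static BoolNode or(BoolNode a, BoolNode b) { return f -> a.eval(f) || b.eval(f); }
    static BoolNode not(BoolNode a, BoolNode b) { return f -> a.eval(f) && !b.eval(f); }

    public static void main(String[] args) {
        Map<String, FlagCollector> flags = Map.of(
            "foo", new FlagCollector(),
            "bar", new FlagCollector(),
            "baz", new FlagCollector());
        BoolNode expr = not(or(ref("foo"), ref("bar")), ref("baz"));

        // Simulate one document: only the 'bar' delegate collects it.
        flags.get("bar").collect(42);
        System.out.println(expr.eval(flags)); // true: (foo OR bar) NOT baz
    }
}
```

Between documents the real filter would reset every flag, which is what the `reset()` method stands in for here.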
[jira] [Commented] (SOLR-7214) JSON Facet API
[ https://issues.apache.org/jira/browse/SOLR-7214?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14371487#comment-14371487 ] Yonik Seeley commented on SOLR-7214: bq. it's that we now have 2 approaches 3 approaches... you're forgetting the analytics component ;-) JSON Facet API -- Key: SOLR-7214 URL: https://issues.apache.org/jira/browse/SOLR-7214 Project: Solr Issue Type: New Feature Reporter: Yonik Seeley Attachments: SOLR-7214.patch Overview is here: http://yonik.com/json-facet-api/ The structured nature of nested sub-facets is more naturally expressed in a nested structure like JSON than in the flat structure that normal query parameters provide. Goals: - First class JSON support - Easier programmatic construction of complex nested facet commands - Support a much more canonical response format that is easier for clients to parse - First class analytics support - Support a cleaner way to do distributed faceting - Support better integration with other search features -- This message was sent by Atlassian JIRA (v6.3.4#6332) - To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org For additional commands, e-mail: dev-h...@lucene.apache.org
[jira] [Commented] (SOLR-6865) Upgrade HttpClient to 4.4
[ https://issues.apache.org/jira/browse/SOLR-6865?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14371814#comment-14371814 ] Shawn Heisey commented on SOLR-6865: The httpcore module has advanced to 4.4.1, asking on the HC list to find out if any of the other modules will see a 4.4.1 release. Upgrade HttpClient to 4.4 - Key: SOLR-6865 URL: https://issues.apache.org/jira/browse/SOLR-6865 Project: Solr Issue Type: Task Affects Versions: 5.0 Reporter: Shawn Heisey Priority: Minor Fix For: Trunk, 5.1 Attachments: SOLR-6865.patch HttpClient 4.4 has been released. 5.0 seems like a good time to upgrade. -- This message was sent by Atlassian JIRA (v6.3.4#6332) - To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org For additional commands, e-mail: dev-h...@lucene.apache.org
[jira] [Commented] (SOLR-7245) Temporary ZK election or connection loss should not stall indexing due to LIR
[ https://issues.apache.org/jira/browse/SOLR-7245?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14371873#comment-14371873 ] Timothy Potter commented on SOLR-7245: -- Patch looks good at first look, but wanted to make sure we coordinate with [~shalinmangar] on SOLR-7109 Temporary ZK election or connection loss should not stall indexing due to LIR - Key: SOLR-7245 URL: https://issues.apache.org/jira/browse/SOLR-7245 Project: Solr Issue Type: Improvement Components: SolrCloud Reporter: Ramkumar Aiyengar Assignee: Ramkumar Aiyengar Priority: Minor Attachments: SOLR-7245.patch, SOLR-7245.patch If there's a ZK election or connection loss, and the leader is unable to reach a replica, it currently would stall till the ZK connection is established, due to the LIR process. This shouldn't happen, and in some way regresses the work done in SOLR-5577. I will try get to this, but if someone races me to it, feel free to.. -- This message was sent by Atlassian JIRA (v6.3.4#6332) - To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org For additional commands, e-mail: dev-h...@lucene.apache.org
[JENKINS] Solr-Artifacts-5.0 - Build # 62 - Failure
Build: https://builds.apache.org/job/Solr-Artifacts-5.0/62/ No tests ran. Build Log: [...truncated 34519 lines...] BUILD FAILED /usr/home/jenkins/jenkins-slave/workspace/Solr-Artifacts-5.0/solr/build.xml:381: java.net.ConnectException: Operation timed out at java.net.PlainSocketImpl.socketConnect(Native Method) at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:339) at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:200) at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:182) at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392) at java.net.Socket.connect(Socket.java:579) at java.net.Socket.connect(Socket.java:528) at sun.net.NetworkClient.doConnect(NetworkClient.java:180) at sun.net.www.http.HttpClient.openServer(HttpClient.java:432) at sun.net.www.http.HttpClient.openServer(HttpClient.java:527) at sun.net.www.http.HttpClient.init(HttpClient.java:211) at sun.net.www.http.HttpClient.New(HttpClient.java:308) at sun.net.www.http.HttpClient.New(HttpClient.java:326) at sun.net.www.protocol.http.HttpURLConnection.getNewHttpClient(HttpURLConnection.java:996) at sun.net.www.protocol.http.HttpURLConnection.plainConnect(HttpURLConnection.java:932) at sun.net.www.protocol.http.HttpURLConnection.connect(HttpURLConnection.java:850) at org.apache.tools.ant.taskdefs.Get$GetThread.openConnection(Get.java:660) at org.apache.tools.ant.taskdefs.Get$GetThread.get(Get.java:579) at org.apache.tools.ant.taskdefs.Get$GetThread.run(Get.java:569) Total time: 14 minutes 18 seconds Build step 'Invoke Ant' marked build as failure Archiving artifacts Sending artifact delta relative to Solr-Artifacts-5.0 #61 Archived 13 artifacts Archive block size is 32768 Received 2262 blocks and 225849290 bytes Compression is 24.7% Took 1 min 53 sec Publishing Javadoc Email was triggered for: Failure Sending email for trigger: Failure - To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org For additional commands, 
e-mail: dev-h...@lucene.apache.org
[JENKINS] Lucene-Solr-5.x-Linux (64bit/jdk1.8.0_40) - Build # 11863 - Still Failing!
Build: http://jenkins.thetaphi.de/job/Lucene-Solr-5.x-Linux/11863/ Java: 64bit/jdk1.8.0_40 -XX:-UseCompressedOops -XX:+UseParallelGC 1 tests failed. FAILED: org.apache.solr.cloud.ChaosMonkeyNothingIsSafeTest.test Error Message: There were too many update fails - we expect it can happen, but shouldn't easily Stack Trace: java.lang.AssertionError: There were too many update fails - we expect it can happen, but shouldn't easily at __randomizedtesting.SeedInfo.seed([16BD81B8DEAA3541:9EE9BE62705658B9]:0) at org.junit.Assert.fail(Assert.java:93) at org.junit.Assert.assertTrue(Assert.java:43) at org.junit.Assert.assertFalse(Assert.java:68) at org.apache.solr.cloud.ChaosMonkeyNothingIsSafeTest.test(ChaosMonkeyNothingIsSafeTest.java:222) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:497) at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1627) at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:836) at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:872) at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:886) at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsFixedStatement.callStatement(BaseDistributedSearchTestCase.java:958) at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsStatement.evaluate(BaseDistributedSearchTestCase.java:933) at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:53) at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:50) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46) at 
org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:49) at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65) at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365) at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:798) at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:458) at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:845) at com.carrotsearch.randomizedtesting.RandomizedRunner$3.evaluate(RandomizedRunner.java:747) at com.carrotsearch.randomizedtesting.RandomizedRunner$4.evaluate(RandomizedRunner.java:781) at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:792) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:53) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:42) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:54) at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48) at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65) at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:55) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at
[jira] [Created] (SOLR-7277) DocBuild should pass EntityProcessorWrapper in notifyListener
Alex Genaille created SOLR-7277: --- Summary: DocBuild should pass EntityProcessorWrapper in notifyListener Key: SOLR-7277 URL: https://issues.apache.org/jira/browse/SOLR-7277 Project: Solr Issue Type: Bug Components: contrib - DataImportHandler Affects Versions: 5.0 Environment: Windows 7; Apache Solr 5.0.0 Reporter: Alex Genaille Within an onImportEnd listener, I would like to be able to do this: {code} String entityName = context.getEntityAttribute("name"); {code} Because the DocBuilder.notifyListener passes a null EntityProcessorWrapper, the entity is not accessible at all in the onImportEnd event. Suggested Fix: DocBuilder.notifyListener should pass the currentEntityProcessorWrapper into the ContextImpl constructor on line 173 of DocBuilder.java {code} private void notifyListener(EventListener listener, Exception lastException) { String currentProcess; if (dataImporter.getStatus() == DataImporter.Status.RUNNING_DELTA_DUMP) { currentProcess = Context.DELTA_DUMP; } else { currentProcess = Context.FULL_DUMP; } ContextImpl ctx = new ContextImpl(null, getVariableResolver(), null, currentProcess, session, null, this); //FIX: PASS first argument ctx.setLastException(lastException); listener.onEvent(ctx); } {code} -- This message was sent by Atlassian JIRA (v6.3.4#6332) - To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org For additional commands, e-mail: dev-h...@lucene.apache.org
[JENKINS] Lucene-Solr-NightlyTests-5.x - Build # 792 - Still Failing
Build: https://builds.apache.org/job/Lucene-Solr-NightlyTests-5.x/792/ 4 tests failed. REGRESSION: org.apache.solr.cloud.LeaderFailoverAfterPartitionTest.test Error Message: org.apache.solr.client.solrj.SolrServerException: Timeout occured while waiting response from server at: http://127.0.0.1:29770/c8n_1x3_lf_shard1_replica3 Stack Trace: org.apache.solr.client.solrj.SolrServerException: org.apache.solr.client.solrj.SolrServerException: Timeout occured while waiting response from server at: http://127.0.0.1:29770/c8n_1x3_lf_shard1_replica3 at __randomizedtesting.SeedInfo.seed([2281FD7A30CC475D:AAD5C2A09E302AA5]:0) at org.apache.solr.client.solrj.impl.CloudSolrClient.directUpdate(CloudSolrClient.java:625) at org.apache.solr.client.solrj.impl.CloudSolrClient.sendRequest(CloudSolrClient.java:948) at org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:839) at org.apache.solr.client.solrj.impl.CloudSolrClient.request(CloudSolrClient.java:782) at org.apache.solr.client.solrj.SolrClient.request(SolrClient.java:1220) at org.apache.solr.cloud.AbstractFullDistribZkTestBase.sendDocsWithRetry(AbstractFullDistribZkTestBase.java:790) at org.apache.solr.cloud.HttpPartitionTest.sendDoc(HttpPartitionTest.java:483) at org.apache.solr.cloud.LeaderFailoverAfterPartitionTest.testRf3WithLeaderFailover(LeaderFailoverAfterPartitionTest.java:172) at org.apache.solr.cloud.LeaderFailoverAfterPartitionTest.test(LeaderFailoverAfterPartitionTest.java:51) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:606) at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1627) at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:836) at 
com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:872) at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:886) at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsFixedStatement.callStatement(BaseDistributedSearchTestCase.java:958) at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsStatement.evaluate(BaseDistributedSearchTestCase.java:933) at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:53) at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:50) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46) at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:49) at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65) at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365) at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:798) at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:458) at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:845) at com.carrotsearch.randomizedtesting.RandomizedRunner$3.evaluate(RandomizedRunner.java:747) at com.carrotsearch.randomizedtesting.RandomizedRunner$4.evaluate(RandomizedRunner.java:781) at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:792) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:53) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:42) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39) at
[jira] [Commented] (LUCENE-5879) Add auto-prefix terms to block tree terms dict
[ https://issues.apache.org/jira/browse/LUCENE-5879?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14371671#comment-14371671 ] Robert Muir commented on LUCENE-5879: - {quote} I.e., with the patch as it is now, PFs like SimpleText will use a PrefixTermsEnum for PrefixQuery, but if I fix PrefixQuery to subclass AutomatonQuery (and remove AUTOMATON_TYPE.PREFIX) then SimpleText would use AutomatonTermsEnum (on a prefix automaton) which I think will be somewhat less efficient? Maybe it's not so bad in practice? ATE would realize it's in a linear part of the automaton... {quote} We cannot continue writing code in this way. Please let intersect take care of how to intersect and get this shit out of the Query. The default Terms.intersect() method can specialize the PREFIX case with a PrefixTermsEnum if it is faster. Add auto-prefix terms to block tree terms dict -- Key: LUCENE-5879 URL: https://issues.apache.org/jira/browse/LUCENE-5879 Project: Lucene - Core Issue Type: New Feature Components: core/codecs Reporter: Michael McCandless Assignee: Michael McCandless Fix For: 5.0, Trunk Attachments: LUCENE-5879.patch, LUCENE-5879.patch, LUCENE-5879.patch, LUCENE-5879.patch, LUCENE-5879.patch, LUCENE-5879.patch, LUCENE-5879.patch, LUCENE-5879.patch, LUCENE-5879.patch, LUCENE-5879.patch, LUCENE-5879.patch This cool idea to generalize numeric/trie fields came from Adrien: Today, when we index a numeric field (LongField, etc.) we pre-compute (via NumericTokenStream) outside of indexer/codec which prefix terms should be indexed. But this can be inefficient: you set a static precisionStep, and always add those prefix terms regardless of how the terms in the field are actually distributed. Yet typically in real world applications the terms have a non-random distribution. So, it should be better if instead the terms dict decides where it makes sense to insert prefix terms, based on how dense the terms are in each region of term space. 
This way we can speed up query time for both term (e.g. infix suggester) and numeric ranges, and it should let us use less index space and get faster range queries. This would also mean that min/maxTerm for a numeric field would now be correct, vs today where the externally computed prefix terms are placed after the full precision terms, causing hairy code like NumericUtils.getMaxInt/Long. So optimizations like LUCENE-5860 become feasible. The terms dict can also do tricks not possible if you must live on top of its APIs, e.g. to handle the adversary/over-constrained case when a given prefix has too many terms following it but finer prefixes have too few (what block tree calls floor term blocks). -- This message was sent by Atlassian JIRA (v6.3.4#6332) - To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org For additional commands, e-mail: dev-h...@lucene.apache.org
[jira] [Commented] (SOLR-7245) Temporary ZK election or connection loss should not stall indexing due to LIR
[ https://issues.apache.org/jira/browse/SOLR-7245?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14371710#comment-14371710 ] Timothy Potter commented on SOLR-7245: -- Taking a look, thanks for the heads up. Temporary ZK election or connection loss should not stall indexing due to LIR - Key: SOLR-7245 URL: https://issues.apache.org/jira/browse/SOLR-7245 Project: Solr Issue Type: Improvement Components: SolrCloud Reporter: Ramkumar Aiyengar Assignee: Ramkumar Aiyengar Priority: Minor Attachments: SOLR-7245.patch, SOLR-7245.patch If there's a ZK election or connection loss, and the leader is unable to reach a replica, it currently would stall till the ZK connection is established, due to the LIR process. This shouldn't happen, and in some way regresses the work done in SOLR-5577. I will try get to this, but if someone races me to it, feel free to.. -- This message was sent by Atlassian JIRA (v6.3.4#6332) - To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org For additional commands, e-mail: dev-h...@lucene.apache.org
[jira] [Updated] (SOLR-7277) DocBuilder should pass EntityProcessorWrapper in notifyListener
[ https://issues.apache.org/jira/browse/SOLR-7277?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Alex Genaille updated SOLR-7277: Summary: DocBuilder should pass EntityProcessorWrapper in notifyListener (was: DocBuild should pass EntityProcessorWrapper in notifyListener) DocBuilder should pass EntityProcessorWrapper in notifyListener --- Key: SOLR-7277 URL: https://issues.apache.org/jira/browse/SOLR-7277 Project: Solr Issue Type: Bug Components: contrib - DataImportHandler Affects Versions: 5.0 Environment: Windows 7; Apache Solr 5.0.0 Reporter: Alex Genaille Within an onImportEnd listener, I would like to be able to do this: {code} String entityName = context.getEntityAttribute("name"); {code} Because the DocBuilder.notifyListener passes a null EntityProcessorWrapper, the entity is not accessible at all in the onImportEnd event. Suggested Fix: DocBuilder.notifyListener should pass the currentEntityProcessorWrapper into the ContextImpl constructor on line 173 of DocBuilder.java {code} private void notifyListener(EventListener listener, Exception lastException) { String currentProcess; if (dataImporter.getStatus() == DataImporter.Status.RUNNING_DELTA_DUMP) { currentProcess = Context.DELTA_DUMP; } else { currentProcess = Context.FULL_DUMP; } ContextImpl ctx = new ContextImpl(null, getVariableResolver(), null, currentProcess, session, null, this); //FIX: PASS first argument ctx.setLastException(lastException); listener.onEvent(ctx); } {code} -- This message was sent by Atlassian JIRA (v6.3.4#6332) - To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org For additional commands, e-mail: dev-h...@lucene.apache.org
[jira] [Updated] (SOLR-5911) Cannot store term vector payloads
[ https://issues.apache.org/jira/browse/SOLR-5911?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] David Smiley updated SOLR-5911: --- Attachment: SOLR-5911.patch I reviewed the patch and brought it up to date with trunk. Nice thorough job Mike! At first I was just thinking this was a small matter of the schema but you thought of the TermVectorComponent, LukeRequestHandler, etc. I did make a change to TermVectorComponent.mapOneVector so that the postings flag indicates the options we want... and I simplified the code a little to not need the 3 useOffsets|Positions|Payloads booleans which seemed redundant with the same booleans on fieldOptions. Tests pass, precommit passes. If you don't have time to commit then I will be happy to. Cannot store term vector payloads - Key: SOLR-5911 URL: https://issues.apache.org/jira/browse/SOLR-5911 Project: Solr Issue Type: Improvement Reporter: Michael McCandless Fix For: 4.9, Trunk Attachments: SOLR-5911.patch, SOLR-5911.patch Lucene's term vectors can now store payloads, but it looks like this was never exposed in Solr. -- This message was sent by Atlassian JIRA (v6.3.4#6332) - To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org For additional commands, e-mail: dev-h...@lucene.apache.org
[JENKINS] Lucene-Solr-Tests-5.x-Java7 - Build # 2805 - Still Failing
Build: https://builds.apache.org/job/Lucene-Solr-Tests-5.x-Java7/2805/ 3 tests failed. FAILED: org.apache.solr.cloud.LeaderInitiatedRecoveryOnCommitTest.test Error Message: IOException occured when talking to server at: http://127.0.0.1:58609/abteb/c8n_1x3_commits_shard1_replica3 Stack Trace: org.apache.solr.client.solrj.SolrServerException: IOException occured when talking to server at: http://127.0.0.1:58609/abteb/c8n_1x3_commits_shard1_replica3 at __randomizedtesting.SeedInfo.seed([BE2CE2F1521828EC:3678DD2BFCE44514]:0) at org.apache.solr.client.solrj.impl.HttpSolrClient.executeMethod(HttpSolrClient.java:598) at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:236) at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:228) at org.apache.solr.client.solrj.SolrRequest.process(SolrRequest.java:135) at org.apache.solr.client.solrj.SolrClient.commit(SolrClient.java:483) at org.apache.solr.client.solrj.SolrClient.commit(SolrClient.java:464) at org.apache.solr.cloud.LeaderInitiatedRecoveryOnCommitTest.oneShardTest(LeaderInitiatedRecoveryOnCommitTest.java:130) at org.apache.solr.cloud.LeaderInitiatedRecoveryOnCommitTest.test(LeaderInitiatedRecoveryOnCommitTest.java:62) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:606) at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1627) at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:836) at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:872) at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:886) at 
org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsFixedStatement.callStatement(BaseDistributedSearchTestCase.java:958) at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsStatement.evaluate(BaseDistributedSearchTestCase.java:933) at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:53) at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:50) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46) at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:49) at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65) at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365) at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:798) at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:458) at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:845) at com.carrotsearch.randomizedtesting.RandomizedRunner$3.evaluate(RandomizedRunner.java:747) at com.carrotsearch.randomizedtesting.RandomizedRunner$4.evaluate(RandomizedRunner.java:781) at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:792) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:53) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46) at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:42) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
Re: Tests with beating hearts
I wouldn't want to see the thread stacks until the test finally timed out at which point I'd like to see N of them. I don't think it will be possible since there is no explicit trigger that signals a timeout... although maybe there is -- I think the test framework attempts to send an interrupt signal to all threads within the test group; maybe this could be used as a stimulus for dumping previously saved stack traces. I once had a more intelligent periodic stack analyzer -- one that analyzed a series of stack traces and looked for the common root, the diverging stack frame, etc. It could tell you immediately whether a given thread was stalled on something or if it was running in a loop under a given method, etc. It should be in randomized runner's history. It would spawn a new thread for each test case right? And it'd have to stop that thread when the test completes (success or failure)... Pretty much. Otherwise you'd run into problems with thread leak detection that's built into the runner. Or you could make this thread run per the entire suite, not for each individual test -- this would be much faster (in particular in the presence of repeats). Dawid - To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org For additional commands, e-mail: dev-h...@lucene.apache.org
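The "heartbeat" idea discussed above, a per-suite daemon thread that periodically snapshots all thread stacks so the last N snapshots can be dumped if the suite times out, can be sketched in plain Java. This is an assumption about the general shape of such a watchdog, not the analyzer from randomized runner's history that Dawid mentions:

```java
// Sketch of a suite-level heartbeat thread: snapshot all thread stacks on an
// interval, keep the last N in a ring buffer, stop when interrupted. A real
// implementation would need to cooperate with the runner's thread-leak
// detection; marking the thread as a daemon is only a partial answer.
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.Map;

public class StackHeartbeat extends Thread {
    private final long intervalMillis;
    private final int keep;
    final Deque<Map<Thread, StackTraceElement[]>> snapshots = new ArrayDeque<>();

    StackHeartbeat(long intervalMillis, int keep) {
        this.intervalMillis = intervalMillis;
        this.keep = keep;
        setDaemon(true);
    }

    @Override public void run() {
        try {
            while (!isInterrupted()) {
                synchronized (snapshots) {
                    snapshots.addLast(Thread.getAllStackTraces());
                    if (snapshots.size() > keep) snapshots.removeFirst(); // ring buffer
                }
                Thread.sleep(intervalMillis);
            }
        } catch (InterruptedException expected) {
            // Interrupted at suite end (or on timeout): stop quietly. The saved
            // snapshots could be dumped or diffed for the common root here.
        }
    }

    public static void main(String[] args) throws Exception {
        StackHeartbeat hb = new StackHeartbeat(50, 5);
        hb.start();
        Thread.sleep(300);       // let a few snapshots accumulate
        hb.interrupt();
        hb.join();
        synchronized (hb.snapshots) {
            System.out.println("snapshots kept: " + hb.snapshots.size());
        }
    }
}
```

Running it per suite rather than per test, as suggested above, would mean one thread for the whole run, which is cheaper under test repeats.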
[jira] [Commented] (LUCENE-6226) Add interval iterators to Scorer
[ https://issues.apache.org/jira/browse/LUCENE-6226?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14371577#comment-14371577 ] Paul Elschot commented on LUCENE-6226: -- This issue is about adding interval/span iterators to scorer, and I think some of the scoring for intervals is better left to the intervals/spans themselves, see my previous posts. I agree that a separate issue for that is better. I'd like to have the possibility to somehow converge the code at LUCENE-6308 to the code here. This is actually also a separate issue, but anyway. At LUCENE-6308 an interval/spans iterator (SpansEnum) is an extension of DocIdSetIterator, mostly because that allows a move away from Spans that is not too large. The IntervalIterator here is an interface that does not have methods directly related to doc id set iteration. So the question is: is there an implementation of the IntervalIterator here that is used without the context of a DocIdSetIterator? Add interval iterators to Scorer Key: LUCENE-6226 URL: https://issues.apache.org/jira/browse/LUCENE-6226 Project: Lucene - Core Issue Type: Improvement Reporter: Alan Woodward Assignee: Alan Woodward Fix For: Trunk, 5.1 Attachments: LUCENE-6226.patch, LUCENE-6226.patch, LUCENE-6226.patch, LUCENE-6226.patch, LUCENE-6226.patch, LUCENE-6226.patch, LUCENE-6226.patch This change will allow Scorers to expose which positions within a document they have matched, via a new IntervalIterator interface. Consumers get the iterator by calling intervals() on the Scorer, then call reset(docId) whenever the scorer has advanced and nextInterval() to iterate through positions. Once all matching intervals on the current document have been exhausted, nextInterval() returns false. -- This message was sent by Atlassian JIRA (v6.3.4#6332) - To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org For additional commands, e-mail: dev-h...@lucene.apache.org
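The consumer pattern LUCENE-6226 describes (obtain an iterator, call reset(docId) after each scorer advance, then pull intervals until exhausted) can be shown with a minimal stand-in interface. This is a self-contained illustration of the calling convention, not Lucene's actual IntervalIterator from the patch:

```java
// Sketch of the LUCENE-6226 consumption loop. The IntervalIterator interface
// below is a stand-in for illustration; only the reset/nextInterval protocol
// is taken from the issue description.
import java.util.ArrayList;
import java.util.List;

public class IntervalSketch {
    interface IntervalIterator {
        void reset(int docId);    // call whenever the scorer has advanced
        boolean nextInterval();   // false once the doc's intervals are exhausted
        int start();
        int end();
    }

    /** Toy iterator: every doc exposes the fixed intervals [0,1] and [5,7]. */
    static class FixedIntervals implements IntervalIterator {
        private final int[][] intervals = { {0, 1}, {5, 7} };
        private int upto;
        @Override public void reset(int docId) { upto = -1; }
        @Override public boolean nextInterval() { return ++upto < intervals.length; }
        @Override public int start() { return intervals[upto][0]; }
        @Override public int end() { return intervals[upto][1]; }
    }

    static List<String> collect(IntervalIterator it, int docId) {
        List<String> out = new ArrayList<>();
        it.reset(docId);                       // scorer just advanced to docId
        while (it.nextInterval()) {            // iterate matched positions
            out.add("[" + it.start() + "," + it.end() + "]");
        }
        return out;
    }

    public static void main(String[] args) {
        System.out.println(collect(new FixedIntervals(), 3)); // [[0,1], [5,7]]
    }
}
```

Note this interface carries no doc-id iteration methods, which is exactly the contrast Paul draws with the DocIdSetIterator-based SpansEnum of LUCENE-6308.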
[JENKINS] Lucene-Solr-5.x-Linux (32bit/jdk1.7.0_80-ea-b05) - Build # 11862 - Failure!
Build: http://jenkins.thetaphi.de/job/Lucene-Solr-5.x-Linux/11862/ Java: 32bit/jdk1.7.0_80-ea-b05 -server -XX:+UseParallelGC 1 tests failed. FAILED: org.apache.solr.cloud.FullSolrCloudDistribCmdsTest.test Error Message: Error from server at http://127.0.0.1:38053/compositeid_collection_with_routerfield_shard1_replica1: no servers hosting shard: Stack Trace: org.apache.solr.client.solrj.impl.HttpSolrClient$RemoteSolrException: Error from server at http://127.0.0.1:38053/compositeid_collection_with_routerfield_shard1_replica1: no servers hosting shard: at __randomizedtesting.SeedInfo.seed([EB1E4330CD77C8D8:634A7CEA638BA520]:0) at org.apache.solr.client.solrj.impl.HttpSolrClient.executeMethod(HttpSolrClient.java:584) at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:236) at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:228) at org.apache.solr.client.solrj.SolrRequest.process(SolrRequest.java:135) at org.apache.solr.client.solrj.SolrClient.query(SolrClient.java:943) at org.apache.solr.client.solrj.SolrClient.query(SolrClient.java:958) at org.apache.solr.cloud.FullSolrCloudDistribCmdsTest.testDeleteByIdCompositeRouterWithRouterField(FullSolrCloudDistribCmdsTest.java:357) at org.apache.solr.cloud.FullSolrCloudDistribCmdsTest.test(FullSolrCloudDistribCmdsTest.java:146) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:606) at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1627) at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:836) at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:872) at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:886) 
at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsFixedStatement.callStatement(BaseDistributedSearchTestCase.java:958) at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsStatement.evaluate(BaseDistributedSearchTestCase.java:933) at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:53) at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:50) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46) at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:49) at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65) at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365) at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:798) at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:458) at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:845) at com.carrotsearch.randomizedtesting.RandomizedRunner$3.evaluate(RandomizedRunner.java:747) at com.carrotsearch.randomizedtesting.RandomizedRunner$4.evaluate(RandomizedRunner.java:781) at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:792) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:53) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46) at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:42) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at
Re: failure from excessive output
This is the same issue as Hoss reported earlier. Thread leak detection is largely ignored in Solr tests -- it should be fixed, obviously, but I don't know what the scope of the changes would be if we removed the offending threads from filters. https://issues.apache.org/jira/browse/SOLR-7215 Dawid On Fri, Mar 20, 2015 at 9:17 PM, Yonik Seeley ysee...@gmail.com wrote: Just got a failure from a test that doesn't have any output at all testcase classname=junit.framework.TestSuite name=org.apache.solr.search.TestDocSet time=0.0 failure message=The test or suite printed 10982 bytes to stdout and stderr, even though the limit was set to 8192 bytes. Increase the limit with @Limit, ignore it completely with @SuppressSysoutChecks or run with -Dtests.verbose=true type=java.lang.AssertionErrorjava.lang.AssertionError: The test or suite printed 10982 bytes to stdout and stderr, even though the limit was set to 8192 bytes. Increase the limit with @Limit, ignore it completely with @SuppressSysoutChecks or run with -Dtests.verbose=true at __randomizedtesting.SeedInfo.seed([63638DD5324A94A2]:0) Looking at tests-report.txt though, perhaps it's just thread leaks from other tests? [15:51:04.358] OK 0.11s J1 | TestDocSet.testFilter 2 1365229 T1109 oahh.LeaseRenewer.run WARN Failed to renew lease for [DFSClient_NONMAPREDUCE_-144622376_992] for 1130 seconds. Will retry shortly ...
java.net.ConnectException: Call From odin/127.0.1.1 to localhost:33373 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
    at sun.reflect.GeneratedConstructorAccessor232.newInstance(Unknown Source)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:408)
    at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:783)
    at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:730)
    at org.apache.hadoop.ipc.Client.call(Client.java:1410)
    at org.apache.hadoop.ipc.Client.call(Client.java:1359)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
    at com.sun.proxy.$Proxy42.renewLease(Unknown Source)
    at sun.reflect.GeneratedMethodAccessor52.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:483)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:186)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    at com.sun.proxy.$Proxy42.renewLease(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.renewLease(ClientNamenodeProtocolTranslatorPB.java:519)
    at org.apache.hadoop.hdfs.DFSClient.renewLease(DFSClient.java:773)
    at org.apache.hadoop.hdfs.LeaseRenewer.renew(LeaseRenewer.java:417)
    at org.apache.hadoop.hdfs.LeaseRenewer.run(LeaseRenewer.java:442)
    at org.apache.hadoop.hdfs.LeaseRenewer.access$700(LeaseRenewer.java:71)
    at org.apache.hadoop.hdfs.LeaseRenewer$1.run(LeaseRenewer.java:298)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.net.ConnectException: Connection refused
    at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
    at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:716)
    at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:529)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:493)
    at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:601)
    at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:696)
    at org.apache.hadoop.ipc.Client$Connection.access$2700(Client.java:367)
    at org.apache.hadoop.ipc.Client.getConnection(Client.java:1458)
    at org.apache.hadoop.ipc.Client.call(Client.java:1377)
    ... 16 more
-Yonik - To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org For additional commands, e-mail: dev-h...@lucene.apache.org
Re: failure from excessive output
I think most of the Solr tests don’t leak threads at this point. Whatever is left should be very easy / minor to address. The HDFS tests do still have thread leaks of underlying HDFS stuff. Eventually we should be on a version that doesn’t do that, but at about open source speed :) - Mark http://about.me/markrmiller On Mar 20, 2015, at 4:24 PM, Dawid Weiss dawid.we...@cs.put.poznan.pl wrote: This is the same issue as Hoss reporter earlier. Thread leak detection is largely ignored in Solr tests -- it should be fixed, obviously, but I don't know what the scope of the changes would be if we removed the offending threads from filters. https://issues.apache.org/jira/browse/SOLR-7215 Dawid On Fri, Mar 20, 2015 at 9:17 PM, Yonik Seeley ysee...@gmail.com wrote: Just got a failure from a test that doesn't have any output at all testcase classname=junit.framework.TestSuite name=org.apache.solr.search.TestDocSet time=0.0 failure message=The test or suite printed 10982 bytes to stdout and stderr, even though the limit was set to 8192 bytes. Increase the limit with @Limit, ignore it completely with @SuppressSysoutChecks or run with -Dtests.verbose=true type=java.lang.AssertionErrorjava.lang.AssertionError: The test or suite printed 10982 bytes to stdout and stderr, even though the limit was set to 8192 bytes. Increase the limit with @Limit, ignore it completely with @SuppressSysoutChecks or run with -Dtests.verbose=true at __randomizedtesting.SeedInfo.seed([63638DD5324A94A2]:0) Looking at tests-report.txt though, perhaps it's just thread leaks from other tests? [15:51:04.358] OK 0.11s J1 | TestDocSet.testFilter 2 1365229 T1109 oahh.LeaseRenewer.run WARN Failed to renew lease for [DFSClient_NONMAPREDUCE_-144622376_992] for 1130 seconds. Will retry shortly ... 
[jira] [Commented] (SOLR-7245) Temporary ZK election or connection loss should not stall indexing due to LIR
[ https://issues.apache.org/jira/browse/SOLR-7245?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14372040#comment-14372040 ] Ramkumar Aiyengar commented on SOLR-7245: - bq. Patch looks good at first look, but wanted to make sure we coordinate with Shalin Shekhar Mangar on SOLR-7109 Hopefully should be. I actually noticed this when I was reviewing the patch for that issue. But Shalin, let me know if otherwise.. Temporary ZK election or connection loss should not stall indexing due to LIR - Key: SOLR-7245 URL: https://issues.apache.org/jira/browse/SOLR-7245 Project: Solr Issue Type: Improvement Components: SolrCloud Reporter: Ramkumar Aiyengar Assignee: Ramkumar Aiyengar Priority: Minor Attachments: SOLR-7245.patch, SOLR-7245.patch If there's a ZK election or connection loss, and the leader is unable to reach a replica, it currently would stall till the ZK connection is established, due to the LIR process. This shouldn't happen, and in some way regresses the work done in SOLR-5577. I will try get to this, but if someone races me to it, feel free to.. -- This message was sent by Atlassian JIRA (v6.3.4#6332) - To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org For additional commands, e-mail: dev-h...@lucene.apache.org
[jira] [Updated] (SOLR-7216) JSON Request API
[ https://issues.apache.org/jira/browse/SOLR-7216?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Yonik Seeley updated SOLR-7216: --- Attachment: SOLR-7216.patch Here's a patch finishing up this issue (most of it was committed as part of SOLR-7214) since they were intertwined. - removes exception when search request has a body - adds json to debugging output - adds tests JSON Request API Key: SOLR-7216 URL: https://issues.apache.org/jira/browse/SOLR-7216 Project: Solr Issue Type: New Feature Reporter: Yonik Seeley Attachments: SOLR-7216.patch http://yonik.com/solr-json-request-api/ The drawbacks to only having a query-parameter API include: - Inherently un-structured, requiring unsightly parameters like f.facet_name.facet.range.start=5 - Inherently un-typed… everything is a string. - More difficult to decipher large requests. - Harder to programmatically create a request. - Impossible to validate. Because of the lack of structure, we don’t know the set of valid parameters and thus can’t do good error checking. -- This message was sent by Atlassian JIRA (v6.3.4#6332) - To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org For additional commands, e-mail: dev-h...@lucene.apache.org
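The "impossible to validate" drawback can be made concrete with a toy sketch: once the request is structured, the set of legal keys is closed and typos can be rejected, whereas a flat namespace like f.facet_name.facet.range.start embeds field names in the keys and is open-ended. The class and key names below are illustrative only, not Solr's actual JSON request schema.

```java
import java.util.Map;
import java.util.Set;

/**
 * Toy illustration of the "impossible to validate" drawback. With a
 * structured request body the legal top-level keys form a closed set,
 * so a typo is caught; a flat parameter namespace embeds field names
 * in the keys (f.price.facet.range.start=5), so the set of legal keys
 * is unbounded and a typo silently becomes an ignored parameter.
 * The key names here are illustrative, not Solr's actual JSON schema.
 */
public class RequestValidationSketch {

    private static final Set<String> VALID_KEYS =
            Set.of("query", "filter", "facet", "sort", "limit", "offset");

    static boolean isValid(Map<String, ?> jsonRequest) {
        return VALID_KEYS.containsAll(jsonRequest.keySet());
    }

    public static void main(String[] args) {
        System.out.println(isValid(Map.of("query", "*:*", "limit", 10)));  // true
        System.out.println(isValid(Map.of("query", "*:*", "limti", 10)));  // false: typo caught
    }
}
```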
Re: failure from excessive output
One of those filters can be removed -- the thread that was previously calling system.exit now could be interrupted (it should fail with access denied when calling system.exit :). D. On Fri, Mar 20, 2015 at 9:29 PM, Mark Miller markrmil...@gmail.com wrote: I think most of the Solr tests don’t leak threads at this point. Whatever is left should be very easy / minor to address. The HDFS tests do still have thread leaks of underlying HDFS stuff. Eventually we should be on a version that doesn’t do that, but at about open source speed :) - Mark http://about.me/markrmiller On Mar 20, 2015, at 4:24 PM, Dawid Weiss dawid.we...@cs.put.poznan.pl wrote: This is the same issue as Hoss reporter earlier. Thread leak detection is largely ignored in Solr tests -- it should be fixed, obviously, but I don't know what the scope of the changes would be if we removed the offending threads from filters. https://issues.apache.org/jira/browse/SOLR-7215 Dawid On Fri, Mar 20, 2015 at 9:17 PM, Yonik Seeley ysee...@gmail.com wrote: Just got a failure from a test that doesn't have any output at all testcase classname=junit.framework.TestSuite name=org.apache.solr.search.TestDocSet time=0.0 failure message=The test or suite printed 10982 bytes to stdout and stderr, even though the limit was set to 8192 bytes. Increase the limit with @Limit, ignore it completely with @SuppressSysoutChecks or run with -Dtests.verbose=true type=java.lang.AssertionErrorjava.lang.AssertionError: The test or suite printed 10982 bytes to stdout and stderr, even though the limit was set to 8192 bytes. Increase the limit with @Limit, ignore it completely with @SuppressSysoutChecks or run with -Dtests.verbose=true at __randomizedtesting.SeedInfo.seed([63638DD5324A94A2]:0) Looking at tests-report.txt though, perhaps it's just thread leaks from other tests? 
[jira] [Resolved] (SOLR-7162) Remove unused SolrSortField interface
[ https://issues.apache.org/jira/browse/SOLR-7162?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Shalin Shekhar Mangar resolved SOLR-7162. - Resolution: Fixed Assignee: Shalin Shekhar Mangar Thanks Yonik for the history and to Connor for the patch! Remove unused SolrSortField interface - Key: SOLR-7162 URL: https://issues.apache.org/jira/browse/SOLR-7162 Project: Solr Issue Type: Task Reporter: Shalin Shekhar Mangar Assignee: Shalin Shekhar Mangar Priority: Trivial Fix For: Trunk, 5.1 Attachments: SOLR-7162.patch SolrSortField is an unused interface. I can't find any uses in our project. It is also marked as lucene.experimental. Let's nuke it. -- This message was sent by Atlassian JIRA (v6.3.4#6332) - To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org For additional commands, e-mail: dev-h...@lucene.apache.org
[jira] [Commented] (LUCENE-5879) Add auto-prefix terms to block tree terms dict
[ https://issues.apache.org/jira/browse/LUCENE-5879?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14372077#comment-14372077 ] Michael McCandless commented on LUCENE-5879: {quote} We cannot continue writing code in this way. Please let intersect take care of how to intersect and get this shit out of the Query. The default Terms.intersect() method can specialize the PREFIX case with a PrefixTermsEnum if it is faster. {quote} Can you maybe be more specific? I'm having trouble following exactly what you're objecting to. Terms.intersect default impl is already specializing to PrefixTermsEnum in the patch. You don't want the added ctor that takes a prefix term in CompiledAutomaton but you are OK with PREFIX/RANGE in CA.AUTOMATON_TYPE? If I 1) remove the added ctor that takes the prefix term in CA, and 2) fix PrefixQuery to subclass AutomatonQuery (meaning CA must autodetect when it receives a prefix automaton), would that address your concerns? Or something else...? I still wonder if just using AutomatonTermsEnum for prefix/range will be fine. Then we don't need PREFIX nor RANGE in CA.AUTOMATON_TYPE. I'll open a separate issue for this... Add auto-prefix terms to block tree terms dict -- Key: LUCENE-5879 URL: https://issues.apache.org/jira/browse/LUCENE-5879 Project: Lucene - Core Issue Type: New Feature Components: core/codecs Reporter: Michael McCandless Assignee: Michael McCandless Fix For: 5.0, Trunk Attachments: LUCENE-5879.patch, LUCENE-5879.patch, LUCENE-5879.patch, LUCENE-5879.patch, LUCENE-5879.patch, LUCENE-5879.patch, LUCENE-5879.patch, LUCENE-5879.patch, LUCENE-5879.patch, LUCENE-5879.patch, LUCENE-5879.patch This cool idea to generalize numeric/trie fields came from Adrien: Today, when we index a numeric field (LongField, etc.) we pre-compute (via NumericTokenStream) outside of indexer/codec which prefix terms should be indexed.
But this can be inefficient: you set a static precisionStep, and always add those prefix terms regardless of how the terms in the field are actually distributed. Yet typically in real world applications the terms have a non-random distribution. So, it should be better if instead the terms dict decides where it makes sense to insert prefix terms, based on how dense the terms are in each region of term space. This way we can speed up query time for both term (e.g. infix suggester) and numeric ranges, and it should let us use less index space and get faster range queries. This would also mean that min/maxTerm for a numeric field would now be correct, vs today where the externally computed prefix terms are placed after the full precision terms, causing hairy code like NumericUtils.getMaxInt/Long. So optos like LUCENE-5860 become feasible. The terms dict can also do tricks not possible if you must live on top of its APIs, e.g. to handle the adversary/over-constrained case when a given prefix has too many terms following it but finer prefixes have too few (what block tree calls floor term blocks). -- This message was sent by Atlassian JIRA (v6.3.4#6332) - To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org For additional commands, e-mail: dev-h...@lucene.apache.org
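The core idea above (let the terms dict decide where prefix terms pay off, based on how dense the terms actually are) can be sketched in a few lines. This is a toy: single-character prefixes and a hard threshold stand in for the real block tree writer, which works on byte prefixes and handles floor blocks.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.TreeMap;

/**
 * Toy sketch of the data-dependent idea behind auto-prefix terms:
 * synthesize a prefix term only where enough indexed terms actually
 * share that prefix, instead of a fixed precisionStep. Single-character
 * prefixes and the threshold are illustrative only.
 */
public class AutoPrefixSketch {

    /** Emit each length-1 prefix shared by at least minTermsInPrefix terms. */
    static List<String> autoPrefixes(List<String> sortedTerms, int minTermsInPrefix) {
        TreeMap<String, Integer> counts = new TreeMap<>();
        for (String t : sortedTerms) {
            counts.merge(t.substring(0, 1), 1, Integer::sum);
        }
        List<String> prefixes = new ArrayList<>();
        for (Map.Entry<String, Integer> e : counts.entrySet()) {
            if (e.getValue() >= minTermsInPrefix) {
                prefixes.add(e.getKey() + "*"); // stands in for a synthetic prefix term
            }
        }
        return prefixes;
    }

    public static void main(String[] args) {
        // Terms are dense under "a" but sparse under "b", so only "a" earns
        // a prefix term; a static precisionStep would treat both regions alike.
        System.out.println(autoPrefixes(List.of("aa", "ab", "ac", "ad", "ba"), 3)); // [a*]
    }
}
```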
[jira] [Commented] (SOLR-7162) Remove unused SolrSortField interface
[ https://issues.apache.org/jira/browse/SOLR-7162?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14371980#comment-14371980 ] ASF subversion and git services commented on SOLR-7162: --- Commit 1668133 from sha...@apache.org in branch 'dev/branches/branch_5x' [ https://svn.apache.org/r1668133 ] SOLR-7162: Remove unused SolrSortField interface Remove unused SolrSortField interface - Key: SOLR-7162 URL: https://issues.apache.org/jira/browse/SOLR-7162 Project: Solr Issue Type: Task Reporter: Shalin Shekhar Mangar Priority: Trivial Fix For: Trunk, 5.1 Attachments: SOLR-7162.patch SolrSortField is an unused interface. I can't find any uses in our project. It is also marked as lucene.experimental. Let's nuke it. -- This message was sent by Atlassian JIRA (v6.3.4#6332) - To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org For additional commands, e-mail: dev-h...@lucene.apache.org
[jira] [Commented] (SOLR-7162) Remove unused SolrSortField interface
[ https://issues.apache.org/jira/browse/SOLR-7162?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14371978#comment-14371978 ] ASF subversion and git services commented on SOLR-7162: --- Commit 1668132 from sha...@apache.org in branch 'dev/trunk' [ https://svn.apache.org/r1668132 ] SOLR-7162: Remove unused SolrSortField interface Remove unused SolrSortField interface - Key: SOLR-7162 URL: https://issues.apache.org/jira/browse/SOLR-7162 Project: Solr Issue Type: Task Reporter: Shalin Shekhar Mangar Priority: Trivial Fix For: Trunk, 5.1 Attachments: SOLR-7162.patch SolrSortField is an unused interface. I can't find any uses in our project. It is also marked as lucene.experimental. Let's nuke it. -- This message was sent by Atlassian JIRA (v6.3.4#6332) - To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org For additional commands, e-mail: dev-h...@lucene.apache.org
Re: failure from excessive output
Looks related to https://issues.apache.org/jira/browse/SOLR-7092. - Mark http://about.me/markrmiller On Mar 20, 2015, at 4:17 PM, Yonik Seeley ysee...@gmail.com wrote: Just got a failure from a test that doesn't have any output at all testcase classname=junit.framework.TestSuite name=org.apache.solr.search.TestDocSet time=0.0 failure message=The test or suite printed 10982 bytes to stdout and stderr, even though the limit was set to 8192 bytes. Increase the limit with @Limit, ignore it completely with @SuppressSysoutChecks or run with -Dtests.verbose=true type=java.lang.AssertionErrorjava.lang.AssertionError: The test or suite printed 10982 bytes to stdout and stderr, even though the limit was set to 8192 bytes. Increase the limit with @Limit, ignore it completely with @SuppressSysoutChecks or run with -Dtests.verbose=true at __randomizedtesting.SeedInfo.seed([63638DD5324A94A2]:0) Looking at tests-report.txt though, perhaps it's just thread leaks from other tests? [15:51:04.358] OK 0.11s J1 | TestDocSet.testFilter 2 1365229 T1109 oahh.LeaseRenewer.run WARN Failed to renew lease for [DFSClient_NONMAPREDUCE_-144622376_992] for 1130 seconds. Will retry shortly ... 
failure from excessive output
Just got a failure from a test that doesn't have any output at all testcase classname=junit.framework.TestSuite name=org.apache.solr.search.TestDocSet time=0.0 failure message=The test or suite printed 10982 bytes to stdout and stderr, even though the limit was set to 8192 bytes. Increase the limit with @Limit, ignore it completely with @SuppressSysoutChecks or run with -Dtests.verbose=true type=java.lang.AssertionErrorjava.lang.AssertionError: The test or suite printed 10982 bytes to stdout and stderr, even though the limit was set to 8192 bytes. Increase the limit with @Limit, ignore it completely with @SuppressSysoutChecks or run with -Dtests.verbose=true at __randomizedtesting.SeedInfo.seed([63638DD5324A94A2]:0) Looking at tests-report.txt though, perhaps it's just thread leaks from other tests? [15:51:04.358] OK 0.11s J1 | TestDocSet.testFilter 2 1365229 T1109 oahh.LeaseRenewer.run WARN Failed to renew lease for [DFSClient_NONMAPREDUCE_-144622376_992] for 1130 seconds. Will retry shortly ... 
java.net.ConnectException: Call From odin/127.0.1.1 to localhost:33373 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
	at sun.reflect.GeneratedConstructorAccessor232.newInstance(Unknown Source)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:408)
	at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:783)
	at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:730)
	at org.apache.hadoop.ipc.Client.call(Client.java:1410)
	at org.apache.hadoop.ipc.Client.call(Client.java:1359)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
	at com.sun.proxy.$Proxy42.renewLease(Unknown Source)
	at sun.reflect.GeneratedMethodAccessor52.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:483)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:186)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
	at com.sun.proxy.$Proxy42.renewLease(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.renewLease(ClientNamenodeProtocolTranslatorPB.java:519)
	at org.apache.hadoop.hdfs.DFSClient.renewLease(DFSClient.java:773)
	at org.apache.hadoop.hdfs.LeaseRenewer.renew(LeaseRenewer.java:417)
	at org.apache.hadoop.hdfs.LeaseRenewer.run(LeaseRenewer.java:442)
	at org.apache.hadoop.hdfs.LeaseRenewer.access$700(LeaseRenewer.java:71)
	at org.apache.hadoop.hdfs.LeaseRenewer$1.run(LeaseRenewer.java:298)
	at java.lang.Thread.run(Thread.java:745)
Caused by: java.net.ConnectException: Connection refused
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:716)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:529)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:493)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:601)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:696)
	at org.apache.hadoop.ipc.Client$Connection.access$2700(Client.java:367)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1458)
	at org.apache.hadoop.ipc.Client.call(Client.java:1377)
	... 16 more

-Yonik

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org
Re: failure from excessive output
SOLR-7215

: Date: Fri, 20 Mar 2015 16:17:45 -0400
: From: Yonik Seeley ysee...@gmail.com
: Reply-To: dev@lucene.apache.org
: To: Solr/Lucene Dev dev@lucene.apache.org
: Subject: failure from excessive output
:
: Just got a failure from a test that doesn't have any output at all:
:
: <testcase classname="junit.framework.TestSuite" name="org.apache.solr.search.TestDocSet" time="0.0">
:   <failure message="The test or suite printed 10982 bytes to stdout and stderr, even though the limit was set to 8192 bytes. Increase the limit with @Limit, ignore it completely with @SuppressSysoutChecks or run with -Dtests.verbose=true" type="java.lang.AssertionError">java.lang.AssertionError: The test or suite printed 10982 bytes to stdout and stderr, even though the limit was set to 8192 bytes. Increase the limit with @Limit, ignore it completely with @SuppressSysoutChecks or run with -Dtests.verbose=true
:
: 	at __randomizedtesting.SeedInfo.seed([63638DD5324A94A2]:0)
:
: Looking at tests-report.txt though, perhaps it's just thread leaks from other tests?
:
: [15:51:04.358] OK 0.11s J1 | TestDocSet.testFilter
:
: 1365229 T1109 oahh.LeaseRenewer.run WARN Failed to renew lease for [DFSClient_NONMAPREDUCE_-144622376_992] for 1130 seconds. Will retry shortly ...
: java.net.ConnectException: Call From odin/127.0.1.1 to localhost:33373 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
: 	at sun.reflect.GeneratedConstructorAccessor232.newInstance(Unknown Source)
: 	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
: 	at java.lang.reflect.Constructor.newInstance(Constructor.java:408)
: 	at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:783)
: 	at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:730)
: 	at org.apache.hadoop.ipc.Client.call(Client.java:1410)
: 	at org.apache.hadoop.ipc.Client.call(Client.java:1359)
: 	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
: 	at com.sun.proxy.$Proxy42.renewLease(Unknown Source)
: 	at sun.reflect.GeneratedMethodAccessor52.invoke(Unknown Source)
: 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
: 	at java.lang.reflect.Method.invoke(Method.java:483)
: 	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:186)
: 	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
: 	at com.sun.proxy.$Proxy42.renewLease(Unknown Source)
: 	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.renewLease(ClientNamenodeProtocolTranslatorPB.java:519)
: 	at org.apache.hadoop.hdfs.DFSClient.renewLease(DFSClient.java:773)
: 	at org.apache.hadoop.hdfs.LeaseRenewer.renew(LeaseRenewer.java:417)
: 	at org.apache.hadoop.hdfs.LeaseRenewer.run(LeaseRenewer.java:442)
: 	at org.apache.hadoop.hdfs.LeaseRenewer.access$700(LeaseRenewer.java:71)
: 	at org.apache.hadoop.hdfs.LeaseRenewer$1.run(LeaseRenewer.java:298)
: 	at java.lang.Thread.run(Thread.java:745)
: Caused by: java.net.ConnectException: Connection refused
: 	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
: 	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:716)
: 	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
: 	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:529)
: 	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:493)
: 	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:601)
: 	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:696)
: 	at org.apache.hadoop.ipc.Client$Connection.access$2700(Client.java:367)
: 	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1458)
: 	at org.apache.hadoop.ipc.Client.call(Client.java:1377)
: 	... 16 more
:
: -Yonik
:
: -
: To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
: For additional commands, e-mail: dev-h...@lucene.apache.org

-Hoss
http://www.lucidworks.com/

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org
[JENKINS] Lucene-Solr-Tests-5.x-Java7 - Build # 2806 - Still Failing
Build: https://builds.apache.org/job/Lucene-Solr-Tests-5.x-Java7/2806/ 3 tests failed. FAILED: org.apache.solr.cloud.LeaderInitiatedRecoveryOnCommitTest.test Error Message: IOException occured when talking to server at: http://127.0.0.1:39871/c8n_1x3_commits_shard1_replica3 Stack Trace: org.apache.solr.client.solrj.SolrServerException: IOException occured when talking to server at: http://127.0.0.1:39871/c8n_1x3_commits_shard1_replica3 at __randomizedtesting.SeedInfo.seed([D7BD170DB3E16E4F:5FE928D71D1D03B7]:0) at org.apache.solr.client.solrj.impl.HttpSolrClient.executeMethod(HttpSolrClient.java:598) at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:236) at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:228) at org.apache.solr.client.solrj.SolrRequest.process(SolrRequest.java:135) at org.apache.solr.client.solrj.SolrClient.commit(SolrClient.java:483) at org.apache.solr.client.solrj.SolrClient.commit(SolrClient.java:464) at org.apache.solr.cloud.LeaderInitiatedRecoveryOnCommitTest.oneShardTest(LeaderInitiatedRecoveryOnCommitTest.java:130) at org.apache.solr.cloud.LeaderInitiatedRecoveryOnCommitTest.test(LeaderInitiatedRecoveryOnCommitTest.java:62) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:606) at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1627) at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:836) at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:872) at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:886) at 
org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsFixedStatement.callStatement(BaseDistributedSearchTestCase.java:958) at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsStatement.evaluate(BaseDistributedSearchTestCase.java:933) at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:53) at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:50) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46) at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:49) at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65) at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365) at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:798) at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:458) at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:845) at com.carrotsearch.randomizedtesting.RandomizedRunner$3.evaluate(RandomizedRunner.java:747) at com.carrotsearch.randomizedtesting.RandomizedRunner$4.evaluate(RandomizedRunner.java:781) at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:792) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:53) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46) at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:42) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at
[jira] [Commented] (SOLR-7191) Improve stability and startup performance of SolrCloud with thousands of collections
[ https://issues.apache.org/jira/browse/SOLR-7191?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14372155#comment-14372155 ] Shalin Shekhar Mangar commented on SOLR-7191: - That's awesome, Damien. I'll start next week on getting these improvements into Solr. I'm going to create some sub-tasks for individual changes. You can help by providing patches which apply on trunk. Improve stability and startup performance of SolrCloud with thousands of collections Key: SOLR-7191 URL: https://issues.apache.org/jira/browse/SOLR-7191 Project: Solr Issue Type: Bug Components: SolrCloud Affects Versions: 5.0 Reporter: Shawn Heisey Assignee: Shalin Shekhar Mangar Labels: performance, scalability Attachments: SOLR-7191.patch, SOLR-7191.patch, SOLR-7191.patch, lots-of-zkstatereader-updates-branch_5x.log A user on the mailing list with thousands of collections (5000 on 4.10.3, 4000 on 5.0) is having severe problems with getting Solr to restart. I tried as hard as I could to duplicate the user setup, but I ran into many problems myself even before I was able to get 4000 collections created on a 5.0 example cloud setup. Restarting Solr takes a very long time, and it is not very stable once it's up and running. This kind of setup is very much pushing the envelope on SolrCloud performance and scalability. It doesn't help that I'm running both Solr nodes on one machine (I started with 'bin/solr -e cloud') and that ZK is embedded. -- This message was sent by Atlassian JIRA (v6.3.4#6332) - To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org For additional commands, e-mail: dev-h...@lucene.apache.org
[jira] [Updated] (SOLR-7280) Load cores in sorted order and tweak coreLoadThread counts to improve cluster stability on restarts
[ https://issues.apache.org/jira/browse/SOLR-7280?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Shalin Shekhar Mangar updated SOLR-7280: Summary: Load cores in sorted order and tweak coreLoadThread counts to improve cluster stability on restarts (was: Add an overseer action to publish an entire node as 'down') Load cores in sorted order and tweak coreLoadThread counts to improve cluster stability on restarts --- Key: SOLR-7280 URL: https://issues.apache.org/jira/browse/SOLR-7280 Project: Solr Issue Type: Sub-task Components: SolrCloud Reporter: Shalin Shekhar Mangar Assignee: Shalin Shekhar Mangar Fix For: Trunk, 5.1 In SOLR-7191, Damien mentioned that by loading solr cores in a sorted order and tweaking some of the coreLoadThread counts, he was able to improve the stability of a cluster with thousands of collections. We should explore some of these changes and fold them into Solr. -- This message was sent by Atlassian JIRA (v6.3.4#6332) - To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org For additional commands, e-mail: dev-h...@lucene.apache.org
[jira] [Commented] (LUCENE-5879) Add auto-prefix terms to block tree terms dict
[ https://issues.apache.org/jira/browse/LUCENE-5879?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14372159#comment-14372159 ] Robert Muir commented on LUCENE-5879: - Yes, everything you propose makes sense (especially the last point you make, that would be fantastic!) High level I just feel here that we have the second use case where codec can do special stuff with intersect and we should be removing these specializations in our code, and just be passing the structure to the codec. I do realize this is already messy in trunk, but I think we need to remove a lot of this complexity. At the very least I think PrefixQuery shouldn't be a backdoor automaton query :) Add auto-prefix terms to block tree terms dict -- Key: LUCENE-5879 URL: https://issues.apache.org/jira/browse/LUCENE-5879 Project: Lucene - Core Issue Type: New Feature Components: core/codecs Reporter: Michael McCandless Assignee: Michael McCandless Fix For: 5.0, Trunk Attachments: LUCENE-5879.patch, LUCENE-5879.patch, LUCENE-5879.patch, LUCENE-5879.patch, LUCENE-5879.patch, LUCENE-5879.patch, LUCENE-5879.patch, LUCENE-5879.patch, LUCENE-5879.patch, LUCENE-5879.patch, LUCENE-5879.patch This cool idea to generalize numeric/trie fields came from Adrien: Today, when we index a numeric field (LongField, etc.) we pre-compute (via NumericTokenStream) outside of indexer/codec which prefix terms should be indexed. But this can be inefficient: you set a static precisionStep, and always add those prefix terms regardless of how the terms in the field are actually distributed. Yet typically in real world applications the terms have a non-random distribution. So, it should be better if instead the terms dict decides where it makes sense to insert prefix terms, based on how dense the terms are in each region of term space. This way we can speed up query time for both term (e.g. 
infix suggester) and numeric ranges, and it should let us use less index space and get faster range queries. This would also mean that min/maxTerm for a numeric field would now be correct, vs today where the externally computed prefix terms are placed after the full precision terms, causing hairy code like NumericUtils.getMaxInt/Long. So optos like LUCENE-5860 become feasible. The terms dict can also do tricks not possible if you must live on top of its APIs, e.g. to handle the adversary/over-constrained case when a given prefix has too many terms following it but finer prefixes have too few (what block tree calls floor term blocks). -- This message was sent by Atlassian JIRA (v6.3.4#6332) - To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org For additional commands, e-mail: dev-h...@lucene.apache.org
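[Editorial note] As background on the precisionStep scheme this issue generalizes: a trie-encoded numeric field indexes the full-precision value plus progressively coarser "prefix" terms, so a range query can be covered by a handful of coarse terms rather than every matching value. The Python sketch below is illustrative only; the real NumericUtils encoding also packs the shift amount into the term bytes.

```python
def prefix_terms(value: int, precision_step: int = 8, bits: int = 32):
    """Return (shift, prefix) pairs a trie-encoded numeric field would
    index for `value`: the full-precision term (shift 0) plus one coarser
    term per precision step, each keeping only the top (bits - shift) bits."""
    return [(shift, value >> shift) for shift in range(0, bits, precision_step)]

# Full-precision term plus progressively coarser prefixes:
print(prefix_terms(0x12345678))
```

The issue's point is that these prefixes are chosen statically up front, regardless of how terms are actually distributed; letting the terms dict insert prefixes where the term space is dense avoids that waste.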
[jira] [Created] (SOLR-7280) Add an overseer action to publish an entire node as 'down'
Shalin Shekhar Mangar created SOLR-7280: --- Summary: Add an overseer action to publish an entire node as 'down' Key: SOLR-7280 URL: https://issues.apache.org/jira/browse/SOLR-7280 Project: Solr Issue Type: Sub-task Components: SolrCloud Reporter: Shalin Shekhar Mangar Assignee: Shalin Shekhar Mangar Fix For: Trunk, 5.1 In SOLR-7191, Damien mentioned that by loading solr cores in a sorted order and tweaking some of the coreLoadThread counts, he was able to improve the stability of a cluster with thousands of collections. We should explore some of these changes and fold them into Solr. -- This message was sent by Atlassian JIRA (v6.3.4#6332) - To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org For additional commands, e-mail: dev-h...@lucene.apache.org
[jira] [Resolved] (SOLR-7278) Make ValueSourceAugmenter easier to extend
[ https://issues.apache.org/jira/browse/SOLR-7278?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Ryan McKinley resolved SOLR-7278. - Resolution: Fixed thanks for taking a look yonik Make ValueSourceAugmenter easier to extend -- Key: SOLR-7278 URL: https://issues.apache.org/jira/browse/SOLR-7278 Project: Solr Issue Type: Improvement Reporter: Ryan McKinley Assignee: Ryan McKinley Priority: Trivial Fix For: Trunk, 5.1 Attachments: SOLR-7278-ValueSourceAugmenter.patch Right now the ValueSourceAugmenter does some hairy work to get the Value and then applies the change do the SolrDocument inline. Lets move modifying the document to a protected function so subclasses can do something different -- This message was sent by Atlassian JIRA (v6.3.4#6332) - To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org For additional commands, e-mail: dev-h...@lucene.apache.org
[JENKINS] Lucene-Solr-5.x-Linux (64bit/jdk1.7.0_76) - Build # 11864 - Still Failing!
Build: http://jenkins.thetaphi.de/job/Lucene-Solr-5.x-Linux/11864/ Java: 64bit/jdk1.7.0_76 -XX:+UseCompressedOops -XX:+UseG1GC 4 tests failed. FAILED: org.apache.solr.cloud.SaslZkACLProviderTest.testSaslZkACLProvider Error Message: Could not get the port for ZooKeeper server Stack Trace: java.lang.RuntimeException: Could not get the port for ZooKeeper server at org.apache.solr.cloud.ZkTestServer.run(ZkTestServer.java:506) at org.apache.solr.cloud.SaslZkACLProviderTest$SaslZkTestServer.run(SaslZkACLProviderTest.java:225) at org.apache.solr.cloud.SaslZkACLProviderTest.setUp(SaslZkACLProviderTest.java:90) at sun.reflect.GeneratedMethodAccessor16.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:606) at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1627) at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:870) at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:886) at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:53) at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:50) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46) at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:49) at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65) at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365) at 
com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:798) at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:458) at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:845) at com.carrotsearch.randomizedtesting.RandomizedRunner$3.evaluate(RandomizedRunner.java:747) at com.carrotsearch.randomizedtesting.RandomizedRunner$4.evaluate(RandomizedRunner.java:781) at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:792) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:53) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:42) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:54) at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48) at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65) at 
org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:55) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365) at java.lang.Thread.run(Thread.java:745) FAILED: org.apache.solr.schema.TestCloudManagedSchema.test Error Message: Could not get the port for ZooKeeper server Stack Trace: java.lang.RuntimeException: Could not get the port for ZooKeeper server at org.apache.solr.cloud.ZkTestServer.run(ZkTestServer.java:506) at org.apache.solr.cloud.AbstractDistribZkTestBase.distribSetUp(AbstractDistribZkTestBase.java:62) at
[jira] [Created] (SOLR-7281) Add an overseer action to publish an entire node as 'down'
Shalin Shekhar Mangar created SOLR-7281: --- Summary: Add an overseer action to publish an entire node as 'down' Key: SOLR-7281 URL: https://issues.apache.org/jira/browse/SOLR-7281 Project: Solr Issue Type: Sub-task Components: SolrCloud Reporter: Shalin Shekhar Mangar Assignee: Shalin Shekhar Mangar Fix For: Trunk, 5.1 A node restart currently iterates through each core and publishes an item to the Overseer queue to mark that core as 'down'. This is inefficient if each node has many cores and causes overseer to be overwhelmed with requests. We can publish a single 'down' status for the entire node and have the overseer do the rest. -- This message was sent by Atlassian JIRA (v6.3.4#6332) - To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org For additional commands, e-mail: dev-h...@lucene.apache.org
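[Editorial note] The inefficiency described above is easy to picture: per-core publishing enqueues one overseer message per core on the restarting node, while a node-level 'down' message enqueues exactly one. The sketch below is hypothetical (the message dicts are illustrative, not Solr's actual overseer message format):

```python
def publish_per_core(queue, node, cores):
    # Current behavior: one overseer queue item per core on the node.
    for core in cores:
        queue.append({"operation": "state", "core": core, "node": node, "state": "down"})

def publish_node_down(queue, node):
    # Proposed behavior: a single item; the overseer then marks every
    # replica hosted on that node as down.
    queue.append({"operation": "downnode", "node": node})

per_core_queue, node_queue = [], []
cores = ["core%d" % i for i in range(1000)]
publish_per_core(per_core_queue, "node1:8983_solr", cores)
publish_node_down(node_queue, "node1:8983_solr")
print(len(per_core_queue), len(node_queue))  # 1000 items vs 1
```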
[jira] [Created] (SOLR-7282) Cache config or index schema objects by configset and share them across cores
Shalin Shekhar Mangar created SOLR-7282: --- Summary: Cache config or index schema objects by configset and share them across cores Key: SOLR-7282 URL: https://issues.apache.org/jira/browse/SOLR-7282 Project: Solr Issue Type: Sub-task Components: SolrCloud Reporter: Shalin Shekhar Mangar Assignee: Shalin Shekhar Mangar Fix For: Trunk, 5.1 Sharing schema and config objects has been known to improve startup performance when a large number of cores are on the same box (See http://wiki.apache.org/solr/LotsOfCores). Damien also saw improvements to cluster startup speed upon caching the index schema in SOLR-7191. Now that SolrCloud configuration is based on config sets in ZK, we should explore how we can minimize config/schema parsing for each core in a way that is compatible with the recent/planned changes in the config and schema APIs. -- This message was sent by Atlassian JIRA (v6.3.4#6332) - To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org For additional commands, e-mail: dev-h...@lucene.apache.org
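[Editorial note] The sharing idea above is essentially memoization keyed by configset: parse a schema once, then hand the same object to every core created from that configset. A minimal sketch, assuming the parsed schema object is immutable and therefore safe to share across cores:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def load_schema(configset: str):
    # Stand-in for the expensive parse of schema.xml / solrconfig.xml.
    return {"configset": configset, "fields": ["id", "_version_"]}

# 1000 cores using the same configset trigger exactly one parse;
# every core receives the identical shared object.
schemas = [load_schema("shared_conf") for _ in range(1000)]
print(load_schema.cache_info().misses)  # 1
```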
[JENKINS] Lucene-Solr-Tests-5.x-Java7 - Build # 2803 - Still Failing
Build: https://builds.apache.org/job/Lucene-Solr-Tests-5.x-Java7/2803/ 3 tests failed. FAILED: org.apache.solr.cloud.LeaderInitiatedRecoveryOnCommitTest.test Error Message: IOException occured when talking to server at: http://127.0.0.1:59671/c8n_1x3_commits_shard1_replica3 Stack Trace: org.apache.solr.client.solrj.SolrServerException: IOException occured when talking to server at: http://127.0.0.1:59671/c8n_1x3_commits_shard1_replica3 at __randomizedtesting.SeedInfo.seed([81C6D4A132C2F7CD:992EB7B9C3E9A35]:0) at org.apache.solr.client.solrj.impl.HttpSolrClient.executeMethod(HttpSolrClient.java:598) at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:236) at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:228) at org.apache.solr.client.solrj.SolrRequest.process(SolrRequest.java:135) at org.apache.solr.client.solrj.SolrClient.commit(SolrClient.java:483) at org.apache.solr.client.solrj.SolrClient.commit(SolrClient.java:464) at org.apache.solr.cloud.LeaderInitiatedRecoveryOnCommitTest.oneShardTest(LeaderInitiatedRecoveryOnCommitTest.java:130) at org.apache.solr.cloud.LeaderInitiatedRecoveryOnCommitTest.test(LeaderInitiatedRecoveryOnCommitTest.java:62) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:606) at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1627) at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:836) at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:872) at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:886) at 
org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsFixedStatement.callStatement(BaseDistributedSearchTestCase.java:958) at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsStatement.evaluate(BaseDistributedSearchTestCase.java:933) at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:53) at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:50) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46) at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:49) at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65) at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365) at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:798) at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:458) at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:845) at com.carrotsearch.randomizedtesting.RandomizedRunner$3.evaluate(RandomizedRunner.java:747) at com.carrotsearch.randomizedtesting.RandomizedRunner$4.evaluate(RandomizedRunner.java:781) at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:792) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:53) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46) at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:42) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at
[jira] [Commented] (SOLR-7191) Improve stability and startup performance of SolrCloud with thousands of collections
[ https://issues.apache.org/jira/browse/SOLR-7191?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14372212#comment-14372212 ] Shalin Shekhar Mangar commented on SOLR-7191: - bq. The biggest problem I noticed is that any little change on the cluster (even creating a new collection) seems to cause a flood of ZkStateReader Updating data for X to ver NN messages. Each time it must update every single collection it has ... when that happens, it doesn't take an extreme amount of time, but I've noticed that it does it repeatedly, especially on node startup. [~elyograg] -- Sorry for not responding earlier. Yes, that is how it works right now. We didn't optimize this case because collections are created/deleted infrequently. However, this might be a problem when users have collections with stateFormat=1 and 2. Improve stability and startup performance of SolrCloud with thousands of collections Key: SOLR-7191 URL: https://issues.apache.org/jira/browse/SOLR-7191 Project: Solr Issue Type: Bug Components: SolrCloud Affects Versions: 5.0 Reporter: Shawn Heisey Assignee: Shalin Shekhar Mangar Labels: performance, scalability Attachments: SOLR-7191.patch, SOLR-7191.patch, SOLR-7191.patch, lots-of-zkstatereader-updates-branch_5x.log A user on the mailing list with thousands of collections (5000 on 4.10.3, 4000 on 5.0) is having severe problems with getting Solr to restart. I tried as hard as I could to duplicate the user setup, but I ran into many problems myself even before I was able to get 4000 collections created on a 5.0 example cloud setup. Restarting Solr takes a very long time, and it is not very stable once it's up and running. This kind of setup is very much pushing the envelope on SolrCloud performance and scalability. It doesn't help that I'm running both Solr nodes on one machine (I started with 'bin/solr -e cloud') and that ZK is embedded.
-- This message was sent by Atlassian JIRA (v6.3.4#6332) - To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org For additional commands, e-mail: dev-h...@lucene.apache.org
[jira] [Commented] (SOLR-7216) JSON Request API
[ https://issues.apache.org/jira/browse/SOLR-7216?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14372229#comment-14372229 ]

ASF subversion and git services commented on SOLR-7216:
-------------------------------------------------------

Commit 1668168 from [~yo...@apache.org] in branch 'dev/trunk' [ https://svn.apache.org/r1668168 ]

SOLR-7216: JSON request API

JSON Request API
Key: SOLR-7216
URL: https://issues.apache.org/jira/browse/SOLR-7216
Project: Solr
Issue Type: New Feature
Reporter: Yonik Seeley
Attachments: SOLR-7216.patch

http://yonik.com/solr-json-request-api/

The drawbacks to only having a query-parameter API include:
- Inherently un-structured, requiring unsightly parameters like f.facet_name.facet.range.start=5
- Inherently un-typed: everything is a string.
- More difficult to decipher large requests.
- Harder to programmatically create a request.
- Impossible to validate. Because of the lack of structure, we don't know the set of valid parameters and thus can't do good error checking.
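To make the contrast concrete, here is an illustrative side-by-side of a flat-parameter request and a structured JSON equivalent (field and facet names are hypothetical; the exact syntax is described in the blog post linked above):

```
# Flat query parameters: untyped strings, hard to decipher or validate
q=*:*&facet=true&facet.range=price&f.price.facet.range.start=5&f.price.facet.range.end=100&f.price.facet.range.gap=10

# Structured JSON request body: typed values, nested structure, checkable
{
  "query": "*:*",
  "facet": {
    "prices": {"type": "range", "field": "price", "start": 5, "end": 100, "gap": 10}
  }
}
```

In the JSON form, 5, 100, and 10 are numbers rather than strings, the range facet is a single nested object instead of three prefixed parameters, and an unknown key can be rejected up front.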
[jira] [Commented] (SOLR-7216) JSON Request API
[ https://issues.apache.org/jira/browse/SOLR-7216?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14372230#comment-14372230 ]

ASF subversion and git services commented on SOLR-7216:
-------------------------------------------------------

Commit 1668170 from [~yo...@apache.org] in branch 'dev/branches/branch_5x' [ https://svn.apache.org/r1668170 ]

SOLR-7216: JSON request API

JSON Request API
Key: SOLR-7216
URL: https://issues.apache.org/jira/browse/SOLR-7216
Project: Solr
Issue Type: New Feature
Reporter: Yonik Seeley
Attachments: SOLR-7216.patch

http://yonik.com/solr-json-request-api/

The drawbacks to only having a query-parameter API include:
- Inherently un-structured, requiring unsightly parameters like f.facet_name.facet.range.start=5
- Inherently un-typed: everything is a string.
- More difficult to decipher large requests.
- Harder to programmatically create a request.
- Impossible to validate. Because of the lack of structure, we don't know the set of valid parameters and thus can't do good error checking.
[JENKINS] Lucene-Solr-trunk-Windows (64bit/jdk1.8.0_40) - Build # 4570 - Failure!
Build: http://jenkins.thetaphi.de/job/Lucene-Solr-trunk-Windows/4570/
Java: 64bit/jdk1.8.0_40 -XX:+UseCompressedOops -XX:+UseConcMarkSweepGC

2 tests failed.

FAILED: junit.framework.TestSuite.org.apache.solr.core.TestSolrConfigHandler

Error Message:
Could not remove the following files (in the order of attempts):
   C:\Users\JenkinsSlave\workspace\Lucene-Solr-trunk-Windows\solr\build\solr-core\test\J1\temp\solr.core.TestSolrConfigHandler E60C17712B7412B-001\tempDir-010\collection1\conf\params.json: java.nio.file.FileSystemException: C:\Users\JenkinsSlave\workspace\Lucene-Solr-trunk-Windows\solr\build\solr-core\test\J1\temp\solr.core.TestSolrConfigHandler E60C17712B7412B-001\tempDir-010\collection1\conf\params.json: The process cannot access the file because it is being used by another process.
   C:\Users\JenkinsSlave\workspace\Lucene-Solr-trunk-Windows\solr\build\solr-core\test\J1\temp\solr.core.TestSolrConfigHandler E60C17712B7412B-001\tempDir-010\collection1\conf: java.nio.file.DirectoryNotEmptyException: C:\Users\JenkinsSlave\workspace\Lucene-Solr-trunk-Windows\solr\build\solr-core\test\J1\temp\solr.core.TestSolrConfigHandler E60C17712B7412B-001\tempDir-010\collection1\conf
   C:\Users\JenkinsSlave\workspace\Lucene-Solr-trunk-Windows\solr\build\solr-core\test\J1\temp\solr.core.TestSolrConfigHandler E60C17712B7412B-001\tempDir-010\collection1: java.nio.file.DirectoryNotEmptyException: C:\Users\JenkinsSlave\workspace\Lucene-Solr-trunk-Windows\solr\build\solr-core\test\J1\temp\solr.core.TestSolrConfigHandler E60C17712B7412B-001\tempDir-010\collection1
   C:\Users\JenkinsSlave\workspace\Lucene-Solr-trunk-Windows\solr\build\solr-core\test\J1\temp\solr.core.TestSolrConfigHandler E60C17712B7412B-001\tempDir-010: java.nio.file.DirectoryNotEmptyException: C:\Users\JenkinsSlave\workspace\Lucene-Solr-trunk-Windows\solr\build\solr-core\test\J1\temp\solr.core.TestSolrConfigHandler E60C17712B7412B-001\tempDir-010
   C:\Users\JenkinsSlave\workspace\Lucene-Solr-trunk-Windows\solr\build\solr-core\test\J1\temp\solr.core.TestSolrConfigHandler E60C17712B7412B-001\tempDir-010: java.nio.file.DirectoryNotEmptyException: C:\Users\JenkinsSlave\workspace\Lucene-Solr-trunk-Windows\solr\build\solr-core\test\J1\temp\solr.core.TestSolrConfigHandler E60C17712B7412B-001\tempDir-010
   C:\Users\JenkinsSlave\workspace\Lucene-Solr-trunk-Windows\solr\build\solr-core\test\J1\temp\solr.core.TestSolrConfigHandler E60C17712B7412B-001: java.nio.file.DirectoryNotEmptyException: C:\Users\JenkinsSlave\workspace\Lucene-Solr-trunk-Windows\solr\build\solr-core\test\J1\temp\solr.core.TestSolrConfigHandler E60C17712B7412B-001

Stack Trace:
java.io.IOException: Could not remove the following files (in the order of attempts):
   C:\Users\JenkinsSlave\workspace\Lucene-Solr-trunk-Windows\solr\build\solr-core\test\J1\temp\solr.core.TestSolrConfigHandler E60C17712B7412B-001\tempDir-010\collection1\conf\params.json: java.nio.file.FileSystemException: C:\Users\JenkinsSlave\workspace\Lucene-Solr-trunk-Windows\solr\build\solr-core\test\J1\temp\solr.core.TestSolrConfigHandler E60C17712B7412B-001\tempDir-010\collection1\conf\params.json: The process cannot access the file because it is being used by another process.
   C:\Users\JenkinsSlave\workspace\Lucene-Solr-trunk-Windows\solr\build\solr-core\test\J1\temp\solr.core.TestSolrConfigHandler E60C17712B7412B-001\tempDir-010\collection1\conf: java.nio.file.DirectoryNotEmptyException: C:\Users\JenkinsSlave\workspace\Lucene-Solr-trunk-Windows\solr\build\solr-core\test\J1\temp\solr.core.TestSolrConfigHandler E60C17712B7412B-001\tempDir-010\collection1\conf
   C:\Users\JenkinsSlave\workspace\Lucene-Solr-trunk-Windows\solr\build\solr-core\test\J1\temp\solr.core.TestSolrConfigHandler E60C17712B7412B-001\tempDir-010\collection1: java.nio.file.DirectoryNotEmptyException: C:\Users\JenkinsSlave\workspace\Lucene-Solr-trunk-Windows\solr\build\solr-core\test\J1\temp\solr.core.TestSolrConfigHandler E60C17712B7412B-001\tempDir-010\collection1
   C:\Users\JenkinsSlave\workspace\Lucene-Solr-trunk-Windows\solr\build\solr-core\test\J1\temp\solr.core.TestSolrConfigHandler E60C17712B7412B-001\tempDir-010: java.nio.file.DirectoryNotEmptyException: C:\Users\JenkinsSlave\workspace\Lucene-Solr-trunk-Windows\solr\build\solr-core\test\J1\temp\solr.core.TestSolrConfigHandler E60C17712B7412B-001\tempDir-010
   C:\Users\JenkinsSlave\workspace\Lucene-Solr-trunk-Windows\solr\build\solr-core\test\J1\temp\solr.core.TestSolrConfigHandler E60C17712B7412B-001\tempDir-010: java.nio.file.DirectoryNotEmptyException: C:\Users\JenkinsSlave\workspace\Lucene-Solr-trunk-Windows\solr\build\solr-core\test\J1\temp\solr.core.TestSolrConfigHandler E60C17712B7412B-001\tempDir-010
   C:\Users\JenkinsSlave\workspace\Lucene-Solr-trunk-Windows\solr\build\solr-core\test\J1\temp\solr.core.TestSolrConfigHandler E60C17712B7412B-001: java.nio.file.DirectoryNotEmptyException:
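Failures like the one above stem from Windows refusing to delete a file while any process still holds an open handle to it (which in turn leaves the parent directories non-empty). A common mitigation is to retry the delete a few times with a short pause; a hedged sketch of that pattern (not the Lucene/Solr test framework's actual cleanup code):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

// Hedged sketch: retry deletion to ride out transient Windows file locks.
public class RetryingDelete {
    /** Returns true once the file is gone; false if all attempts failed. */
    public static boolean deleteWithRetries(Path path, int attempts, long sleepMillis)
            throws InterruptedException {
        for (int i = 0; i < attempts; i++) {
            try {
                Files.deleteIfExists(path); // throws e.g. FileSystemException if in use
                return true;
            } catch (IOException e) {
                // Another process may still hold the file open; wait and retry.
                Thread.sleep(sleepMillis);
            }
        }
        return false;
    }

    public static void main(String[] args) throws Exception {
        Path tmp = Files.createTempFile("retry-delete", ".tmp");
        // Nothing holds this file, so the first attempt succeeds.
        System.out.println(deleteWithRetries(tmp, 3, 10)); // prints "true"
    }
}
```

The retries only help with transient locks; a handle that is never closed (the likely bug behind the test failure) will exhaust every attempt, so the real fix is closing the resource that still has params.json open.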
[JENKINS] Lucene-Solr-trunk-Linux (64bit/jdk1.8.0_60-ea-b06) - Build # 12028 - Failure!
Build: http://jenkins.thetaphi.de/job/Lucene-Solr-trunk-Linux/12028/
Java: 64bit/jdk1.8.0_60-ea-b06 -XX:+UseCompressedOops -XX:+UseParallelGC

2 tests failed.

FAILED: org.apache.solr.cloud.ChaosMonkeyNothingIsSafeTest.test

Error Message:
There were too many update fails - we expect it can happen, but shouldn't easily

Stack Trace:
java.lang.AssertionError: There were too many update fails - we expect it can happen, but shouldn't easily
	at __randomizedtesting.SeedInfo.seed([C2BE47DB51D9879:847FDBA71BE1F581]:0)
	at org.junit.Assert.fail(Assert.java:93)
	at org.junit.Assert.assertTrue(Assert.java:43)
	at org.junit.Assert.assertFalse(Assert.java:68)
	at org.apache.solr.cloud.ChaosMonkeyNothingIsSafeTest.test(ChaosMonkeyNothingIsSafeTest.java:222)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:497)
	at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1627)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:836)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:872)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:886)
	at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsFixedStatement.callStatement(BaseDistributedSearchTestCase.java:958)
	at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsStatement.evaluate(BaseDistributedSearchTestCase.java:933)
	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:53)
	at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:50)
	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
	at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:49)
	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:798)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:458)
	at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:845)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$3.evaluate(RandomizedRunner.java:747)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$4.evaluate(RandomizedRunner.java:781)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:792)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:53)
	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:42)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:54)
	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
	at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:55)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)