[jira] [Commented] (SOLR-11598) ExportWriter should support sorting on more than 4 sort fields

2018-08-23 Thread Varun Thacker (JIRA)


[ 
https://issues.apache.org/jira/browse/SOLR-11598?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16591180#comment-16591180
 ] 

Varun Thacker commented on SOLR-11598:
--

While trying to debug this test case I ran into an NPE caused by this Jira, and 
was able to isolate it to a new test:
{code:java}
@Test
public void testSortingOnFieldWithNoValues() throws Exception {
  assertU(delQ("*:*"));
  assertU(commit());

  assertU(adoc("id","1"));
  assertU(commit());

  // 10 fields
  List<String> fieldNames = new ArrayList<>(Arrays.asList("floatdv", "intdv", 
"stringdv", "longdv", "doubledv",
  "datedv", "booleandv", "field1_s_dv", "field2_i_p", "field3_l_p"));
  for (String sortField : fieldNames) {
String resp = h.query(req("q", "*:*", "qt", "/export", "fl", "id," + 
sortField, "sort", sortField + " desc"));
assertJsonEquals(resp, "{\n" +
"  \"responseHeader\":{\"status\":0},\n" +
"  \"response\":{\n" +
"\"numFound\":1,\n" +
"\"docs\":[{\n" +
"\"id\":\"1\"}]}}");
  }

}{code}
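For reference, the NPE pattern here is the classic one: comparing documents when some lack a value for the sort field. A generic JDK-only sketch of the null-safe pattern (plain collections, not Solr's ExportWriter internals):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Comparator;
import java.util.List;

public class MissingValueSort {
    public static void main(String[] args) {
        // A document with no value for the sort field is modeled here as null.
        // A naive comparator throws an NPE on it; nullsLast instead sorts
        // such documents to the end of the result.
        List<String> values = new ArrayList<>(Arrays.asList("b", null, "a"));
        values.sort(Comparator.nullsLast(Comparator.<String>naturalOrder()));
        System.out.println(values); // [a, b, null]
    }
}
```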
 

> ExportWriter should support sorting on more than 4 sort fields
> --
>
> Key: SOLR-11598
> URL: https://issues.apache.org/jira/browse/SOLR-11598
> Project: Solr
>  Issue Type: Improvement
>  Security Level: Public(Default Security Level. Issues are Public) 
>  Components: streaming expressions
>Reporter: Aroop
>Assignee: Varun Thacker
>Priority: Major
>  Labels: patch
> Fix For: master (8.0), 7.5
>
> Attachments: SOLR-11598-6_6-streamtests, SOLR-11598-6_6.patch, 
> SOLR-11598-master.patch, SOLR-11598.patch, SOLR-11598.patch, 
> SOLR-11598.patch, SOLR-11598.patch, SOLR-11598.patch, SOLR-11598.patch, 
> SOLR-11598.patch, SOLR-11598.patch, SOLR-11598.patch, SOLR-11598.patch, 
> SOLR-11598.patch, streaming-export reports.xlsx
>
>
> I am a user of Streaming and I am currently trying to use rollups on a 
> 10-dimensional document.
> I am unable to get correct results on this query as I am bounded by the 
> limitation of the export handler which supports only 4 sort fields.
> I do not see why this needs to be the case, as it could very well be 10 or 20.
> My current needs would be satisfied with 10, but one would want to ask why 
> can't it be any decent integer n, beyond which we know performance degrades, 
> but even then it should be caveat emptor.
> [~varunthacker] 
> Code Link:
> https://github.com/apache/lucene-solr/blob/19db1df81a18e6eb2cce5be973bf2305d606a9f8/solr/core/src/java/org/apache/solr/handler/ExportWriter.java#L455
> Error
> null:java.io.IOException: A max of 4 sorts can be specified
>   at 
> org.apache.solr.handler.ExportWriter.getSortDoc(ExportWriter.java:452)
>   at org.apache.solr.handler.ExportWriter.writeDocs(ExportWriter.java:228)
>   at 
> org.apache.solr.handler.ExportWriter.lambda$null$1(ExportWriter.java:219)
>   at 
> org.apache.solr.common.util.JavaBinCodec.writeIterator(JavaBinCodec.java:664)
>   at 
> org.apache.solr.common.util.JavaBinCodec.writeKnownType(JavaBinCodec.java:333)
>   at 
> org.apache.solr.common.util.JavaBinCodec.writeVal(JavaBinCodec.java:223)
>   at org.apache.solr.common.util.JavaBinCodec$1.put(JavaBinCodec.java:394)
>   at 
> org.apache.solr.handler.ExportWriter.lambda$null$2(ExportWriter.java:219)
>   at 
> org.apache.solr.common.util.JavaBinCodec.writeMap(JavaBinCodec.java:437)
>   at 
> org.apache.solr.common.util.JavaBinCodec.writeKnownType(JavaBinCodec.java:354)
>   at 
> org.apache.solr.common.util.JavaBinCodec.writeVal(JavaBinCodec.java:223)
>   at org.apache.solr.common.util.JavaBinCodec$1.put(JavaBinCodec.java:394)
>   at 
> org.apache.solr.handler.ExportWriter.lambda$write$3(ExportWriter.java:217)
>   at 
> org.apache.solr.common.util.JavaBinCodec.writeMap(JavaBinCodec.java:437)
>   at org.apache.solr.handler.ExportWriter.write(ExportWriter.java:215)
>   at org.apache.solr.core.SolrCore$3.write(SolrCore.java:2601)
>   at 
> org.apache.solr.response.QueryResponseWriterUtil.writeQueryResponse(QueryResponseWriterUtil.java:49)
>   at 
> org.apache.solr.servlet.HttpSolrCall.writeResponse(HttpSolrCall.java:809)
>   at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:538)
>   at 
> org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:361)
>   at 
> org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:305)
>   at 
> org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1691)
>   at 
> org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:582)
>   at 
> org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
>   at 
> 
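The "A max of 4 sorts can be specified" check quoted above is a hard-coded arity limit. One generic way to lift such a limit is to compose any number of per-field comparators into a single comparator; a hypothetical JDK-only sketch, not the actual ExportWriter code:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Comparator;
import java.util.List;

public class MultiFieldSort {
    // Compose an arbitrary number of per-field comparators into one,
    // rather than hard-coding a maximum number of sort fields.
    static <T> Comparator<T> chain(List<Comparator<T>> fields) {
        Comparator<T> result = fields.get(0);
        for (int i = 1; i < fields.size(); i++) {
            result = result.thenComparing(fields.get(i));
        }
        return result;
    }

    public static void main(String[] args) {
        // Each "document" is an int array of field values;
        // sort by field 0, breaking ties on field 1.
        List<int[]> docs = new ArrayList<>(Arrays.asList(
                new int[]{1, 2}, new int[]{1, 1}, new int[]{0, 9}));
        List<Comparator<int[]>> fields = Arrays.asList(
                Comparator.<int[]>comparingInt(d -> d[0]),
                Comparator.<int[]>comparingInt(d -> d[1]));
        docs.sort(chain(fields));
        // Sorted order: {0,9}, {1,1}, {1,2}
        System.out.println(docs.get(0)[0] + "," + docs.get(0)[1]);
    }
}
```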

[JENKINS] Lucene-Solr-SmokeRelease-7.x - Build # 298 - Failure

2018-08-23 Thread Apache Jenkins Server
Build: https://builds.apache.org/job/Lucene-Solr-SmokeRelease-7.x/298/

No tests ran.

Build Log:
[...truncated 23261 lines...]
[asciidoctor:convert] asciidoctor: ERROR: about-this-guide.adoc: line 1: 
invalid part, must have at least one section (e.g., chapter, appendix, etc.)
[asciidoctor:convert] asciidoctor: WARNING: simulations.adoc: line 25: section 
title out of sequence: expected level 3, got level 4
[asciidoctor:convert] asciidoctor: WARNING: simulations.adoc: line 89: section 
title out of sequence: expected level 3, got level 4
[asciidoctor:convert] asciidoctor: ERROR: solr-glossary.adoc: line 1: invalid 
part, must have at least one section (e.g., chapter, appendix, etc.)
[asciidoctor:convert] asciidoctor: WARNING: simulations.adoc: line 25: section 
title out of sequence: expected levels 0 or 1, got level 2
[asciidoctor:convert] asciidoctor: WARNING: simulations.adoc: line 89: section 
title out of sequence: expected levels 0 or 1, got level 2
 [java] Processed 2312 links (1864 relative) to 3144 anchors in 246 files
 [echo] Validated Links & Anchors via: 
/home/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-7.x/solr/build/solr-ref-guide/bare-bones-html/
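For context, the "section title out of sequence" warnings above mean a heading skipped a nesting level: Asciidoctor requires each nested section to be exactly one level deeper than its parent. A minimal illustration with hypothetical headings (not the actual simulations.adoc content):

```asciidoc
// Each section must be exactly one level deeper than its parent.
== Simulations

=== Configuring

==== Advanced options

== Results

// Jumping from level 1 (==) straight to level 3 (====) triggers
// "section title out of sequence: expected level 2, got level 3".
==== Bad heading
```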

-dist-changes:
 [copy] Copying 4 files to 
/home/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-7.x/solr/package/changes

-dist-keys:
  [get] Getting: http://home.apache.org/keys/group/lucene.asc
  [get] To: 
/home/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-7.x/solr/package/KEYS

package:

-unpack-solr-tgz:

-ensure-solr-tgz-exists:
[mkdir] Created dir: 
/home/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-7.x/solr/build/solr.tgz.unpacked
[untar] Expanding: 
/home/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-7.x/solr/package/solr-7.5.0.tgz
 into 
/home/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-7.x/solr/build/solr.tgz.unpacked

generate-maven-artifacts:

resolve:

resolve:

ivy-availability-check:
[loadresource] Do not set property disallowed.ivy.jars.list as its length is 0.

-ivy-fail-disallowed-ivy-version:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = 
/home/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-7.x/lucene/top-level-ivy-settings.xml

resolve:

ivy-availability-check:
[loadresource] Do not set property disallowed.ivy.jars.list as its length is 0.

-ivy-fail-disallowed-ivy-version:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = 
/home/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-7.x/lucene/top-level-ivy-settings.xml

resolve:

resolve:

ivy-availability-check:
[loadresource] Do not set property disallowed.ivy.jars.list as its length is 0.

-ivy-fail-disallowed-ivy-version:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = 
/home/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-7.x/lucene/top-level-ivy-settings.xml

resolve:

ivy-availability-check:
[loadresource] Do not set property disallowed.ivy.jars.list as its length is 0.

-ivy-fail-disallowed-ivy-version:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = 
/home/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-7.x/lucene/top-level-ivy-settings.xml

resolve:

ivy-availability-check:
[loadresource] Do not set property disallowed.ivy.jars.list as its length is 0.

-ivy-fail-disallowed-ivy-version:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = 
/home/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-7.x/lucene/top-level-ivy-settings.xml

resolve:

ivy-availability-check:
[loadresource] Do not set property disallowed.ivy.jars.list as its length is 0.

-ivy-fail-disallowed-ivy-version:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = 
/home/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-7.x/lucene/top-level-ivy-settings.xml

resolve:

ivy-availability-check:
[loadresource] Do not set property disallowed.ivy.jars.list as its length is 0.

-ivy-fail-disallowed-ivy-version:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = 
/home/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-7.x/lucene/top-level-ivy-settings.xml

resolve:

ivy-availability-check:
[loadresource] Do not set property disallowed.ivy.jars.list as its length is 0.

-ivy-fail-disallowed-ivy-version:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = 
/home/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-7.x/lucene/top-level-ivy-settings.xml

resolve:

ivy-availability-check:
[loadresource] Do not set property disallowed.ivy.jars.list as its length is 0.

-ivy-fail-disallowed-ivy-version:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = 
/home/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-7.x/lucene/top-level-ivy-settings.xml

resolve:

ivy-availability-check:
[loadresource] Do not set property disallowed.ivy.jars.list as its 

[JENKINS-EA] Lucene-Solr-BadApples-master-Linux (64bit/jdk-11-ea+25) - Build # 83 - Still Unstable!

2018-08-23 Thread Policeman Jenkins Server
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-BadApples-master-Linux/83/
Java: 64bit/jdk-11-ea+25 -XX:+UseCompressedOops -XX:+UseConcMarkSweepGC

59 tests failed.
FAILED:  
org.apache.solr.client.solrj.io.stream.StreamDecoratorTest.testParallelComplementStream

Error Message:
java.util.concurrent.ExecutionException: java.io.IOException: --> 
https://127.0.0.1:37533/solr/collection1_collection_shard2_replica_n2/:java.util.concurrent.ExecutionException:
 java.io.IOException: params 
q=a_s:(setA+||+setAB)=id,a_s,a_i=a_i+asc,+a_s+asc=a_i=false

Stack Trace:
java.io.IOException: java.util.concurrent.ExecutionException: 
java.io.IOException: --> 
https://127.0.0.1:37533/solr/collection1_collection_shard2_replica_n2/:java.util.concurrent.ExecutionException:
 java.io.IOException: params 
q=a_s:(setA+||+setAB)=id,a_s,a_i=a_i+asc,+a_s+asc=a_i=false
at 
__randomizedtesting.SeedInfo.seed([855B2A75E4B8E45F:B025A7584A8A0DC5]:0)
at 
org.apache.solr.client.solrj.io.stream.CloudSolrStream.openStreams(CloudSolrStream.java:400)
at 
org.apache.solr.client.solrj.io.stream.CloudSolrStream.open(CloudSolrStream.java:275)
at 
org.apache.solr.client.solrj.io.stream.StreamDecoratorTest.getTuples(StreamDecoratorTest.java:3953)
at 
org.apache.solr.client.solrj.io.stream.StreamDecoratorTest.testParallelComplementStream(StreamDecoratorTest.java:3941)
at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1737)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:934)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:970)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:984)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:943)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:829)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:879)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:890)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 

[jira] [Commented] (LUCENE-8465) Remove legacy auto-prefix logic from IntersectTermsEnum

2018-08-23 Thread David Smiley (JIRA)


[ 
https://issues.apache.org/jira/browse/LUCENE-8465?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16591142#comment-16591142
 ] 

David Smiley commented on LUCENE-8465:
--

Nice.  If you're in the mood, you might also want to strip out most references 
to "auto-prefix" in javadocs/comments as well.  For example 
BlockTreeTermsReader lines 62-69.

> Remove legacy auto-prefix logic from IntersectTermsEnum
> ---
>
> Key: LUCENE-8465
> URL: https://issues.apache.org/jira/browse/LUCENE-8465
> Project: Lucene - Core
>  Issue Type: Task
>Reporter: Adrien Grand
>Priority: Minor
> Attachments: LUCENE-8465.patch
>
>
> We forgot to remove some logic related with auto-prefix terms from 
> IntersectTermsEnum.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[JENKINS] Lucene-Solr-NightlyTests-7.x - Build # 300 - Still Unstable

2018-08-23 Thread Apache Jenkins Server
Build: https://builds.apache.org/job/Lucene-Solr-NightlyTests-7.x/300/

1 tests failed.
FAILED:  org.apache.solr.core.OpenCloseCoreStressTest.test10Minutes

Error Message:
Captured an uncaught exception in thread: Thread[id=4917, name=Lucene Merge 
Thread #1, state=RUNNABLE, group=TGRP-OpenCloseCoreStressTest]

Stack Trace:
com.carrotsearch.randomizedtesting.UncaughtExceptionError: Captured an uncaught 
exception in thread: Thread[id=4917, name=Lucene Merge Thread #1, 
state=RUNNABLE, group=TGRP-OpenCloseCoreStressTest]
Caused by: org.apache.lucene.index.MergePolicy$MergeException: 
java.nio.file.FileSystemException: 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-7.x/checkout/solr/build/solr-core/test/J0/temp/solr.core.OpenCloseCoreStressTest_D3D0DA8B51BF1BEF-001/index-SimpleFSDirectory-016/_2m_Lucene50_0.doc:
 Too many open files
at __randomizedtesting.SeedInfo.seed([D3D0DA8B51BF1BEF]:0)
at 
org.apache.lucene.index.ConcurrentMergeScheduler.handleMergeException(ConcurrentMergeScheduler.java:704)
at 
org.apache.lucene.index.ConcurrentMergeScheduler$MergeThread.run(ConcurrentMergeScheduler.java:684)
Caused by: java.nio.file.FileSystemException: 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-7.x/checkout/solr/build/solr-core/test/J0/temp/solr.core.OpenCloseCoreStressTest_D3D0DA8B51BF1BEF-001/index-SimpleFSDirectory-016/_2m_Lucene50_0.doc:
 Too many open files
at 
org.apache.lucene.mockfile.HandleLimitFS.onOpen(HandleLimitFS.java:48)
at 
org.apache.lucene.mockfile.HandleTrackingFS.callOpenHook(HandleTrackingFS.java:81)
at 
org.apache.lucene.mockfile.HandleTrackingFS.newOutputStream(HandleTrackingFS.java:160)
at java.nio.file.Files.newOutputStream(Files.java:216)
at 
org.apache.lucene.store.FSDirectory$FSIndexOutput.<init>(FSDirectory.java:411)
at 
org.apache.lucene.store.FSDirectory$FSIndexOutput.<init>(FSDirectory.java:407)
at 
org.apache.lucene.store.FSDirectory.createOutput(FSDirectory.java:255)
at 
org.apache.lucene.store.FilterDirectory.createOutput(FilterDirectory.java:74)
at 
org.apache.lucene.store.LockValidatingDirectoryWrapper.createOutput(LockValidatingDirectoryWrapper.java:44)
at 
org.apache.lucene.index.ConcurrentMergeScheduler$1.createOutput(ConcurrentMergeScheduler.java:288)
at 
org.apache.lucene.store.TrackingDirectoryWrapper.createOutput(TrackingDirectoryWrapper.java:43)
at 
org.apache.lucene.codecs.lucene50.Lucene50PostingsWriter.<init>(Lucene50PostingsWriter.java:105)
at 
org.apache.lucene.codecs.lucene50.Lucene50PostingsFormat.fieldsConsumer(Lucene50PostingsFormat.java:423)
at 
org.apache.lucene.codecs.perfield.PerFieldPostingsFormat$FieldsWriter.merge(PerFieldPostingsFormat.java:162)
at 
org.apache.lucene.index.SegmentMerger.mergeTerms(SegmentMerger.java:231)
at org.apache.lucene.index.SegmentMerger.merge(SegmentMerger.java:116)
at 
org.apache.lucene.index.IndexWriter.mergeMiddle(IndexWriter.java:4438)
at org.apache.lucene.index.IndexWriter.merge(IndexWriter.java:4060)
at 
org.apache.solr.update.SolrIndexWriter.merge(SolrIndexWriter.java:196)
at 
org.apache.lucene.index.ConcurrentMergeScheduler.doMerge(ConcurrentMergeScheduler.java:625)
at 
org.apache.lucene.index.ConcurrentMergeScheduler$MergeThread.run(ConcurrentMergeScheduler.java:662)
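The "Too many open files" failure above is the merge thread exhausting the per-process file-descriptor limit (here imposed artificially by the test's HandleLimitFS). On a real system the equivalent limits can be inspected like this (Linux/bash commands; paths and limits vary per machine):

```shell
# Show the current soft limit on open file descriptors for this shell.
ulimit -n

# Show the hard ceiling the soft limit may be raised to.
ulimit -Hn

# On Linux, count the descriptors the current process actually has open.
ls /proc/self/fd | wc -l
```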




Build Log:
[...truncated 13212 lines...]
   [junit4] Suite: org.apache.solr.core.OpenCloseCoreStressTest
   [junit4]   2> 323971 INFO  
(SUITE-OpenCloseCoreStressTest-seed#[D3D0DA8B51BF1BEF]-worker) [] 
o.a.s.SolrTestCaseJ4 SecureRandom sanity checks: 
test.solr.allowed.securerandom=null & java.security.egd=file:/dev/./urandom
   [junit4]   2> Creating dataDir: 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-7.x/checkout/solr/build/solr-core/test/J0/temp/solr.core.OpenCloseCoreStressTest_D3D0DA8B51BF1BEF-001/init-core-data-001
   [junit4]   2> 323972 INFO  
(SUITE-OpenCloseCoreStressTest-seed#[D3D0DA8B51BF1BEF]-worker) [] 
o.a.s.SolrTestCaseJ4 Using PointFields (NUMERIC_POINTS_SYSPROP=true) 
w/NUMERIC_DOCVALUES_SYSPROP=true
   [junit4]   2> 323974 INFO  
(SUITE-OpenCloseCoreStressTest-seed#[D3D0DA8B51BF1BEF]-worker) [] 
o.a.s.SolrTestCaseJ4 Randomized ssl (true) and clientAuth (false) via: 
@org.apache.solr.util.RandomizeSSL(reason=, value=NaN, ssl=NaN, clientAuth=NaN)
   [junit4]   2> 323975 INFO  
(TEST-OpenCloseCoreStressTest.test15Seconds-seed#[D3D0DA8B51BF1BEF]) [] 
o.a.s.SolrTestCaseJ4 ###Starting test15Seconds
   [junit4]   2> 323988 INFO  
(TEST-OpenCloseCoreStressTest.test15Seconds-seed#[D3D0DA8B51BF1BEF]) [] 
o.e.j.s.Server jetty-9.4.11.v20180605; built: 2018-06-05T18:24:03.829Z; git: 
d5fc0523cfa96bfebfbda19606cad384d772f04c; jvm 1.8.0_172-b11
   [junit4]   2> 323989 INFO  
(TEST-OpenCloseCoreStressTest.test15Seconds-seed#[D3D0DA8B51BF1BEF]) [] 
o.e.j.s.session DefaultSessionIdManager workerName=node0
 

[jira] [Commented] (SOLR-11598) ExportWriter should support sorting on more than 4 sort fields

2018-08-23 Thread Varun Thacker (JIRA)


[ 
https://issues.apache.org/jira/browse/SOLR-11598?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16591118#comment-16591118
 ] 

Varun Thacker commented on SOLR-11598:
--

Thanks Steve! Amrit and I are looking into it.


[JENKINS] Lucene-Solr-master-Linux (64bit/jdk-9.0.4) - Build # 22733 - Failure!

2018-08-23 Thread Policeman Jenkins Server
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-master-Linux/22733/
Java: 64bit/jdk-9.0.4 -XX:-UseCompressedOops -XX:+UseParallelGC

All tests passed

Build Log:
[...truncated 14172 lines...]
   [junit4] JVM J2: stdout was not empty, see: 
/home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build/solr-core/test/temp/junit4-J2-20180824_015307_60515658941903744397590.sysout
   [junit4] >>> JVM J2 emitted unexpected output (verbatim) 
   [junit4] #
   [junit4] # A fatal error has been detected by the Java Runtime Environment:
   [junit4] #
   [junit4] #  SIGSEGV (0xb) at pc=0x7ff3a825bdc9, pid=26098, tid=26153
   [junit4] #
   [junit4] # JRE version: Java(TM) SE Runtime Environment (9.0+11) (build 
9.0.4+11)
   [junit4] # Java VM: Java HotSpot(TM) 64-Bit Server VM (9.0.4+11, mixed mode, 
tiered, parallel gc, linux-amd64)
   [junit4] # Problematic frame:
   [junit4] # V  [libjvm.so+0xc5adc9]  PhaseIdealLoop::split_up(Node*, Node*, 
Node*) [clone .part.40]+0x619
   [junit4] #
   [junit4] # No core dump will be written. Core dumps have been disabled. To 
enable core dumping, try "ulimit -c unlimited" before starting Java again
   [junit4] #
   [junit4] # An error report file with more information is saved as:
   [junit4] # 
/home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build/solr-core/test/J2/hs_err_pid26098.log
   [junit4] #
   [junit4] # Compiler replay data is saved as:
   [junit4] # 
/home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build/solr-core/test/J2/replay_pid26098.log
   [junit4] #
   [junit4] # If you would like to submit a bug report, please visit:
   [junit4] #   http://bugreport.java.com/bugreport/crash.jsp
   [junit4] #
   [junit4] <<< JVM J2: EOF 

[...truncated 1121 lines...]
   [junit4] ERROR: JVM J2 ended with an exception, command line: 
/home/jenkins/tools/java/64bit/jdk-9.0.4/bin/java -XX:-UseCompressedOops 
-XX:+UseParallelGC -XX:+HeapDumpOnOutOfMemoryError 
-XX:HeapDumpPath=/home/jenkins/workspace/Lucene-Solr-master-Linux/heapdumps -ea 
-esa --illegal-access=deny -Dtests.prefix=tests -Dtests.seed=E4061B05643A1767 
-Xmx512M -Dtests.iters= -Dtests.verbose=false -Dtests.infostream=false 
-Dtests.codec=random -Dtests.postingsformat=random 
-Dtests.docvaluesformat=random -Dtests.locale=random -Dtests.timezone=random 
-Dtests.directory=random -Dtests.linedocsfile=europarl.lines.txt.gz 
-Dtests.luceneMatchVersion=8.0.0 -Dtests.cleanthreads=perClass 
-Djava.util.logging.config.file=/home/jenkins/workspace/Lucene-Solr-master-Linux/lucene/tools/junit4/logging.properties
 -Dtests.nightly=false -Dtests.weekly=false -Dtests.monster=false 
-Dtests.slow=true -Dtests.asserts=true -Dtests.multiplier=3 -DtempDir=./temp 
-Djava.io.tmpdir=./temp 
-Djunit4.tempDir=/home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build/solr-core/test/temp
 -Dcommon.dir=/home/jenkins/workspace/Lucene-Solr-master-Linux/lucene 
-Dclover.db.dir=/home/jenkins/workspace/Lucene-Solr-master-Linux/lucene/build/clover/db
 
-Djava.security.policy=/home/jenkins/workspace/Lucene-Solr-master-Linux/lucene/tools/junit4/solr-tests.policy
 -Dtests.LUCENE_VERSION=8.0.0 -Djetty.testMode=1 -Djetty.insecurerandom=1 
-Dsolr.directoryFactory=org.apache.solr.core.MockDirectoryFactory 
-Djava.awt.headless=true -Djdk.map.althashing.threshold=0 
-Dtests.src.home=/home/jenkins/workspace/Lucene-Solr-master-Linux 
-Djava.security.egd=file:/dev/./urandom 
-Djunit4.childvm.cwd=/home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build/solr-core/test/J2
 -Djunit4.childvm.id=2 -Djunit4.childvm.count=3 -Dfile.encoding=ISO-8859-1 
-Djava.security.manager=org.apache.lucene.util.TestSecurityManager 
-Dtests.filterstacks=true -Dtests.leaveTemporary=false -Dtests.badapples=false 
-classpath 

[JENKINS-EA] Lucene-Solr-7.x-Linux (64bit/jdk-11-ea+25) - Build # 2618 - Unstable!

2018-08-23 Thread Policeman Jenkins Server
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-7.x-Linux/2618/
Java: 64bit/jdk-11-ea+25 -XX:-UseCompressedOops -XX:+UseParallelGC

7 tests failed.
FAILED:  junit.framework.TestSuite.org.apache.solr.schema.SchemaApiFailureTest

Error Message:


Stack Trace:
java.util.concurrent.TimeoutException
at __randomizedtesting.SeedInfo.seed([B499A73D9291A2CF]:0)
at 
org.apache.solr.common.cloud.ZkStateReader.waitForState(ZkStateReader.java:1469)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.waitForState(CloudSolrClient.java:450)
at 
org.apache.solr.schema.SchemaApiFailureTest.setupCluster(SchemaApiFailureTest.java:43)
at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1737)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:874)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:890)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at java.base/java.lang.Thread.run(Thread.java:834)


FAILED:  junit.framework.TestSuite.org.apache.solr.schema.SchemaApiFailureTest

Error Message:


Stack Trace:
java.util.concurrent.TimeoutException
at __randomizedtesting.SeedInfo.seed([B499A73D9291A2CF]:0)
at 
org.apache.solr.common.cloud.ZkStateReader.waitForState(ZkStateReader.java:1469)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.waitForState(CloudSolrClient.java:450)
at 
org.apache.solr.schema.SchemaApiFailureTest.setupCluster(SchemaApiFailureTest.java:43)
at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1737)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:874)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:890)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 

[jira] [Commented] (SOLR-11598) ExportWriter should support sorting on more than 4 sort fields

2018-08-23 Thread Steve Rowe (JIRA)


[ 
https://issues.apache.org/jira/browse/SOLR-11598?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16590974#comment-16590974
 ] 

Steve Rowe commented on SOLR-11598:
---

Reproducing failure from 
[https://jenkins.thetaphi.de/job/Lucene-Solr-master-Linux/22732/]:

{noformat}
Checking out Revision 4368ad72d2ccbb40583fa7d2e55464c47e341f8b 
(refs/remotes/origin/master)
[...]
   [junit4]   2> NOTE: reproduce with: ant test  -Dtestcase=TestExportWriter 
-Dtests.method=testMultipleSorts -Dtests.seed=FBDB2884CBB0889D 
-Dtests.multiplier=3 -Dtests.slow=true -Dtests.locale=fi 
-Dtests.timezone=Antarctica/Troll -Dtests.asserts=true 
-Dtests.file.encoding=UTF-8
   [junit4] FAILURE 9.95s J2 | TestExportWriter.testMultipleSorts <<<
   [junit4]> Throwable #1: java.lang.AssertionError
   [junit4]>at 
__randomizedtesting.SeedInfo.seed([FBDB2884CBB0889D:D2E9E06DE2A595BB]:0)
   [junit4]>at 
org.apache.solr.handler.export.TestExportWriter.validateSort(TestExportWriter.java:697)
   [junit4]>at 
org.apache.solr.handler.export.TestExportWriter.testMultipleSorts(TestExportWriter.java:637)
   [junit4]>at java.lang.Thread.run(Thread.java:748)
[...]
   [junit4]   2> NOTE: test params are: codec=Asserting(Lucene80): 
{id=PostingsFormat(name=LuceneVarGapDocFreqInterval), field1_s_dv=FSTOrd50}, 
docValues:{number_i_p=DocValuesFormat(name=Asserting), 
field1_i_p=DocValuesFormat(name=Asserting), 
number_ls_ni_p=DocValuesFormat(name=Asserting), 
number_ls_t=DocValuesFormat(name=Asserting), 
number_ds_p=DocValuesFormat(name=Asserting), 
number_ds_t=DocValuesFormat(name=Asserting), 
booleandv=DocValuesFormat(name=Asserting), 
number_i_t=DocValuesFormat(name=Asserting), 
field2_i_p=DocValuesFormat(name=Lucene70), 
number_is_ni_p=DocValuesFormat(name=Lucene70), 
number_is_ni_t=DocValuesFormat(name=Lucene70), 
longdv=DocValuesFormat(name=Lucene70), 
number_ls_p=DocValuesFormat(name=Asserting), id=DocValuesFormat(name=Lucene70), 
field1_s_dv=DocValuesFormat(name=Lucene70), 
number_dts_p=DocValuesFormat(name=Asserting), 
stringdv=DocValuesFormat(name=Lucene70), 
longdv_m=DocValuesFormat(name=Asserting), 
number_ls_ni_t=DocValuesFormat(name=Asserting), 
number_is_t=DocValuesFormat(name=Direct), 
number_dts_t=DocValuesFormat(name=Asserting), 
doubledv_m=DocValuesFormat(name=Direct), 
number_is_p=DocValuesFormat(name=Direct), 
doubledv=DocValuesFormat(name=Lucene70), 
field4_i_p=DocValuesFormat(name=Direct), 
number_dts_ni_t=DocValuesFormat(name=Asserting), 
field3_i_p=DocValuesFormat(name=Lucene70), intdv=DocValuesFormat(name=Direct), 
floatdv=DocValuesFormat(name=Lucene70), 
number_fs_p=DocValuesFormat(name=Lucene70), 
number_f_t=DocValuesFormat(name=Direct), 
field6_i_p=DocValuesFormat(name=Lucene70), 
number_l_p=DocValuesFormat(name=Lucene70), 
number_dt_p=DocValuesFormat(name=Lucene70), 
number_f_p=DocValuesFormat(name=Direct), 
number_d_t=DocValuesFormat(name=Lucene70), 
number_dt_t=DocValuesFormat(name=Lucene70), 
int_is_t=DocValuesFormat(name=Lucene70), 
number_dts_ni_p=DocValuesFormat(name=Asserting), 
number_fs_t=DocValuesFormat(name=Lucene70), 
datedv_m=DocValuesFormat(name=Asserting), 
int_is_p=DocValuesFormat(name=Lucene70), 
number_l_t=DocValuesFormat(name=Lucene70), 
number_ds_ni_t=DocValuesFormat(name=Lucene70), 
stringdv_m=DocValuesFormat(name=Direct), 
intdv_m=DocValuesFormat(name=Lucene70), 
number_ds_ni_p=DocValuesFormat(name=Lucene70), 
number_d_ni_p=DocValuesFormat(name=Direct), 
field5_i_p=DocValuesFormat(name=Asserting), 
number_fs_ni_p=DocValuesFormat(name=Lucene70), 
number_d_ni_t=DocValuesFormat(name=Direct), 
number_fs_ni_t=DocValuesFormat(name=Lucene70), 
field8_i_p=DocValuesFormat(name=Direct), 
datedv=DocValuesFormat(name=Asserting), 
number_f_ni_t=DocValuesFormat(name=Lucene70), 
floatdv_m=DocValuesFormat(name=Asserting), 
number_l_ni_t=DocValuesFormat(name=Direct), 
number_i_ni_p=DocValuesFormat(name=Lucene70), 
number_dt_ni_p=DocValuesFormat(name=Direct), 
number_f_ni_p=DocValuesFormat(name=Lucene70), 
number_i_ni_t=DocValuesFormat(name=Lucene70), 
number_dt_ni_t=DocValuesFormat(name=Direct), 
number_d_p=DocValuesFormat(name=Lucene70), 
field3_l_p=DocValuesFormat(name=Lucene70), 
field7_i_p=DocValuesFormat(name=Lucene70), 
number_l_ni_p=DocValuesFormat(name=Direct)}, maxPointsInLeafNode=740, 
maxMBSortInHeap=7.377656711554688, 
sim=Asserting(org.apache.lucene.search.similarities.AssertingSimilarity@2e7b4e25),
 locale=fi, timezone=Antarctica/Troll
   [junit4]   2> NOTE: Linux 4.15.0-32-generic amd64/Oracle Corporation 
1.8.0_172 (64-bit)/cpus=8,threads=1,free=248750568,total=518979584
{noformat}

> ExportWriter should support sorting on more than 4 sort fields
> --
>
> Key: SOLR-11598
> URL: https://issues.apache.org/jira/browse/SOLR-11598
> Project: Solr
>  Issue Type: Improvement
>  

[jira] [Commented] (SOLR-12028) BadApple and AwaitsFix annotations usage

2018-08-23 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/SOLR-12028?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16590961#comment-16590961
 ] 

ASF subversion and git services commented on SOLR-12028:


Commit b3daa6ce5f7bf74ef148a58e609af1c43a283f44 in lucene-solr's branch 
refs/heads/branch_7x from [~cp.erick...@gmail.com]
[ https://git-wip-us.apache.org/repos/asf?p=lucene-solr.git;h=b3daa6c ]

SOLR-12028: BadApple and AwaitsFix annotations usage

(cherry picked from commit aa10cb7802ca2f2e0159a84c180193db43ca7926)


> BadApple and AwaitsFix annotations usage
> 
>
> Key: SOLR-12028
> URL: https://issues.apache.org/jira/browse/SOLR-12028
> Project: Solr
>  Issue Type: Task
>  Security Level: Public(Default Security Level. Issues are Public) 
>  Components: Tests
>Reporter: Erick Erickson
>Assignee: Erick Erickson
>Priority: Major
> Attachments: SOLR-12016-buildsystem.patch, SOLR-12028-3-Mar.patch, 
> SOLR-12028-sysprops-reproduce.patch, SOLR-12028.patch, SOLR-12028.patch
>
>
> There's a long discussion of this topic at SOLR-12016. Here's a summary:
> - BadApple annotations are used for tests that intermittently fail, say < 30% 
> of the time. Tests that fail more often should be moved to AwaitsFix. This is, 
> of course, a judgement call
> - AwaitsFix annotations are used for tests that, for some reason, the problem 
> can't be fixed immediately. Likely reasons are third-party dependencies, 
> extreme difficulty tracking down, dependency on another JIRA etc.
> Jenkins jobs will typically run with BadApple disabled to cut down on noise. 
> Periodically Jenkins jobs will be run with BadApples enabled so BadApple 
> tests won't be lost and reports can be generated. Tests that run with 
> BadApples disabled that fail require _immediate_ attention.
> The default for developers is that BadApple is enabled.
> If you are working on one of these tests and cannot get the test to fail 
> locally, it is perfectly acceptable to comment the annotation out. You should 
> let the dev list know that this is deliberate.
> This JIRA is a placeholder for BadApple tests to point to between the times 
> they're identified as BadApple and they're either fixed or changed to 
> AwaitsFix or assigned their own JIRA.
> I've assigned this to myself to track so I don't lose track of it. No one 
> person will fix all of these issues, this will be an ongoing technical debt 
> cleanup effort.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



Re: benchmark drop for PrimaryKey

2018-08-23 Thread Michael Sokolov
I think the benchmarks need updating after LUCENE-8461. I got them working
again by replacing lucene70 with lucene80 everywhere except for the
DocValues formats, and adding the backward-codecs.jar to the benchmarks
build. I'm not sure that was really the right way to go about this? After
that I did try switching to use FST50 for this PKLookup benchmark (see
below), but it did not recover the lost perf.

diff --git a/src/python/nightlyBench.py b/src/python/nightlyBench.py
index b42fe84..5807e49 100644
--- a/src/python/nightlyBench.py
+++ b/src/python/nightlyBench.py
@@ -699,7 +699,7 @@ def run():
-  idFieldPostingsFormat='Lucene50',
+  idFieldPostingsFormat='FST50',


On Thu, Aug 23, 2018 at 5:52 PM Michael Sokolov  wrote:

> OK thanks. I guess this benchmark must be run on a large-enough index that
> it doesn't fit entirely in RAM already anyway? When I ran it locally using
> the vanilla benchmark instructions, I believe the generated index was quite
> small (wikimedium10k).  At any rate, I don't have any specific use case
> yet, just thinking about some possibilities related to primary key lookup
> and came across this anomaly. Perhaps at least it deserves an annotation on
> the benchmark graph.
>


[jira] [Commented] (SOLR-12028) BadApple and AwaitsFix annotations usage

2018-08-23 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/SOLR-12028?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16590956#comment-16590956
 ] 

ASF subversion and git services commented on SOLR-12028:


Commit aa10cb7802ca2f2e0159a84c180193db43ca7926 in lucene-solr's branch 
refs/heads/master from [~cp.erick...@gmail.com]
[ https://git-wip-us.apache.org/repos/asf?p=lucene-solr.git;h=aa10cb7 ]

SOLR-12028: BadApple and AwaitsFix annotations usage


> BadApple and AwaitsFix annotations usage
> 
>
> Key: SOLR-12028
> URL: https://issues.apache.org/jira/browse/SOLR-12028
> Project: Solr
>  Issue Type: Task
>  Security Level: Public(Default Security Level. Issues are Public) 
>  Components: Tests
>Reporter: Erick Erickson
>Assignee: Erick Erickson
>Priority: Major
> Attachments: SOLR-12016-buildsystem.patch, SOLR-12028-3-Mar.patch, 
> SOLR-12028-sysprops-reproduce.patch, SOLR-12028.patch, SOLR-12028.patch
>
>






[JENKINS] Lucene-Solr-master-Linux (64bit/jdk1.8.0_172) - Build # 22732 - Still Unstable!

2018-08-23 Thread Policeman Jenkins Server
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-master-Linux/22732/
Java: 64bit/jdk1.8.0_172 -XX:-UseCompressedOops -XX:+UseConcMarkSweepGC

4 tests failed.
FAILED:  org.apache.solr.handler.export.TestExportWriter.testMultipleSorts

Error Message:


Stack Trace:
java.lang.AssertionError
at 
__randomizedtesting.SeedInfo.seed([FBDB2884CBB0889D:D2E9E06DE2A595BB]:0)
at org.junit.Assert.fail(Assert.java:92)
at org.junit.Assert.assertTrue(Assert.java:43)
at org.junit.Assert.assertTrue(Assert.java:54)
at 
org.apache.solr.handler.export.TestExportWriter.validateSort(TestExportWriter.java:697)
at 
org.apache.solr.handler.export.TestExportWriter.testMultipleSorts(TestExportWriter.java:637)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1737)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:934)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:970)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:984)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:943)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:829)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:879)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:890)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at java.lang.Thread.run(Thread.java:748)


FAILED:  org.apache.solr.handler.export.TestExportWriter.testMultipleSorts

Error Message:


Stack Trace:
java.lang.AssertionError
at 

[jira] [Updated] (SOLR-12684) Document speed gotchas and partitionKeys usage for ParallelStream

2018-08-23 Thread Amrit Sarkar (JIRA)


 [ 
https://issues.apache.org/jira/browse/SOLR-12684?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Amrit Sarkar updated SOLR-12684:

Attachment: SOLR-12684.patch

> Document speed gotchas and partitionKeys usage for ParallelStream
> -
>
> Key: SOLR-12684
> URL: https://issues.apache.org/jira/browse/SOLR-12684
> Project: Solr
>  Issue Type: Bug
>  Security Level: Public(Default Security Level. Issues are Public) 
>Reporter: Varun Thacker
>Assignee: Varun Thacker
>Priority: Major
> Attachments: SOLR-12684.patch, SOLR-12684.patch, SOLR-12684.patch, 
> SOLR-12684.patch
>
>
> The aim of this Jira is to beef up the ref guide around parallel stream
> There are two things I want to address:
>  
> Firstly usage of partitionKeys :
> This line in the ref guide indicates that parallel stream keys should always 
> be the same as the underlying sort criteria 
> {code:java}
> The parallel function maintains the sort order of the tuples returned by the 
> worker nodes, so the sort criteria of the parallel function must match up 
> with the sort order of the tuples returned by the workers.
> {code}
> But as discussed on SOLR-12635 , Joel provided an example
> {code:java}
> The hash partitioner just needs to send documents to the same worker node. 
> You could do that with just one partitioning key
> For example if you sort on year, month and day. You could partition on year 
> only and still be fine as long as there was enough different years to spread 
> the records around the worker nodes.{code}
> So we should make this more clear in the ref guide.
> Let's also document that specifying more than 4 partitionKeys will throw an 
> error after SOLR-12683
>  
> At this point the user will understand how to use partitionKeys. It's related 
> to the sort criteria but need not include all the sort fields.
>  
> We should now mention a trick where the user could warm up the hash queries, 
> as they are always run on the whole document set (irrespective of the filter 
> criteria).
> Also, users should only use parallel when the number of docs matching the 
> post-filter criteria is very large.
> {code:java}
> 
> 
> <str name="fq">{!hash workers=6 worker=0}</str><str name="partitionKeys">myPartitionKey</str>
> <str name="fq">{!hash workers=6 worker=1}</str><str name="partitionKeys">myPartitionKey</str>
> <str name="fq">{!hash workers=6 worker=2}</str><str name="partitionKeys">myPartitionKey</str>
> <str name="fq">{!hash workers=6 worker=3}</str><str name="partitionKeys">myPartitionKey</str>
> <str name="fq">{!hash workers=6 worker=4}</str><str name="partitionKeys">myPartitionKey</str>
> <str name="fq">{!hash workers=6 worker=5}</str><str name="partitionKeys">myPartitionKey</str>
> 
> {code}
>   
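The year/month/day example quoted above boils down to one invariant: the hash filter only has to route documents with equal partition-key values to the same worker. A toy sketch of that routing (the modular hash here is illustrative, not Solr's actual {{HashQParserPlugin}} implementation):

```java
public class HashPartitionSketch {

    // Toy stand-in for the {!hash workers=N worker=i} filter: a document
    // is owned by worker hash(partitionKey) mod N. Solr's real hash
    // function differs, but the routing property is the same.
    static int workerFor(String partitionKey, int workers) {
        return Math.floorMod(partitionKey.hashCode(), workers);
    }

    public static void main(String[] args) {
        int workers = 6;
        // Sort is on (year, month, day), but partitioning on year alone is
        // enough: all docs sharing a year land on the same worker, so each
        // worker still emits its slice in full sort order.
        String[][] docs = {
            {"2016", "01", "05"}, {"2016", "03", "09"},
            {"2017", "02", "11"}, {"2018", "12", "31"}
        };
        for (String[] d : docs) {
            System.out.println(String.join("-", d) + " -> worker "
                    + workerFor(d[0], workers));
        }
    }
}
```

Docs with the same year always print the same worker id, which is why partitioning on year alone stays correct as long as there are enough distinct years to spread the load across workers.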






[jira] [Commented] (SOLR-12684) Document speed gotchas and partitionKeys usage for ParallelStream

2018-08-23 Thread Amrit Sarkar (JIRA)


[ 
https://issues.apache.org/jira/browse/SOLR-12684?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16590951#comment-16590951
 ] 

Amrit Sarkar commented on SOLR-12684:
-

In most of the examples listed (sources and decorators) for the {{search}} 
stream, {{qt=/export}} is not explicitly set, so the default {{select}} handler 
is picked up for those expressions. The select handler returns only the first 10 
rows, which is not obvious to a new developer exploring the feature. Attaching a 
patch that adds {{qt=/export}} to all examples with a {{search}} expression.
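For example, a {{search}} expression with the handler set explicitly (collection and field names here are illustrative, not taken from the patch):

{code:java}
search(collection1,
       q="*:*",
       qt="/export",
       fl="id,a_s,a_i",
       sort="a_i desc")
{code}

Without {{qt="/export"}} the same expression would go through the select handler and silently top out at its default 10 rows.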

> Document speed gotchas and partitionKeys usage for ParallelStream
> -
>
> Key: SOLR-12684
> URL: https://issues.apache.org/jira/browse/SOLR-12684
> Project: Solr
>  Issue Type: Bug
>  Security Level: Public(Default Security Level. Issues are Public) 
>Reporter: Varun Thacker
>Assignee: Varun Thacker
>Priority: Major
> Attachments: SOLR-12684.patch, SOLR-12684.patch, SOLR-12684.patch
>
>






[JENKINS] Lucene-Solr-repro - Build # 1292 - Still Unstable

2018-08-23 Thread Apache Jenkins Server
Build: https://builds.apache.org/job/Lucene-Solr-repro/1292/

[...truncated 28 lines...]
[repro] Jenkins log URL: 
https://builds.apache.org/job/Lucene-Solr-BadApples-Tests-master/134/consoleText

[repro] Revision: 95cb7aa491f5659084852ec29f52cc90cd7ea35c

[repro] Repro line:  ant test  -Dtestcase=TestManagedSchema 
-Dtests.method=testAddWithSchemaCodecFactory -Dtests.seed=D190A6ABC8683CA5 
-Dtests.multiplier=2 -Dtests.slow=true -Dtests.badapples=true 
-Dtests.locale=de-DE -Dtests.timezone=Europe/Zurich -Dtests.asserts=true 
-Dtests.file.encoding=UTF-8

[repro] Repro line:  ant test  -Dtestcase=TestCodecSupport 
-Dtests.method=testDocValuesFormats -Dtests.seed=D190A6ABC8683CA5 
-Dtests.multiplier=2 -Dtests.slow=true -Dtests.badapples=true 
-Dtests.locale=th-TH-u-nu-thai-x-lvariant-TH 
-Dtests.timezone=America/Argentina/San_Luis -Dtests.asserts=true 
-Dtests.file.encoding=UTF-8

[repro] Repro line:  ant test  -Dtestcase=TestCodecSupport 
-Dtests.method=testCompressionModeDefault -Dtests.seed=D190A6ABC8683CA5 
-Dtests.multiplier=2 -Dtests.slow=true -Dtests.badapples=true 
-Dtests.locale=th-TH-u-nu-thai-x-lvariant-TH 
-Dtests.timezone=America/Argentina/San_Luis -Dtests.asserts=true 
-Dtests.file.encoding=UTF-8

[repro] Repro line:  ant test  -Dtestcase=TestCodecSupport 
-Dtests.method=testCompressionMode -Dtests.seed=D190A6ABC8683CA5 
-Dtests.multiplier=2 -Dtests.slow=true -Dtests.badapples=true 
-Dtests.locale=th-TH-u-nu-thai-x-lvariant-TH 
-Dtests.timezone=America/Argentina/San_Luis -Dtests.asserts=true 
-Dtests.file.encoding=UTF-8

[repro] Repro line:  ant test  -Dtestcase=TestCodecSupport 
-Dtests.method=testMixedCompressionMode -Dtests.seed=D190A6ABC8683CA5 
-Dtests.multiplier=2 -Dtests.slow=true -Dtests.badapples=true 
-Dtests.locale=th-TH-u-nu-thai-x-lvariant-TH 
-Dtests.timezone=America/Argentina/San_Luis -Dtests.asserts=true 
-Dtests.file.encoding=UTF-8

[repro] Repro line:  ant test  -Dtestcase=TestCodecSupport 
-Dtests.method=testDynamicFieldsDocValuesFormats -Dtests.seed=D190A6ABC8683CA5 
-Dtests.multiplier=2 -Dtests.slow=true -Dtests.badapples=true 
-Dtests.locale=th-TH-u-nu-thai-x-lvariant-TH 
-Dtests.timezone=America/Argentina/San_Luis -Dtests.asserts=true 
-Dtests.file.encoding=UTF-8

[repro] git rev-parse --abbrev-ref HEAD
[repro] git rev-parse HEAD
[repro] Initial local git branch/revision: 
4368ad72d2ccbb40583fa7d2e55464c47e341f8b
[repro] git fetch
[repro] git checkout 95cb7aa491f5659084852ec29f52cc90cd7ea35c

[...truncated 2 lines...]
[repro] git merge --ff-only

[...truncated 1 lines...]
[repro] ant clean

[...truncated 6 lines...]
[repro] Test suites by module:
[repro]solr/core
[repro]   TestCodecSupport
[repro]   TestManagedSchema
[repro] ant compile-test

[...truncated 3392 lines...]
[repro] ant test-nocompile -Dtests.dups=5 -Dtests.maxfailures=10 
-Dtests.class="*.TestCodecSupport|*.TestManagedSchema" 
-Dtests.showOutput=onerror  -Dtests.seed=D190A6ABC8683CA5 -Dtests.multiplier=2 
-Dtests.slow=true -Dtests.badapples=true 
-Dtests.locale=th-TH-u-nu-thai-x-lvariant-TH 
-Dtests.timezone=America/Argentina/San_Luis -Dtests.asserts=true 
-Dtests.file.encoding=UTF-8

[...truncated 8013 lines...]
[repro] Setting last failure code to 256

[repro] Failures:
[repro]   4/5 failed: org.apache.solr.core.TestCodecSupport
[repro]   4/5 failed: org.apache.solr.schema.TestManagedSchema
[repro] git checkout 4368ad72d2ccbb40583fa7d2e55464c47e341f8b

[...truncated 2 lines...]
[repro] Exiting with code 256

[...truncated 5 lines...]

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org

[JENKINS] Lucene-Solr-repro - Build # 1290 - Still Unstable

2018-08-23 Thread Apache Jenkins Server
Build: https://builds.apache.org/job/Lucene-Solr-repro/1290/

[...truncated 28 lines...]
[repro] Jenkins log URL: 
https://builds.apache.org/job/Lucene-Solr-BadApples-Tests-7.x/139/consoleText

[repro] Revision: 03eba329b12c269002470a986fe8ee5c7281dba2

[repro] Repro line:  ant test  -Dtestcase=SearchRateTriggerIntegrationTest 
-Dtests.method=testBelowSearchRate -Dtests.seed=37EF563BAD4560D8 
-Dtests.multiplier=2 -Dtests.slow=true -Dtests.badapples=true -Dtests.locale=es 
-Dtests.timezone=America/Cordoba -Dtests.asserts=true 
-Dtests.file.encoding=UTF-8

[repro] Repro line:  ant test  -Dtestcase=HttpPartitionTest -Dtests.method=test 
-Dtests.seed=37EF563BAD4560D8 -Dtests.multiplier=2 -Dtests.slow=true 
-Dtests.badapples=true -Dtests.locale=lt-LT -Dtests.timezone=Europe/Gibraltar 
-Dtests.asserts=true -Dtests.file.encoding=UTF-8

[repro] Repro line:  ant test  -Dtestcase=SolrRrdBackendFactoryTest 
-Dtests.method=testBasic -Dtests.seed=37EF563BAD4560D8 -Dtests.multiplier=2 
-Dtests.slow=true -Dtests.badapples=true -Dtests.locale=es-DO 
-Dtests.timezone=Indian/Mauritius -Dtests.asserts=true 
-Dtests.file.encoding=UTF-8

[repro] Repro line:  ant test  -Dtestcase=TestComputePlanAction 
-Dtests.method=testNodeLost -Dtests.seed=37EF563BAD4560D8 -Dtests.multiplier=2 
-Dtests.slow=true -Dtests.badapples=true -Dtests.locale=en-MT 
-Dtests.timezone=Pacific/Tahiti -Dtests.asserts=true -Dtests.file.encoding=UTF-8

[repro] Repro line:  ant test  -Dtestcase=LeaderVoteWaitTimeoutTest 
-Dtests.method=testMostInSyncReplicasCanWinElection 
-Dtests.seed=37EF563BAD4560D8 -Dtests.multiplier=2 -Dtests.slow=true 
-Dtests.badapples=true -Dtests.locale=ru-RU -Dtests.timezone=Pacific/Chatham 
-Dtests.asserts=true -Dtests.file.encoding=UTF-8

[repro] Repro line:  ant test  -Dtestcase=LeaderVoteWaitTimeoutTest 
-Dtests.method=basicTest -Dtests.seed=37EF563BAD4560D8 -Dtests.multiplier=2 
-Dtests.slow=true -Dtests.badapples=true -Dtests.locale=ru-RU 
-Dtests.timezone=Pacific/Chatham -Dtests.asserts=true 
-Dtests.file.encoding=UTF-8

[repro] Repro line:  ant test  -Dtestcase=TestCollectionsAPIViaSolrCloudCluster 
-Dtests.method=testCollectionCreateSearchDelete -Dtests.seed=37EF563BAD4560D8 
-Dtests.multiplier=2 -Dtests.slow=true -Dtests.badapples=true 
-Dtests.locale=is-IS -Dtests.timezone=America/Indiana/Vincennes 
-Dtests.asserts=true -Dtests.file.encoding=UTF-8

[repro] Repro line:  ant test  -Dtestcase=TestTriggerIntegration 
-Dtests.method=testNodeMarkersRegistration -Dtests.seed=37EF563BAD4560D8 
-Dtests.multiplier=2 -Dtests.slow=true -Dtests.badapples=true -Dtests.locale=et 
-Dtests.timezone=Iceland -Dtests.asserts=true -Dtests.file.encoding=UTF-8

[repro] git rev-parse --abbrev-ref HEAD
[repro] git rev-parse HEAD
[repro] Initial local git branch/revision: 
4368ad72d2ccbb40583fa7d2e55464c47e341f8b
[repro] git fetch
[repro] git checkout 03eba329b12c269002470a986fe8ee5c7281dba2

[...truncated 2 lines...]
[repro] git merge --ff-only

[...truncated 1 lines...]
[repro] ant clean

[...truncated 6 lines...]
[repro] Test suites by module:
[repro]solr/core
[repro]   SearchRateTriggerIntegrationTest
[repro]   LeaderVoteWaitTimeoutTest
[repro]   TestCollectionsAPIViaSolrCloudCluster
[repro]   TestComputePlanAction
[repro]   TestTriggerIntegration
[repro]   HttpPartitionTest
[repro]   SolrRrdBackendFactoryTest
[repro] ant compile-test

[...truncated 3408 lines...]
[repro] ant test-nocompile -Dtests.dups=5 -Dtests.maxfailures=35 
-Dtests.class="*.SearchRateTriggerIntegrationTest|*.LeaderVoteWaitTimeoutTest|*.TestCollectionsAPIViaSolrCloudCluster|*.TestComputePlanAction|*.TestTriggerIntegration|*.HttpPartitionTest|*.SolrRrdBackendFactoryTest"
 -Dtests.showOutput=onerror  -Dtests.seed=37EF563BAD4560D8 -Dtests.multiplier=2 
-Dtests.slow=true -Dtests.badapples=true -Dtests.locale=es 
-Dtests.timezone=America/Cordoba -Dtests.asserts=true 
-Dtests.file.encoding=UTF-8

[...truncated 5755 lines...]
[repro] Setting last failure code to 256

[repro] Failures:
[repro]   0/5 failed: org.apache.solr.cloud.HttpPartitionTest
[repro]   0/5 failed: org.apache.solr.cloud.LeaderVoteWaitTimeoutTest
[repro]   0/5 failed: 
org.apache.solr.cloud.api.collections.TestCollectionsAPIViaSolrCloudCluster
[repro]   0/5 failed: 
org.apache.solr.cloud.autoscaling.sim.TestTriggerIntegration
[repro]   1/5 failed: 
org.apache.solr.cloud.autoscaling.SearchRateTriggerIntegrationTest
[repro]   1/5 failed: 
org.apache.solr.cloud.autoscaling.sim.TestComputePlanAction
[repro]   1/5 failed: org.apache.solr.metrics.rrd.SolrRrdBackendFactoryTest
[repro] git checkout 4368ad72d2ccbb40583fa7d2e55464c47e341f8b

[...truncated 2 lines...]
[repro] Exiting with code 256

[...truncated 5 lines...]

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org

[JENKINS] Lucene-Solr-repro - Build # 1291 - Unstable

2018-08-23 Thread Apache Jenkins Server
Build: https://builds.apache.org/job/Lucene-Solr-repro/1291/

[...truncated 28 lines...]
[repro] Jenkins log URL: 
https://builds.apache.org/job/Lucene-Solr-SmokeRelease-master/1105/consoleText

[repro] Revision: 025350ea12f648b8f5864a0ba6ef85ddff577a2a

[repro] Ant options: -DsmokeTestRelease.java9=/home/jenkins/tools/java/latest1.9
[repro] Repro line:  ant test  -Dtestcase=TestManagedSchema 
-Dtests.method=testAddWithSchemaCodecFactory -Dtests.seed=6E6A7BAA240147A 
-Dtests.multiplier=2 -Dtests.locale=es-GT -Dtests.timezone=Africa/Maseru 
-Dtests.asserts=true -Dtests.file.encoding=UTF-8

[repro] Repro line:  ant test  -Dtestcase=TestCodecSupport 
-Dtests.method=testDynamicFieldsDocValuesFormats -Dtests.seed=6E6A7BAA240147A 
-Dtests.multiplier=2 -Dtests.locale=mt -Dtests.timezone=Etc/UTC 
-Dtests.asserts=true -Dtests.file.encoding=UTF-8

[repro] Repro line:  ant test  -Dtestcase=TestCodecSupport 
-Dtests.method=testCompressionMode -Dtests.seed=6E6A7BAA240147A 
-Dtests.multiplier=2 -Dtests.locale=mt -Dtests.timezone=Etc/UTC 
-Dtests.asserts=true -Dtests.file.encoding=UTF-8

[repro] Repro line:  ant test  -Dtestcase=TestCodecSupport 
-Dtests.method=testDocValuesFormats -Dtests.seed=6E6A7BAA240147A 
-Dtests.multiplier=2 -Dtests.locale=mt -Dtests.timezone=Etc/UTC 
-Dtests.asserts=true -Dtests.file.encoding=UTF-8

[repro] Repro line:  ant test  -Dtestcase=TestCodecSupport 
-Dtests.method=testMixedCompressionMode -Dtests.seed=6E6A7BAA240147A 
-Dtests.multiplier=2 -Dtests.locale=mt -Dtests.timezone=Etc/UTC 
-Dtests.asserts=true -Dtests.file.encoding=UTF-8

[repro] Repro line:  ant test  -Dtestcase=TestCodecSupport 
-Dtests.method=testCompressionModeDefault -Dtests.seed=6E6A7BAA240147A 
-Dtests.multiplier=2 -Dtests.locale=mt -Dtests.timezone=Etc/UTC 
-Dtests.asserts=true -Dtests.file.encoding=UTF-8

[repro] git rev-parse --abbrev-ref HEAD
[repro] git rev-parse HEAD
[repro] Initial local git branch/revision: 
4368ad72d2ccbb40583fa7d2e55464c47e341f8b
[repro] git fetch
[repro] git checkout 025350ea12f648b8f5864a0ba6ef85ddff577a2a

[...truncated 2 lines...]
[repro] git merge --ff-only

[...truncated 1 lines...]
[repro] ant clean

[...truncated 6 lines...]
[repro] Test suites by module:
[repro]solr/core
[repro]   TestManagedSchema
[repro]   TestCodecSupport
[repro] ant compile-test

[...truncated 3392 lines...]
[repro] ant test-nocompile -Dtests.dups=5 -Dtests.maxfailures=10 
-Dtests.class="*.TestManagedSchema|*.TestCodecSupport" 
-Dtests.showOutput=onerror 
-DsmokeTestRelease.java9=/home/jenkins/tools/java/latest1.9 
-Dtests.seed=6E6A7BAA240147A -Dtests.multiplier=2 -Dtests.locale=es-GT 
-Dtests.timezone=Africa/Maseru -Dtests.asserts=true -Dtests.file.encoding=UTF-8

[...truncated 6037 lines...]
[repro] Setting last failure code to 256

[repro] Failures:
[repro]   3/5 failed: org.apache.solr.core.TestCodecSupport
[repro]   3/5 failed: org.apache.solr.schema.TestManagedSchema
[repro] git checkout 4368ad72d2ccbb40583fa7d2e55464c47e341f8b

[...truncated 2 lines...]
[repro] Exiting with code 256

[...truncated 6 lines...]


[JENKINS] Lucene-Solr-repro - Build # 1289 - Still Unstable

2018-08-23 Thread Apache Jenkins Server
Build: https://builds.apache.org/job/Lucene-Solr-repro/1289/

[...truncated 35 lines...]
[repro] Jenkins log URL: 
https://builds.apache.org/job/Lucene-Solr-Tests-7.x/812/consoleText

[repro] Revision: 03eba329b12c269002470a986fe8ee5c7281dba2

[repro] Repro line:  ant test  -Dtestcase=IndexSizeTriggerTest 
-Dtests.method=testSplitIntegration -Dtests.seed=EA6F05E11A46B2D 
-Dtests.multiplier=2 -Dtests.slow=true -Dtests.locale=es-CL 
-Dtests.timezone=Libya -Dtests.asserts=true -Dtests.file.encoding=US-ASCII

[repro] git rev-parse --abbrev-ref HEAD
[repro] git rev-parse HEAD
[repro] Initial local git branch/revision: 
4368ad72d2ccbb40583fa7d2e55464c47e341f8b
[repro] git fetch
[repro] git checkout 03eba329b12c269002470a986fe8ee5c7281dba2

[...truncated 2 lines...]
[repro] git merge --ff-only

[...truncated 1 lines...]
[repro] ant clean

[...truncated 6 lines...]
[repro] Test suites by module:
[repro]solr/core
[repro]   IndexSizeTriggerTest
[repro] ant compile-test

[...truncated 3408 lines...]
[repro] ant test-nocompile -Dtests.dups=5 -Dtests.maxfailures=5 
-Dtests.class="*.IndexSizeTriggerTest" -Dtests.showOutput=onerror  
-Dtests.seed=EA6F05E11A46B2D -Dtests.multiplier=2 -Dtests.slow=true 
-Dtests.locale=es-CL -Dtests.timezone=Libya -Dtests.asserts=true 
-Dtests.file.encoding=US-ASCII

[...truncated 7390 lines...]
[repro] Setting last failure code to 256

[repro] Failures:
[repro]   3/5 failed: org.apache.solr.cloud.autoscaling.IndexSizeTriggerTest
[repro] git checkout 4368ad72d2ccbb40583fa7d2e55464c47e341f8b

[...truncated 2 lines...]
[repro] Exiting with code 256

[...truncated 5 lines...]


[JENKINS] Lucene-Solr-Tests-master - Build # 2736 - Still Unstable

2018-08-23 Thread Apache Jenkins Server
Build: https://builds.apache.org/job/Lucene-Solr-Tests-master/2736/

8 tests failed.
FAILED:  org.apache.solr.cloud.AddReplicaTest.test

Error Message:
core_node6:{"core":"addreplicatest_coll_shard1_replica_n5","base_url":"https://127.0.0.1:43459/solr","node_name":"127.0.0.1:43459_solr","state":"active","type":"NRT","force_set_state":"false"}

Stack Trace:
java.lang.AssertionError: 
core_node6:{"core":"addreplicatest_coll_shard1_replica_n5","base_url":"https://127.0.0.1:43459/solr","node_name":"127.0.0.1:43459_solr","state":"active","type":"NRT","force_set_state":"false"}
at 
__randomizedtesting.SeedInfo.seed([761476B8D14AB296:FE4049627FB6DF6E]:0)
at org.junit.Assert.fail(Assert.java:93)
at org.junit.Assert.assertTrue(Assert.java:43)
at org.apache.solr.cloud.AddReplicaTest.test(AddReplicaTest.java:85)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1737)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:934)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:970)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:984)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:943)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:829)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:879)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:890)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at java.lang.Thread.run(Thread.java:748)


FAILED:  

[jira] [Commented] (SOLR-12572) Reuse fieldvalues computed while sorting at writing in ExportWriter

2018-08-23 Thread Lucene/Solr QA (JIRA)


[ https://issues.apache.org/jira/browse/SOLR-12572?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16590852#comment-16590852 ]

Lucene/Solr QA commented on SOLR-12572:
---

| (x) *{color:red}-1 overall{color}* |
\\
\\
|| Vote || Subsystem || Runtime || Comment ||
| {color:red}-1{color} | {color:red} patch {color} | {color:red}  0m  5s{color} 
| {color:red} SOLR-12572 does not apply to master. Rebase required? Wrong 
Branch? See 
https://wiki.apache.org/solr/HowToContribute#Creating_the_patch_file for help. 
{color} |
\\
\\
|| Subsystem || Report/Notes ||
| JIRA Issue | SOLR-12572 |
| JIRA Patch URL | 
https://issues.apache.org/jira/secure/attachment/12936848/SOLR-12572.patch |
| Console output | 
https://builds.apache.org/job/PreCommit-SOLR-Build/171/console |
| Powered by | Apache Yetus 0.7.0   http://yetus.apache.org |


This message was automatically generated.



> Reuse fieldvalues computed while sorting at writing in ExportWriter
> ---
>
> Key: SOLR-12572
> URL: https://issues.apache.org/jira/browse/SOLR-12572
> Project: Solr
>  Issue Type: Improvement
>  Security Level: Public(Default Security Level. Issues are Public) 
>  Components: streaming expressions
>Reporter: Amrit Sarkar
>Assignee: Varun Thacker
>Priority: Minor
> Attachments: SOLR-12572.patch, SOLR-12572.patch, SOLR-12572.patch, 
> SOLR-12572.patch, SOLR-12572.patch, SOLR-12572.patch, SOLR-12572.patch, 
> SOLR-12572.patch
>
>
> While exporting result through "/export" handler,
> {code:java}
> http://localhost:8983/solr/core_name/export?q=my-query&sort=severity+desc,timestamp+desc&fl=severity,timestamp,msg
> {code}
> Doc-values are sought for all the {{sort}} fields defined (in this example 
> 'severity, 'timestamp'). When we stream out docs we again make doc-value 
> seeks against the {{fl}} fields ('severity','timestamp','msg') . 
> In most common use-cases we have {{fl = sort}} fields, or at least the sort 
> fields are a subset of {{fl}} fields, so if we can *pre-collect* the values 
> while sorting, we can reduce the doc-value seeks, potentially bringing a 
> *speed improvement*.
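The pre-collect idea above can be sketched in plain Java. This is an illustrative model only: `DocValuesReader`, the seek counter, and the method names are hypothetical, not Solr's actual ExportWriter classes. The point it demonstrates is that the sort pass already reads each sort field's doc-value once, so caching those values makes the write pass free of additional seeks.

```java
import java.util.*;

// Hypothetical sketch of SOLR-12572's idea: cache sort-field values pulled
// during the sort pass so the write pass does not seek doc-values again.
public class PreCollectSketch {
    // Simulated doc-values source that counts how many seeks it serves.
    static class DocValuesReader {
        final Map<Integer, Long> values;
        int seeks = 0;
        DocValuesReader(Map<Integer, Long> values) { this.values = values; }
        long get(int docId) { seeks++; return values.get(docId); }
    }

    // Sort pass: order docs by value descending, caching each value as read.
    static List<Integer> sortDescCaching(DocValuesReader dv, List<Integer> docs,
                                         Map<Integer, Long> cache) {
        for (int d : docs) cache.put(d, dv.get(d));     // one seek per doc
        List<Integer> sorted = new ArrayList<>(docs);
        sorted.sort((a, b) -> Long.compare(cache.get(b), cache.get(a)));
        return sorted;
    }

    // Write pass: reuse the cache instead of seeking doc-values a second time.
    static long[] writeValues(List<Integer> sorted, Map<Integer, Long> cache) {
        long[] out = new long[sorted.size()];
        for (int i = 0; i < sorted.size(); i++) out[i] = cache.get(sorted.get(i));
        return out;
    }

    public static void main(String[] args) {
        DocValuesReader dv = new DocValuesReader(Map.of(1, 30L, 2, 10L, 3, 20L));
        Map<Integer, Long> cache = new HashMap<>();
        List<Integer> sorted = sortDescCaching(dv, List.of(1, 2, 3), cache);
        long[] written = writeValues(sorted, cache);
        // Only the 3 seeks from the sort pass happened; the write pass added none.
        if (dv.seeks != 3) throw new AssertionError("expected 3 seeks, got " + dv.seeks);
        if (!Arrays.equals(written, new long[]{30L, 20L, 10L}))
            throw new AssertionError(Arrays.toString(written));
    }
}
```

Without the cache, writing {{fl = sort}} fields would cost a second seek per doc per field, which is exactly the overhead the patch aims to remove.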



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)




[JENKINS] Lucene-Solr-BadApples-Tests-master - Build # 134 - Still Unstable

2018-08-23 Thread Apache Jenkins Server
Build: https://builds.apache.org/job/Lucene-Solr-BadApples-Tests-master/134/

6 tests failed.
FAILED:  org.apache.solr.core.TestCodecSupport.testDocValuesFormats

Error Message:
An SPI class of type org.apache.lucene.codecs.DocValuesFormat with name 
'Lucene80' does not exist.  You need to add the corresponding JAR file 
supporting this SPI to your classpath.  The current classpath supports the 
following names: [Asserting, Direct, Lucene70]

Stack Trace:
java.lang.IllegalArgumentException: An SPI class of type 
org.apache.lucene.codecs.DocValuesFormat with name 'Lucene80' does not exist.  
You need to add the corresponding JAR file supporting this SPI to your 
classpath.  The current classpath supports the following names: [Asserting, 
Direct, Lucene70]
at 
__randomizedtesting.SeedInfo.seed([D190A6ABC8683CA5:DDCB6E1920699C5A]:0)
at org.apache.lucene.util.NamedSPILoader.lookup(NamedSPILoader.java:116)
at 
org.apache.lucene.codecs.DocValuesFormat.forName(DocValuesFormat.java:108)
at 
org.apache.solr.core.SchemaCodecFactory$1.getDocValuesFormatForField(SchemaCodecFactory.java:112)
at 
org.apache.lucene.codecs.lucene80.Lucene80Codec$2.getDocValuesFormatForField(Lucene80Codec.java:73)
at 
org.apache.solr.core.TestCodecSupport.testDocValuesFormats(TestCodecSupport.java:63)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1737)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:934)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:970)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:984)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:943)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:829)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:879)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:890)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 

Re: benchmark drop for PrimaryKey

2018-08-23 Thread Michael Sokolov
OK thanks. I guess this benchmark must be run on a large-enough index that
it doesn't fit entirely in RAM already anyway? When I ran it locally using
the vanilla benchmark instructions, I believe the generated index was quite
small (wikimedium10k).  At any rate, I don't have any specific use case
yet, just thinking about some possibilities related to primary key lookup
and came across this anomaly. Perhaps at least it deserves an annotation on
the benchmark graph.


Re: benchmark drop for PrimaryKey

2018-08-23 Thread David Smiley
Switching to "FST50" ought to bring back much of the benefit of "Memory".

On Thu, Aug 23, 2018 at 5:15 PM Adrien Grand  wrote:

> The commit that caused this slowdown might be
> https://github.com/mikemccand/luceneutil/commit/1d8460f342f269c98047def9f9eb76213acae5d9
> .
>
> We don't have anything that performs as well anymore indeed, but I'm not
> sure this is a big deal. I would suspect that there were not many users of
> that postings format, one reason being that it was not supported in terms
> of backward compatibility (like any codec but the default one) and another
> reason being that it used a lot of RAM. In a number of cases, we try to
> fold benefits of alternative codecs in the default codec, for instance we
> used to have a "pulsing" postings format that could record postings in the
> terms dictionary in order to save one disk seek, and we ended up folding
> this feature into the default postings format by only enabling it on terms
> that have a document frequency of 1 and index_options=DOCS_ONLY, so that it
> would be always used with primary keys. For that postings format, it didn't
> really make sense as the way that it managed to be so much faster was by
> loading much more information in RAM, which we don't want to do with the
> default codec.
>
> On Thu, Aug 23, 2018 at 10:40 PM, Michael Sokolov wrote:
>
>> I happened to stumble across this chart
>> https://home.apache.org/~mikemccand/lucenebench/PKLookup.html showing a
>> pretty drastic drop in this benchmark on 5/13. I looked at the commits
>> between the previous run and this one and did some investigation, trying to
>> do some git bisect to find the problem using benchmarks as a test, but it
>> proved to be quite difficult due to a breaking change re: MemoryCodec that
>> also required corresponding changes in  benchmark code.
>>
>> In the end, I think removing MemoryCodec is what caused the drop in perf
>> here, based on this comment in benchmark code:
>>
>> '2011-06-26'
>>Switched to MemoryCodec for the primary-key 'id' field so that lookups
>> (either for PKLookup test or for deletions during reopen in the NRT test)
>> are fast, with no IO.  Also switched to NRTCachingDirectory for the NRT
>> test, so that small new segments are written only in RAM.
>>
>> I don't really understand the implications here beyond benchmarks, but it
>> does seem that perhaps some essential high-performing capability has been
>> lost?  Is there some equivalent thing remaining after MemoryCodec's removal
>> that can be used for primary keys?
>>
>> -Mike
>>
> --
Lucene/Solr Search Committer, Consultant, Developer, Author, Speaker
LinkedIn: http://linkedin.com/in/davidwsmiley | Book:
http://www.solrenterprisesearchserver.com
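On the Solr side, David's FST50 suggestion can be applied per field through the schema. The fragment below is a hedged sketch, not a tested configuration: the field-type name is made up, and it assumes solrconfig.xml uses the schema-aware codec factory (`<codecFactory class="solr.SchemaCodecFactory"/>`, the default), which is what honors the `postingsFormat` attribute.

```xml
<!-- Hypothetical schema fragment: opt the unique-key field into the
     FST50 postings format for faster primary-key term lookups. -->
<fieldType name="string_fst" class="solr.StrField" postingsFormat="FST50"/>
<field name="id" type="string_fst" indexed="true" stored="true" required="true"/>
```

Note Adrien's caveat applies here too: any non-default postings format carries no backward-compatibility guarantee across Lucene versions.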


[jira] [Commented] (LUCENE-8461) Add a Lucene80Codec

2018-08-23 Thread Adrien Grand (JIRA)


[ https://issues.apache.org/jira/browse/LUCENE-8461?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16590831#comment-16590831 ]

Adrien Grand commented on LUCENE-8461:
--

Woops, thanks Steve!

> Add a Lucene80Codec
> ---
>
> Key: LUCENE-8461
> URL: https://issues.apache.org/jira/browse/LUCENE-8461
> Project: Lucene - Core
>  Issue Type: Improvement
>Reporter: Adrien Grand
>Priority: Minor
> Fix For: master (8.0)
>
> Attachments: LUCENE-8461.patch
>
>
> Even though things would work if we kept using the current Lucene70Codec, I'd 
> like to create a new Lucene80Codec in order to make reasoning about what code 
> can be removed easier when we remove support for Lucene 7.x and to also 
> highlight the fact that as of 8.0 postings record impacts in the skip lists.






Re: benchmark drop for PrimaryKey

2018-08-23 Thread Adrien Grand
The commit that caused this slowdown might be
https://github.com/mikemccand/luceneutil/commit/1d8460f342f269c98047def9f9eb76213acae5d9
.

We don't have anything that performs as well anymore indeed, but I'm not
sure this is a big deal. I would suspect that there were not many users of
that postings format, one reason being that it was not supported in terms
of backward compatibility (like any codec but the default one) and another
reason being that it used a lot of RAM. In a number of cases, we try to
fold benefits of alternative codecs in the default codec, for instance we
used to have a "pulsing" postings format that could record postings in the
terms dictionary in order to save one disk seek, and we ended up folding
this feature into the default postings format by only enabling it on terms
that have a document frequency of 1 and index_options=DOCS_ONLY, so that it
would be always used with primary keys. For that postings format, it didn't
really make sense as the way that it managed to be so much faster was by
loading much more information in RAM, which we don't want to do with the
default codec.

On Thu, Aug 23, 2018 at 10:40 PM, Michael Sokolov wrote:

> I happened to stumble across this chart
> https://home.apache.org/~mikemccand/lucenebench/PKLookup.html showing a
> pretty drastic drop in this benchmark on 5/13. I looked at the commits
> between the previous run and this one and did some investigation, trying to
> do some git bisect to find the problem using benchmarks as a test, but it
> proved to be quite difficult due to a breaking change re: MemoryCodec that
> also required corresponding changes in  benchmark code.
>
> In the end, I think removing MemoryCodec is what caused the drop in perf
> here, based on this comment in benchmark code:
>
> '2011-06-26'
>Switched to MemoryCodec for the primary-key 'id' field so that lookups
> (either for PKLookup test or for deletions during reopen in the NRT test)
> are fast, with no IO.  Also switched to NRTCachingDirectory for the NRT
> test, so that small new segments are written only in RAM.
>
> I don't really understand the implications here beyond benchmarks, but it
> does seem that perhaps some essential high-performing capability has been
> lost?  Is there some equivalent thing remaining after MemoryCodec's removal
> that can be used for primary keys?
>
> -Mike
>
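The "pulsing" fold Adrien describes - inline a term's single posting into the terms dictionary when docFreq == 1, so a primary-key lookup needs no extra postings seek - can be sketched roughly as below. All class and field names are illustrative; the real logic lives inside Lucene's postings writer, not in code like this.

```java
import java.util.*;

// Rough sketch of the "pulsing" idea: terms with docFreq == 1 (typical of
// primary keys indexed with DOCS_ONLY) store their single doc id inline in
// the term-dictionary entry; other terms point into a separate postings file.
public class PulsingSketch {
    static class TermEntry {
        final int inlineDoc;        // valid when inlined
        final long postingsOffset;  // valid when not inlined
        final boolean inlined;
        TermEntry(int inlineDoc) {
            this.inlineDoc = inlineDoc; this.postingsOffset = -1; this.inlined = true;
        }
        TermEntry(long offset) {
            this.inlineDoc = -1; this.postingsOffset = offset; this.inlined = false;
        }
    }

    final Map<String, TermEntry> termDict = new HashMap<>();
    final List<int[]> postingsFile = new ArrayList<>(); // stands in for an on-disk file

    void writeTerm(String term, int[] docs) {
        if (docs.length == 1) {
            // Inline: the term-dictionary entry carries the posting itself.
            termDict.put(term, new TermEntry(docs[0]));
        } else {
            // Normal path: postings live elsewhere and need a "seek" at read time.
            postingsFile.add(docs);
            termDict.put(term, new TermEntry((long) (postingsFile.size() - 1)));
        }
    }

    // Primary-key style lookup: one dictionary probe; a postings "seek" is
    // only needed for non-inlined terms.
    int lookupFirstDoc(String term) {
        TermEntry e = termDict.get(term);
        if (e == null) return -1;
        return e.inlined ? e.inlineDoc : postingsFile.get((int) e.postingsOffset)[0];
    }

    public static void main(String[] args) {
        PulsingSketch idx = new PulsingSketch();
        idx.writeTerm("id:42", new int[]{7});           // unique key -> inlined
        idx.writeTerm("color:red", new int[]{1, 2, 3}); // multi-doc -> postings file
        if (idx.lookupFirstDoc("id:42") != 7) throw new AssertionError();
        if (!idx.termDict.get("id:42").inlined) throw new AssertionError();
        if (idx.postingsFile.size() != 1) throw new AssertionError();
    }
}
```

This captures why the fold helps primary keys specifically: only the docFreq == 1 case gets the inlined fast path, so nothing extra is held in RAM for ordinary terms, unlike the removed MemoryCodec.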


[jira] [Commented] (LUCENE-8461) Add a Lucene80Codec

2018-08-23 Thread ASF subversion and git services (JIRA)


[ https://issues.apache.org/jira/browse/LUCENE-8461?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16590814#comment-16590814 ]

ASF subversion and git services commented on LUCENE-8461:
-

Commit 4368ad72d2ccbb40583fa7d2e55464c47e341f8b in lucene-solr's branch 
refs/heads/master from [~jpountz]
[ https://git-wip-us.apache.org/repos/asf?p=lucene-solr.git;h=4368ad7 ]

LUCENE-8461: Fix test failure.


> Add a Lucene80Codec
> ---
>
> Key: LUCENE-8461
> URL: https://issues.apache.org/jira/browse/LUCENE-8461
> Project: Lucene - Core
>  Issue Type: Improvement
>Reporter: Adrien Grand
>Priority: Minor
> Fix For: master (8.0)
>
> Attachments: LUCENE-8461.patch
>
>
> Even though things would work if we kept using the current Lucene70Codec, I'd 
> like to create a new Lucene80Codec in order to make reasoning about what code 
> can be removed easier when we remove support for Lucene 7.x and to also 
> highlight the fact that as of 8.0 postings record impacts in the skip lists.






LUCENE-765

2018-08-23 Thread Michael Sokolov
Can I interest someone in reviewing my patch for
https://issues.apache.org/jira/browse/LUCENE-765? It's additional javadoc
for the index package.

I was rooting around for some low-impact helpful thing to do here, and
found this on a list of "newdev" issues. It's fairly high-level but should
be helpful as a general introduction and has pointers to more detailed
class-level javadocs

-Mike


benchmark drop for PrimaryKey

2018-08-23 Thread Michael Sokolov
I happened to stumble across this chart
https://home.apache.org/~mikemccand/lucenebench/PKLookup.html showing a
pretty drastic drop in this benchmark on 5/13. I looked at the commits
between the previous run and this one and did some investigation, trying to
do some git bisect to find the problem using benchmarks as a test, but it
proved to be quite difficult due to a breaking change re: MemoryCodec that
also required corresponding changes in  benchmark code.

In the end, I think removing MemoryCodec is what caused the drop in perf
here, based on this comment in benchmark code:

'2011-06-26'
   Switched to MemoryCodec for the primary-key 'id' field so that lookups
(either for PKLookup test or for deletions during reopen in the NRT test)
are fast, with no IO.  Also switched to NRTCachingDirectory for the NRT
test, so that small new segments are written only in RAM.

I don't really understand the implications here beyond benchmarks, but it
does seem that perhaps some essential high-performing capability has been
lost?  Is there some equivalent thing remaining after MemoryCodec's removal
that can be used for primary keys?

-Mike


[jira] [Commented] (SOLR-11495) Reduce the list of which query parsers are loaded by default

2018-08-23 Thread JIRA


[ https://issues.apache.org/jira/browse/SOLR-11495?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16590784#comment-16590784 ]

Jan Høydahl commented on SOLR-11495:


I think we should keep them enabled as-is, including xmlparser, and instead 
focus on fixing security issues along the way, as well as documenting how to 
disable qparsers in the “Taking Solr to Production” chapter.

> Reduce the list of which query parsers are loaded by default
> 
>
> Key: SOLR-11495
> URL: https://issues.apache.org/jira/browse/SOLR-11495
> Project: Solr
>  Issue Type: Bug
>  Security Level: Public(Default Security Level. Issues are Public) 
>  Components: query parsers
>Affects Versions: 7.0
>Reporter: Shawn Heisey
>Priority: Major
>
> Virtually all of the query parsers that Solr supports are enabled by default, 
> in a map created in QParserPlugin.java.
> To reduce the possible attack surface of a default Solr installation, I 
> believe that the list of default parsers should be limited to a small handful 
> of the full list that's available. I will discuss specific ideas for that 
> list in comments.
> I think the bar should be very high for admission to the default parser list. 
> That list should only include those that are most commonly used by the 
> community. Only the most common parsers will have had extensive review for 
> security issues.
> _Edit_: moved description from "Docs Text" field where it was initially added 
> mistakenly.






[JENKINS] Lucene-Solr-master-Linux (32bit/jdk1.8.0_172) - Build # 22731 - Still Unstable!

2018-08-23 Thread Policeman Jenkins Server
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-master-Linux/22731/
Java: 32bit/jdk1.8.0_172 -server -XX:+UseConcMarkSweepGC

43 tests failed.
FAILED:  org.apache.solr.cloud.UnloadDistributedZkTest.test

Error Message:
Timeout occured while waiting response from server at: 
http://127.0.0.1:34569/_/jv

Stack Trace:
org.apache.solr.client.solrj.SolrServerException: Timeout occured while waiting 
response from server at: http://127.0.0.1:34569/_/jv
at 
__randomizedtesting.SeedInfo.seed([5A9F539362A20559:D2CB6C49CC5E68A1]:0)
at 
org.apache.solr.client.solrj.impl.HttpSolrClient.executeMethod(HttpSolrClient.java:654)
at 
org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:255)
at 
org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:244)
at 
org.apache.solr.client.solrj.impl.LBHttpSolrClient.doRequest(LBHttpSolrClient.java:483)
at 
org.apache.solr.client.solrj.impl.LBHttpSolrClient.request(LBHttpSolrClient.java:413)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.sendRequest(CloudSolrClient.java:1109)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:886)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.request(CloudSolrClient.java:819)
at 
org.apache.solr.client.solrj.SolrRequest.process(SolrRequest.java:194)
at 
org.apache.solr.client.solrj.SolrRequest.process(SolrRequest.java:211)
at 
org.apache.solr.cloud.UnloadDistributedZkTest.testCoreUnloadAndLeaders(UnloadDistributedZkTest.java:307)
at 
org.apache.solr.cloud.UnloadDistributedZkTest.test(UnloadDistributedZkTest.java:70)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1737)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:934)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:970)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:984)
at 
org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsFixedStatement.callStatement(BaseDistributedSearchTestCase.java:1008)
at 
org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsStatement.evaluate(BaseDistributedSearchTestCase.java:983)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:943)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:829)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:879)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:890)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
[jira] [Commented] (LUCENE-8461) Add a Lucene80Codec

2018-08-23 Thread Steve Rowe (JIRA)


[ 
https://issues.apache.org/jira/browse/LUCENE-8461?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16590749#comment-16590749
 ] 

Steve Rowe commented on LUCENE-8461:


Failing tests from 
[https://builds.apache.org/job/Lucene-Solr-SmokeRelease-master/1105/]:

{noformat}
Checking out Revision 025350ea12f648b8f5864a0ba6ef85ddff577a2a 
(refs/remotes/origin/master)
[...]
   [smoker][junit4]   2> NOTE: reproduce with: ant test  
-Dtestcase=TestCodecSupport -Dtests.method=testDynamicFieldsDocValuesFormats 
-Dtests.seed=6E6A7BAA240147A -Dtests.multiplier=2 -Dtests.locale=mt 
-Dtests.timezone=Etc/UTC -Dtests.asserts=true -Dtests.file.encoding=UTF-8
   [smoker][junit4] ERROR   0.03s J0 | 
TestCodecSupport.testDynamicFieldsDocValuesFormats <<<
   [smoker][junit4]> Throwable #1: java.lang.IllegalArgumentException: 
An SPI class of type org.apache.lucene.codecs.DocValuesFormat with name 
'Lucene80' does not exist.  You need to add the corresponding JAR file 
supporting this SPI to your classpath.  The current classpath supports the 
following names: [Asserting, Direct, Lucene70]
   [smoker][junit4]>at 
__randomizedtesting.SeedInfo.seed([6E6A7BAA240147A:E635F97CB4DEA22E]:0)
   [smoker][junit4]>at 
org.apache.lucene.util.NamedSPILoader.lookup(NamedSPILoader.java:116)
   [smoker][junit4]>at 
org.apache.lucene.codecs.DocValuesFormat.forName(DocValuesFormat.java:108)
   [smoker][junit4]>at 
org.apache.solr.core.SchemaCodecFactory$1.getDocValuesFormatForField(SchemaCodecFactory.java:112)
   [smoker][junit4]>at 
org.apache.lucene.codecs.lucene80.Lucene80Codec$2.getDocValuesFormatForField(Lucene80Codec.java:73)
   [smoker][junit4]>at 
org.apache.solr.core.TestCodecSupport.testDynamicFieldsDocValuesFormats(TestCodecSupport.java:85)
   [smoker][junit4]>at java.lang.Thread.run(Thread.java:748)
[...]
   [smoker][junit4] ERROR   0.63s J2 | 
TestManagedSchema.testAddWithSchemaCodecFactory <<<
   [smoker][junit4]> Throwable #1: 
org.apache.solr.common.SolrException: An SPI class of type 
org.apache.lucene.codecs.DocValuesFormat with name 'Lucene80' does not exist.  
You need to add the corresponding JAR file supporting this SPI to your 
classpath.  The current classpath supports the following names: [Asserting, 
Direct, Lucene70]
   [smoker][junit4]>at 
__randomizedtesting.SeedInfo.seed([6E6A7BAA240147A:4F93581D2A6A1328]:0)
   [smoker][junit4]>at 
org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:219)
   [smoker][junit4]>at 
org.apache.solr.core.SolrCore.execute(SolrCore.java:2541)
   [smoker][junit4]>at 
org.apache.solr.servlet.DirectSolrConnection.request(DirectSolrConnection.java:125)
   [smoker][junit4]>at 
org.apache.solr.util.TestHarness.update(TestHarness.java:284)
   [smoker][junit4]>at 
org.apache.solr.util.BaseTestHarness.checkUpdateStatus(BaseTestHarness.java:274)
   [smoker][junit4]>at 
org.apache.solr.util.BaseTestHarness.validateUpdate(BaseTestHarness.java:244)
   [smoker][junit4]>at 
org.apache.solr.SolrTestCaseJ4.checkUpdateU(SolrTestCaseJ4.java:864)
   [smoker][junit4]>at 
org.apache.solr.SolrTestCaseJ4.assertU(SolrTestCaseJ4.java:843)
   [smoker][junit4]>at 
org.apache.solr.SolrTestCaseJ4.assertU(SolrTestCaseJ4.java:837)
   [smoker][junit4]>at 
org.apache.solr.schema.TestManagedSchema.testAddWithSchemaCodecFactory(TestManagedSchema.java:382)
   [smoker][junit4]>at java.lang.Thread.run(Thread.java:748)
   [smoker][junit4]> Caused by: java.lang.IllegalArgumentException: An 
SPI class of type org.apache.lucene.codecs.DocValuesFormat with name 'Lucene80' 
does not exist.  You need to add the corresponding JAR file supporting this SPI 
to your classpath.  The current classpath supports the following names: 
[Asserting, Direct, Lucene70]
   [smoker][junit4]>at 
org.apache.lucene.util.NamedSPILoader.lookup(NamedSPILoader.java:116)
   [smoker][junit4]>at 
org.apache.lucene.codecs.DocValuesFormat.forName(DocValuesFormat.java:108)
   [smoker][junit4]>at 
org.apache.solr.core.SchemaCodecFactory$1.getDocValuesFormatForField(SchemaCodecFactory.java:112)
   [smoker][junit4]>at 
org.apache.lucene.codecs.lucene80.Lucene80Codec$2.getDocValuesFormatForField(Lucene80Codec.java:73)
   [smoker][junit4]>at 
org.apache.lucene.codecs.perfield.PerFieldDocValuesFormat$FieldsWriter.getInstance(PerFieldDocValuesFormat.java:168)
   [smoker][junit4]>at 
org.apache.lucene.codecs.perfield.PerFieldDocValuesFormat$FieldsWriter.addSortedField(PerFieldDocValuesFormat.java:119)
   [smoker][junit4]>at 
org.apache.lucene.index.SortedDocValuesWriter.flush(SortedDocValuesWriter.java:163)
   [smoker][junit4]>  

[JENKINS] Lucene-Solr-repro - Build # 1288 - Unstable

2018-08-23 Thread Apache Jenkins Server
Build: https://builds.apache.org/job/Lucene-Solr-repro/1288/

[...truncated 33 lines...]
[repro] Jenkins log URL: 
https://builds.apache.org/job/Lucene-Solr-Tests-master/2735/consoleText

[repro] Revision: 8cde1277ec7151bd6ab62950ac93cbdd6ff04d9f

[repro] Repro line:  ant test  -Dtestcase=SearchRateTriggerIntegrationTest 
-Dtests.method=testBelowSearchRate -Dtests.seed=91FBED301929B020 
-Dtests.multiplier=2 -Dtests.slow=true -Dtests.locale=ga-IE 
-Dtests.timezone=Pacific/Wallis -Dtests.asserts=true -Dtests.file.encoding=UTF-8

[repro] Repro line:  ant test  -Dtestcase=TestLargeCluster 
-Dtests.method=testSearchRate -Dtests.seed=91FBED301929B020 
-Dtests.multiplier=2 -Dtests.slow=true 
-Dtests.locale=th-TH-u-nu-thai-x-lvariant-TH -Dtests.timezone=Pacific/Galapagos 
-Dtests.asserts=true -Dtests.file.encoding=UTF-8

[repro] git rev-parse --abbrev-ref HEAD
[repro] git rev-parse HEAD
[repro] Initial local git branch/revision: 
95cb7aa491f5659084852ec29f52cc90cd7ea35c
[repro] git fetch
[repro] git checkout 8cde1277ec7151bd6ab62950ac93cbdd6ff04d9f

[...truncated 2 lines...]
[repro] git merge --ff-only

[...truncated 1 lines...]
[repro] ant clean

[...truncated 6 lines...]
[repro] Test suites by module:
[repro]solr/core
[repro]   TestLargeCluster
[repro]   SearchRateTriggerIntegrationTest
[repro] ant compile-test

[...truncated 3391 lines...]
[repro] ant test-nocompile -Dtests.dups=5 -Dtests.maxfailures=10 
-Dtests.class="*.TestLargeCluster|*.SearchRateTriggerIntegrationTest" 
-Dtests.showOutput=onerror  -Dtests.seed=91FBED301929B020 -Dtests.multiplier=2 
-Dtests.slow=true -Dtests.locale=th-TH-u-nu-thai-x-lvariant-TH 
-Dtests.timezone=Pacific/Galapagos -Dtests.asserts=true 
-Dtests.file.encoding=UTF-8

[...truncated 137498 lines...]
[repro] Setting last failure code to 256

[repro] Failures:
[repro]   0/5 failed: 
org.apache.solr.cloud.autoscaling.SearchRateTriggerIntegrationTest
[repro]   1/5 failed: org.apache.solr.cloud.autoscaling.sim.TestLargeCluster
[repro] git checkout 95cb7aa491f5659084852ec29f52cc90cd7ea35c

[...truncated 2 lines...]
[repro] Exiting with code 256

[...truncated 5 lines...]

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org

[JENKINS] Lucene-Solr-SmokeRelease-master - Build # 1105 - Still Failing

2018-08-23 Thread Apache Jenkins Server
Build: https://builds.apache.org/job/Lucene-Solr-SmokeRelease-master/1105/

No tests ran.

Build Log:
[...truncated 23221 lines...]
[asciidoctor:convert] asciidoctor: ERROR: about-this-guide.adoc: line 1: 
invalid part, must have at least one section (e.g., chapter, appendix, etc.)
[asciidoctor:convert] asciidoctor: ERROR: solr-glossary.adoc: line 1: invalid 
part, must have at least one section (e.g., chapter, appendix, etc.)
 [java] Processed 2278 links (1830 relative) to 3141 anchors in 247 files
 [echo] Validated Links & Anchors via: 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/solr/build/solr-ref-guide/bare-bones-html/

-dist-changes:
 [copy] Copying 4 files to 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/solr/package/changes

-dist-keys:
  [get] Getting: http://home.apache.org/keys/group/lucene.asc
  [get] To: 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/solr/package/KEYS

package:

-unpack-solr-tgz:

-ensure-solr-tgz-exists:
[mkdir] Created dir: 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/solr/build/solr.tgz.unpacked
[untar] Expanding: 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/solr/package/solr-8.0.0.tgz
 into 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/solr/build/solr.tgz.unpacked

generate-maven-artifacts:

resolve:

resolve:

ivy-availability-check:
[loadresource] Do not set property disallowed.ivy.jars.list as its length is 0.

-ivy-fail-disallowed-ivy-version:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/lucene/top-level-ivy-settings.xml

resolve:

ivy-availability-check:
[loadresource] Do not set property disallowed.ivy.jars.list as its length is 0.

-ivy-fail-disallowed-ivy-version:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/lucene/top-level-ivy-settings.xml

resolve:

resolve:

ivy-availability-check:
[loadresource] Do not set property disallowed.ivy.jars.list as its length is 0.

-ivy-fail-disallowed-ivy-version:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/lucene/top-level-ivy-settings.xml

resolve:

ivy-availability-check:
[loadresource] Do not set property disallowed.ivy.jars.list as its length is 0.

-ivy-fail-disallowed-ivy-version:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/lucene/top-level-ivy-settings.xml

resolve:

ivy-availability-check:
[loadresource] Do not set property disallowed.ivy.jars.list as its length is 0.

-ivy-fail-disallowed-ivy-version:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/lucene/top-level-ivy-settings.xml

resolve:

ivy-availability-check:
[loadresource] Do not set property disallowed.ivy.jars.list as its length is 0.

-ivy-fail-disallowed-ivy-version:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/lucene/top-level-ivy-settings.xml

resolve:

ivy-availability-check:
[loadresource] Do not set property disallowed.ivy.jars.list as its length is 0.

-ivy-fail-disallowed-ivy-version:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/lucene/top-level-ivy-settings.xml

resolve:

ivy-availability-check:
[loadresource] Do not set property disallowed.ivy.jars.list as its length is 0.

-ivy-fail-disallowed-ivy-version:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/lucene/top-level-ivy-settings.xml

resolve:

ivy-availability-check:
[loadresource] Do not set property disallowed.ivy.jars.list as its length is 0.

-ivy-fail-disallowed-ivy-version:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/lucene/top-level-ivy-settings.xml

resolve:

ivy-availability-check:
[loadresource] Do not set property disallowed.ivy.jars.list as its length is 0.

-ivy-fail-disallowed-ivy-version:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/lucene/top-level-ivy-settings.xml

resolve:

ivy-availability-check:
[loadresource] Do not set property disallowed.ivy.jars.list as its length is 0.

-ivy-fail-disallowed-ivy-version:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = 

[JENKINS] Lucene-Solr-BadApples-Tests-7.x - Build # 139 - Still Unstable

2018-08-23 Thread Apache Jenkins Server
Build: https://builds.apache.org/job/Lucene-Solr-BadApples-Tests-7.x/139/

8 tests failed.
FAILED:  org.apache.solr.cloud.HttpPartitionTest.test

Error Message:
Didn't see all replicas for shard shard1 in c8n_1x2 come up within 9 ms! 
ClusterState: {   "collMinRf_1x3":{ "pullReplicas":"0", 
"replicationFactor":"0", "shards":{"shard1":{ 
"range":"8000-7fff", "state":"active", "replicas":{ 
  "core_node4":{ "core":"collMinRf_1x3_shard1_replica_t1",  
   "base_url":"http://127.0.0.1:37964", 
"node_name":"127.0.0.1:37964_", "state":"active", 
"type":"TLOG", "leader":"true"},   "core_node5":{   
  "core":"collMinRf_1x3_shard1_replica_t2", 
"base_url":"http://127.0.0.1:43278", 
"node_name":"127.0.0.1:43278_", "state":"active", 
"type":"TLOG"},   "core_node6":{ 
"core":"collMinRf_1x3_shard1_replica_t3", 
"base_url":"http://127.0.0.1:38337", 
"node_name":"127.0.0.1:38337_", "state":"active", 
"type":"TLOG", "router":{"name":"compositeId"}, 
"maxShardsPerNode":"1", "autoAddReplicas":"false", "nrtReplicas":"0",   
  "tlogReplicas":"3"},   "collection1":{ "pullReplicas":"0", 
"replicationFactor":"1", "shards":{   "shard1":{ 
"range":"8000-", "state":"active", 
"replicas":{"core_node44":{ 
"core":"collection1_shard1_replica_t43", 
"base_url":"http://127.0.0.1:38337", 
"node_name":"127.0.0.1:38337_", "state":"active", 
"type":"TLOG", "leader":"true"}}},   "shard2":{ 
"range":"0-7fff", "state":"active", "replicas":{   
"core_node42":{ "core":"collection1_shard2_replica_t41",
 "base_url":"http://127.0.0.1:37964", 
"node_name":"127.0.0.1:37964_", "state":"active", 
"type":"TLOG", "leader":"true"},   "core_node46":{  
   "core":"collection1_shard2_replica_t45", 
"base_url":"http://127.0.0.1:43278", 
"node_name":"127.0.0.1:43278_", "state":"active", 
"type":"TLOG", "router":{"name":"compositeId"}, 
"maxShardsPerNode":"1", "autoAddReplicas":"false", "nrtReplicas":"1",   
  "tlogReplicas":"0"},   "control_collection":{ "pullReplicas":"0", 
"replicationFactor":"1", "shards":{"shard1":{ 
"range":"8000-7fff", "state":"active", 
"replicas":{"core_node2":{ 
"core":"control_collection_shard1_replica_n1", 
"base_url":"http://127.0.0.1:46783", 
"node_name":"127.0.0.1:46783_", "state":"active", 
"type":"NRT", "leader":"true", 
"router":{"name":"compositeId"}, "maxShardsPerNode":"1", 
"autoAddReplicas":"false", "nrtReplicas":"1", "tlogReplicas":"0"},   
"c8n_1x2":{ "pullReplicas":"0", "replicationFactor":"0", 
"shards":{"shard1":{ "range":"8000-7fff", 
"state":"active", "replicas":{   "core_node3":{ 
"core":"c8n_1x2_shard1_replica_t1", 
"base_url":"http://127.0.0.1:46783", 
"node_name":"127.0.0.1:46783_", "state":"active", 
"type":"TLOG", "leader":"true"},   "core_node4":{   
  "core":"c8n_1x2_shard1_replica_t2", 
"base_url":"http://127.0.0.1:43278", 
"node_name":"127.0.0.1:43278_", "state":"recovering", 
"type":"TLOG", "router":{"name":"compositeId"}, 
"maxShardsPerNode":"1", "autoAddReplicas":"false", "nrtReplicas":"0",   
  "tlogReplicas":"2"}}

Stack Trace:
java.lang.AssertionError: Didn't see all replicas for shard shard1 in c8n_1x2 
come up within 9 ms! ClusterState: {
  "collMinRf_1x3":{
"pullReplicas":"0",
"replicationFactor":"0",
"shards":{"shard1":{
"range":"8000-7fff",
"state":"active",
"replicas":{
  "core_node4":{
"core":"collMinRf_1x3_shard1_replica_t1",
"base_url":"http://127.0.0.1:37964",
"node_name":"127.0.0.1:37964_",
"state":"active",
"type":"TLOG",
"leader":"true"},
  "core_node5":{
"core":"collMinRf_1x3_shard1_replica_t2",
"base_url":"http://127.0.0.1:43278",
"node_name":"127.0.0.1:43278_",
"state":"active",
"type":"TLOG"},
  "core_node6":{
"core":"collMinRf_1x3_shard1_replica_t3",
"base_url":"http://127.0.0.1:38337",
"node_name":"127.0.0.1:38337_",
"state":"active",
"type":"TLOG",
"router":{"name":"compositeId"},

[jira] [Resolved] (SOLR-12590) Improve Solr resource loader coverage in the ref guide

2018-08-23 Thread Steve Rowe (JIRA)


 [ 
https://issues.apache.org/jira/browse/SOLR-12590?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Steve Rowe resolved SOLR-12590.
---
   Resolution: Fixed
Fix Version/s: 7.5
   master (8.0)

Committed.  Thanks [~ctargett] and [~cpoerschke]!

> Improve Solr resource loader coverage in the ref guide
> --
>
> Key: SOLR-12590
> URL: https://issues.apache.org/jira/browse/SOLR-12590
> Project: Solr
>  Issue Type: Task
>  Security Level: Public(Default Security Level. Issues are Public) 
>  Components: documentation
>Reporter: Steve Rowe
>Assignee: Steve Rowe
>Priority: Major
> Fix For: master (8.0), 7.5
>
> Attachments: SOLR-12590.patch, SOLR-12590.patch
>
>
> In SolrCloud, storing large resources (e.g. binary machine learned models) on 
> the local filesystem should be a viable alternative to increasing ZooKeeper's 
> max file size limit (1MB), but there are undocumented complications.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-12590) Improve Solr resource loader coverage in the ref guide

2018-08-23 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/SOLR-12590?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16590668#comment-16590668
 ] 

ASF subversion and git services commented on SOLR-12590:


Commit 95cb7aa491f5659084852ec29f52cc90cd7ea35c in lucene-solr's branch 
refs/heads/master from [~steve_rowe]
[ https://git-wip-us.apache.org/repos/asf?p=lucene-solr.git;h=95cb7aa ]

SOLR-12590: Improve Solr resource loader coverage in the ref guide


> Improve Solr resource loader coverage in the ref guide
> --
>
> Key: SOLR-12590
> URL: https://issues.apache.org/jira/browse/SOLR-12590
> Project: Solr
>  Issue Type: Task
>  Security Level: Public(Default Security Level. Issues are Public) 
>  Components: documentation
>Reporter: Steve Rowe
>Assignee: Steve Rowe
>Priority: Major
> Attachments: SOLR-12590.patch, SOLR-12590.patch
>
>
> In SolrCloud, storing large resources (e.g. binary machine learned models) on 
> the local filesystem should be a viable alternative to increasing ZooKeeper's 
> max file size limit (1MB), but there are undocumented complications.






[jira] [Commented] (SOLR-12590) Improve Solr resource loader coverage in the ref guide

2018-08-23 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/SOLR-12590?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16590667#comment-16590667
 ] 

ASF subversion and git services commented on SOLR-12590:


Commit 523295666f4a7360f09a30cb006153f8b9c2f9bf in lucene-solr's branch 
refs/heads/branch_7x from [~steve_rowe]
[ https://git-wip-us.apache.org/repos/asf?p=lucene-solr.git;h=5232956 ]

SOLR-12590: Improve Solr resource loader coverage in the ref guide


> Improve Solr resource loader coverage in the ref guide
> --
>
> Key: SOLR-12590
> URL: https://issues.apache.org/jira/browse/SOLR-12590
> Project: Solr
>  Issue Type: Task
>  Security Level: Public(Default Security Level. Issues are Public) 
>  Components: documentation
>Reporter: Steve Rowe
>Assignee: Steve Rowe
>Priority: Major
> Attachments: SOLR-12590.patch, SOLR-12590.patch
>
>
> In SolrCloud, storing large resources (e.g. binary machine learned models) on 
> the local filesystem should be a viable alternative to increasing ZooKeeper's 
> max file size limit (1MB), but there are undocumented complications.






[jira] [Commented] (SOLR-10028) SegmentsInfoRequestHandlerTest.testSegmentInfosVersion fails in master

2018-08-23 Thread JIRA


[ 
https://issues.apache.org/jira/browse/SOLR-10028?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16590616#comment-16590616
 ] 

Tomás Fernández Löbbe commented on SOLR-10028:
--

{quote}If/Since this would check that the correct amount of segment info is 
returned, could a further future test perhaps then specifically verify that 
the returned segment ids are correct?
{quote}
+1. I'll add that.
{quote}If/Since testSegmentInfosData gets...
{quote}
The goal was to quickly assert that the {{numSegments}} calculated made sense. 
That said, I think the {{<=2}} is not right. I beasted this test with my 
modifications, but I believe I'd removed that. I'll check again and fix if 
necessary.

> SegmentsInfoRequestHandlerTest.testSegmentInfosVersion fails in master
> --
>
> Key: SOLR-10028
> URL: https://issues.apache.org/jira/browse/SOLR-10028
> Project: Solr
>  Issue Type: Bug
>  Security Level: Public(Default Security Level. Issues are Public) 
>Reporter: Tomás Fernández Löbbe
>Priority: Minor
> Attachments: SOLR-10028-alternative.patch, SOLR-10028.patch
>
>
> Failed in Jenkins: 
> https://jenkins.thetaphi.de/job/Lucene-Solr-master-Solaris/1092/
> It reproduces consistently in my mac also with the latest master 
> (ca50e5b61c2d8bfb703169cea2fb0ab20fd24c6b):
> {code}
> ant test  -Dtestcase=SegmentsInfoRequestHandlerTest 
> -Dtests.method=testSegmentInfosVersion -Dtests.seed=619B9D838D6F1E29 
> -Dtests.slow=true -Dtests.locale=en-AU -Dtests.timezone=America/Manaus 
> -Dtests.asserts=true -Dtests.file.encoding=ISO-8859-1
> {code}
> There are similar failures in previous Jenkins builds since last month






[jira] [Commented] (SOLR-12572) Reuse fieldvalues computed while sorting at writing in ExportWriter

2018-08-23 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/SOLR-12572?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16590613#comment-16590613
 ] 

ASF subversion and git services commented on SOLR-12572:


Commit dfd2801cd27ccc1e24179cc0ee5768a22bb2e64c in lucene-solr's branch 
refs/heads/master from [~varunthacker]
[ https://git-wip-us.apache.org/repos/asf?p=lucene-solr.git;h=dfd2801 ]

SOLR-12572: While exporting documents using the export writer, if a field is 
specified as a sort parameter and also in the fl (field list) parameter, we 
save on one doc-value lookup. This can bring performance improvements of 15% 
and upwards depending on how many fields are in common


> Reuse fieldvalues computed while sorting at writing in ExportWriter
> ---
>
> Key: SOLR-12572
> URL: https://issues.apache.org/jira/browse/SOLR-12572
> Project: Solr
>  Issue Type: Improvement
>  Security Level: Public(Default Security Level. Issues are Public) 
>  Components: streaming expressions
>Reporter: Amrit Sarkar
>Assignee: Varun Thacker
>Priority: Minor
> Attachments: SOLR-12572.patch, SOLR-12572.patch, SOLR-12572.patch, 
> SOLR-12572.patch, SOLR-12572.patch, SOLR-12572.patch, SOLR-12572.patch, 
> SOLR-12572.patch
>
>
> While exporting result through "/export" handler,
> {code:java}
> http://localhost:8983/solr/core_name/export?q=my-query=severity+desc,timestamp+desc=severity,timestamp,msg
> {code}
> Doc-values are sought for all the {{sort}} fields defined (in this example 
> 'severity', 'timestamp'). When we stream out docs we again make doc-value 
> seeks against the {{fl}} fields ('severity', 'timestamp', 'msg'). 
> In the most common use-cases {{fl}} equals the sort fields, or at least the 
> sort fields are a subset of the {{fl}} fields, so if we can *pre-collect* 
> the values while sorting, we can reduce the doc-value seeks, potentially 
> bringing a *speed improvement*.
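> The pre-collect idea above can be sketched outside Solr. This is a toy 
> illustration, not the actual ExportWriter code: the seek counter, field 
> names, and doc ids are made up, and a plain map stands in for the real 
> per-segment doc-values readers. It only shows why caching sort-field 
> values during the sort phase cuts the number of lookups at write time.
> {code:java}
> import java.util.Arrays;
> import java.util.HashMap;
> import java.util.List;
> import java.util.Map;
> 
> public class ExportValueReuseSketch {
> 
> static int seeks = 0;
> 
> // Stand-in for a per-document doc-values lookup; counts every seek.
> static String seekDocValue(int docId, String field) {
> seeks++;
> return field + "_" + docId;
> }
> 
> // Simulates sort + export and returns the total number of seeks.
> static int run() {
> seeks = 0;
> List<Integer> docs = Arrays.asList(3, 1, 2);
> List<String> sortFields = Arrays.asList("severity", "timestamp");
> List<String> flFields = Arrays.asList("severity", "timestamp", "msg");
> 
> // Sort phase: seek each sort field once per doc and cache the value.
> Map<Integer, Map<String, String>> cache = new HashMap<>();
> for (int doc : docs) {
> Map<String, String> vals = new HashMap<>();
> for (String f : sortFields) {
> vals.put(f, seekDocValue(doc, f));
> }
> cache.put(doc, vals);
> }
> 
> // Write phase: reuse cached sort-field values; only fl fields not
> // covered by the sort (here "msg") need a fresh seek.
> StringBuilder out = new StringBuilder();
> for (int doc : docs) {
> Map<String, String> vals = cache.get(doc);
> for (String f : flFields) {
> String v = vals.containsKey(f) ? vals.get(f) : seekDocValue(doc, f);
> out.append(v).append(' ');
> }
> }
> // 3 docs * 2 sort fields + 3 docs * 1 uncached field = 9 seeks;
> // without reuse the write phase alone would add 3 * 3 = 9 more.
> return seeks;
> }
> 
> public static void main(String[] args) {
> System.out.println("doc-value seeks with reuse: " + run());
> }
> }
> {code}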






[jira] [Commented] (LUCENE-8465) Remove legacy auto-prefix logic from IntersectTermsEnum

2018-08-23 Thread Alan Woodward (JIRA)


[ 
https://issues.apache.org/jira/browse/LUCENE-8465?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16590611#comment-16590611
 ] 

Alan Woodward commented on LUCENE-8465:
---

+1

> Remove legacy auto-prefix logic from IntersectTermsEnum
> ---
>
> Key: LUCENE-8465
> URL: https://issues.apache.org/jira/browse/LUCENE-8465
> Project: Lucene - Core
>  Issue Type: Task
>Reporter: Adrien Grand
>Priority: Minor
> Attachments: LUCENE-8465.patch
>
>
> We forgot to remove some logic related with auto-prefix terms from 
> IntersectTermsEnum.






[JENKINS-EA] Lucene-Solr-7.x-Linux (64bit/jdk-11-ea+25) - Build # 2616 - Still Unstable!

2018-08-23 Thread Policeman Jenkins Server
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-7.x-Linux/2616/
Java: 64bit/jdk-11-ea+25 -XX:-UseCompressedOops -XX:+UseParallelGC

20 tests failed.
FAILED:  junit.framework.TestSuite.org.apache.solr.cloud.DocValuesNotIndexedTest

Error Message:
Collection not found: dv_coll

Stack Trace:
org.apache.solr.common.SolrException: Collection not found: dv_coll
at __randomizedtesting.SeedInfo.seed([1B9B6F3B866F840F]:0)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:853)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.request(CloudSolrClient.java:819)
at 
org.apache.solr.client.solrj.SolrRequest.process(SolrRequest.java:194)
at 
org.apache.solr.cloud.DocValuesNotIndexedTest.createCluster(DocValuesNotIndexedTest.java:155)
at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1737)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:874)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:890)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at java.base/java.lang.Thread.run(Thread.java:834)


FAILED:  junit.framework.TestSuite.org.apache.solr.cloud.DocValuesNotIndexedTest

Error Message:
Collection not found: dv_coll

Stack Trace:
org.apache.solr.common.SolrException: Collection not found: dv_coll
at __randomizedtesting.SeedInfo.seed([1B9B6F3B866F840F]:0)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:853)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.request(CloudSolrClient.java:819)
at 
org.apache.solr.client.solrj.SolrRequest.process(SolrRequest.java:194)
at 
org.apache.solr.cloud.DocValuesNotIndexedTest.createCluster(DocValuesNotIndexedTest.java:155)
at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1737)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:874)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:890)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 

[JENKINS] Lucene-Solr-Tests-7.x - Build # 812 - Still Unstable

2018-08-23 Thread Apache Jenkins Server
Build: https://builds.apache.org/job/Lucene-Solr-Tests-7.x/812/

1 tests failed.
FAILED:  
org.apache.solr.cloud.autoscaling.IndexSizeTriggerTest.testSplitIntegration

Error Message:
events: [CapturedEvent{timestamp=2331071263694816, stage=STARTED, 
actionName='null', event={   "id":"848173861f83eTc4l4k0hv33203zlto3ua1qgov",   
"source":"index_size_trigger",   "eventTime":2331064381077566,   
"eventType":"INDEXSIZE",   "properties":{ "__start__":1, 
"aboveSize":{"testSplitIntegration_collection":["{\"core_node1\":{\n
\"core\":\"testSplitIntegration_collection_shard1_replica_n1\",\n
\"shard\":\"shard1\",\n
\"collection\":\"testSplitIntegration_collection\",\n
\"node_name\":\"127.0.0.1:10001_solr\",\n\"type\":\"NRT\",\n
\"leader\":\"true\",\n\"SEARCHER.searcher.maxDoc\":14,\n
\"SEARCHER.searcher.deletedDocs\":0,\n\"INDEX.sizeInBytes\":17240,\n
\"SEARCHER.searcher.numDocs\":14,\n\"__bytes__\":17240,\n
\"__docs__\":14,\n\"violationType\":\"aboveDocs\",\n
\"state\":\"active\",\n\"INDEX.sizeInGB\":1.605600118637085E-5}}"]}, 
"belowSize":{}, "_enqueue_time_":2331070971138816, 
"requestedOps":["Op{action=SPLITSHARD, hints={COLL_SHARD=[{\n  
\"first\":\"testSplitIntegration_collection\",\n  
\"second\":\"shard1\"}]}}"]}}, context={}, config={   
"trigger":"index_size_trigger",   "stage":[ "STARTED", "ABORTED", 
"SUCCEEDED", "FAILED"],   "beforeAction":[ "compute_plan", 
"execute_plan"],   "afterAction":[ "compute_plan", "execute_plan"],   
"class":"org.apache.solr.cloud.autoscaling.IndexSizeTriggerTest$CapturingTriggerListener"},
 message='null'}, CapturedEvent{timestamp=233107190423, 
stage=BEFORE_ACTION, actionName='compute_plan', event={   
"id":"848173861f83eTc4l4k0hv33203zlto3ua1qgov",   
"source":"index_size_trigger",   "eventTime":2331064381077566,   
"eventType":"INDEXSIZE",   "properties":{ "__start__":1, 
"aboveSize":{"testSplitIntegration_collection":["{\"core_node1\":{\n
\"core\":\"testSplitIntegration_collection_shard1_replica_n1\",\n
\"shard\":\"shard1\",\n
\"collection\":\"testSplitIntegration_collection\",\n
\"node_name\":\"127.0.0.1:10001_solr\",\n\"type\":\"NRT\",\n
\"leader\":\"true\",\n\"SEARCHER.searcher.maxDoc\":14,\n
\"SEARCHER.searcher.deletedDocs\":0,\n\"INDEX.sizeInBytes\":17240,\n
\"SEARCHER.searcher.numDocs\":14,\n\"__bytes__\":17240,\n
\"__docs__\":14,\n\"violationType\":\"aboveDocs\",\n
\"state\":\"active\",\n\"INDEX.sizeInGB\":1.605600118637085E-5}}"]}, 
"belowSize":{}, "_enqueue_time_":2331070971138816, 
"requestedOps":["Op{action=SPLITSHARD, hints={COLL_SHARD=[{\n  
\"first\":\"testSplitIntegration_collection\",\n  
\"second\":\"shard1\"}]}}"]}}, 
context={properties.BEFORE_ACTION=[compute_plan], source=index_size_trigger}, 
config={   "trigger":"index_size_trigger",   "stage":[ "STARTED", 
"ABORTED", "SUCCEEDED", "FAILED"],   "beforeAction":[ 
"compute_plan", "execute_plan"],   "afterAction":[ "compute_plan", 
"execute_plan"],   
"class":"org.apache.solr.cloud.autoscaling.IndexSizeTriggerTest$CapturingTriggerListener"},
 message='null'}, CapturedEvent{timestamp=2331071981575216, stage=AFTER_ACTION, 
actionName='compute_plan', event={   
"id":"848173861f83eTc4l4k0hv33203zlto3ua1qgov",   
"source":"index_size_trigger",   "eventTime":2331064381077566,   
"eventType":"INDEXSIZE",   "properties":{ "__start__":1, 
"aboveSize":{"testSplitIntegration_collection":["{\"core_node1\":{\n
\"core\":\"testSplitIntegration_collection_shard1_replica_n1\",\n
\"shard\":\"shard1\",\n
\"collection\":\"testSplitIntegration_collection\",\n
\"node_name\":\"127.0.0.1:10001_solr\",\n\"type\":\"NRT\",\n
\"leader\":\"true\",\n\"SEARCHER.searcher.maxDoc\":14,\n
\"SEARCHER.searcher.deletedDocs\":0,\n\"INDEX.sizeInBytes\":17240,\n
\"SEARCHER.searcher.numDocs\":14,\n\"__bytes__\":17240,\n
\"__docs__\":14,\n\"violationType\":\"aboveDocs\",\n
\"state\":\"active\",\n\"INDEX.sizeInGB\":1.605600118637085E-5}}"]}, 
"belowSize":{}, "_enqueue_time_":2331070971138816, 
"requestedOps":["Op{action=SPLITSHARD, hints={COLL_SHARD=[{\n  
\"first\":\"testSplitIntegration_collection\",\n  
\"second\":\"shard1\"}]}}"]}}, 
context={properties.operations=[{class=org.apache.solr.client.solrj.request.CollectionAdminRequest$SplitShard,
 method=GET, params.action=SPLITSHARD, 
params.collection=testSplitIntegration_collection, params.shard=shard1}], 
properties.BEFORE_ACTION=[compute_plan], source=index_size_trigger, 
properties.AFTER_ACTION=[compute_plan]}, config={   
"trigger":"index_size_trigger",   "stage":[ "STARTED", "ABORTED", 
"SUCCEEDED", "FAILED"],   "beforeAction":[ "compute_plan", 
"execute_plan"],   "afterAction":[ "compute_plan", "execute_plan"],   

[jira] [Commented] (SOLR-12572) Reuse fieldvalues computed while sorting at writing in ExportWriter

2018-08-23 Thread Varun Thacker (JIRA)


[ 
https://issues.apache.org/jira/browse/SOLR-12572?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16590493#comment-16590493
 ] 

Varun Thacker commented on SOLR-12572:
--

Final patch which makes some minor changes on top of Amrit's patch. 

Added a test, some assertions, and a modified copy method (which is more like a 
clone) in DoubleValue, IntValue, etc.

The numbers Amrit posted look great! Thanks for running these extensive 
benchmarks.

> Reuse fieldvalues computed while sorting at writing in ExportWriter
> ---
>
> Key: SOLR-12572
> URL: https://issues.apache.org/jira/browse/SOLR-12572
> Project: Solr
>  Issue Type: Improvement
>  Security Level: Public(Default Security Level. Issues are Public) 
>  Components: streaming expressions
>Reporter: Amrit Sarkar
>Assignee: Varun Thacker
>Priority: Minor
> Attachments: SOLR-12572.patch, SOLR-12572.patch, SOLR-12572.patch, 
> SOLR-12572.patch, SOLR-12572.patch, SOLR-12572.patch, SOLR-12572.patch, 
> SOLR-12572.patch
>
>
> While exporting results through the "/export" handler,
> {code:java}
> http://localhost:8983/solr/core_name/export?q=my-query&sort=severity+desc,timestamp+desc&fl=severity,timestamp,msg
> {code}
> Doc-values are sought for all the {{sort}} fields defined (in this example 
> 'severity', 'timestamp'). When we stream out docs we again make doc-value 
> seeks against the {{fl}} fields ('severity', 'timestamp', 'msg'). 
> In most common use-cases we have {{fl = sort}} fields, or at least the sort 
> fields are a subset of the {{fl}} fields, so if we can *pre-collect* the 
> values while sorting, we can reduce the doc-value seeks, potentially bringing 
> a *speed improvement*.
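The pre-collect idea can be sketched outside Solr. This is a toy illustration with invented names (it is not ExportWriter's actual code): the value read for each document during the sort comparisons is cached, so writing the docs out reuses the cached values instead of performing a second doc-value seek per document.

```java
import java.util.Arrays;
import java.util.Comparator;
import java.util.HashMap;
import java.util.Map;

public class SortValueCacheSketch {
    // Stands in for a doc-values column; every read counts as one "seek".
    static int seeks = 0;
    static final long[] DOC_VALUES = {42L, 7L, 99L};

    static long seek(int docId) { seeks++; return DOC_VALUES[docId]; }

    public static void main(String[] args) {
        Integer[] docs = {0, 1, 2};
        Map<Integer, Long> cached = new HashMap<>();
        // Sorting: each doc's value is seeked at most once, then cached.
        Arrays.sort(docs, Comparator.comparingLong(
                (Integer d) -> cached.computeIfAbsent(d, SortValueCacheSketch::seek)).reversed());
        int seeksAfterSort = seeks;
        // Writing: reuse the cached values; no further seeks are needed.
        StringBuilder out = new StringBuilder();
        for (int d : docs) out.append(cached.get(d)).append(' ');
        System.out.println(out.toString().trim());   // prints "99 42 7"
        System.out.println(seeks == seeksAfterSort); // prints "true"
    }
}
```

This only pays off cleanly when the sort fields are a subset of the {{fl}} fields, which is the common case the issue describes.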



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Updated] (SOLR-12572) Reuse fieldvalues computed while sorting at writing in ExportWriter

2018-08-23 Thread Varun Thacker (JIRA)


 [ 
https://issues.apache.org/jira/browse/SOLR-12572?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Varun Thacker updated SOLR-12572:
-
Attachment: SOLR-12572.patch

> Reuse fieldvalues computed while sorting at writing in ExportWriter
> ---
>
> Key: SOLR-12572
> URL: https://issues.apache.org/jira/browse/SOLR-12572
> Project: Solr
>  Issue Type: Improvement
>  Security Level: Public(Default Security Level. Issues are Public) 
>  Components: streaming expressions
>Reporter: Amrit Sarkar
>Assignee: Varun Thacker
>Priority: Minor
> Attachments: SOLR-12572.patch, SOLR-12572.patch, SOLR-12572.patch, 
> SOLR-12572.patch, SOLR-12572.patch, SOLR-12572.patch, SOLR-12572.patch, 
> SOLR-12572.patch
>
>
> While exporting results through the "/export" handler,
> {code:java}
> http://localhost:8983/solr/core_name/export?q=my-query&sort=severity+desc,timestamp+desc&fl=severity,timestamp,msg
> {code}
> Doc-values are sought for all the {{sort}} fields defined (in this example 
> 'severity', 'timestamp'). When we stream out docs we again make doc-value 
> seeks against the {{fl}} fields ('severity', 'timestamp', 'msg'). 
> In most common use-cases we have {{fl = sort}} fields, or at least the sort 
> fields are a subset of the {{fl}} fields, so if we can *pre-collect* the 
> values while sorting, we can reduce the doc-value seeks, potentially bringing 
> a *speed improvement*.






[JENKINS] Lucene-Solr-master-Linux (64bit/jdk-10) - Build # 22730 - Still Unstable!

2018-08-23 Thread Policeman Jenkins Server
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-master-Linux/22730/
Java: 64bit/jdk-10 -XX:+UseCompressedOops -XX:+UseConcMarkSweepGC

18 tests failed.
FAILED:  org.apache.solr.core.TestCodecSupport.testMixedCompressionMode

Error Message:
An SPI class of type org.apache.lucene.codecs.DocValuesFormat with name 
'Lucene80' does not exist.  You need to add the corresponding JAR file 
supporting this SPI to your classpath.  The current classpath supports the 
following names: [Asserting, Direct, Lucene70]

Stack Trace:
org.apache.solr.common.SolrException: An SPI class of type 
org.apache.lucene.codecs.DocValuesFormat with name 'Lucene80' does not exist.  
You need to add the corresponding JAR file supporting this SPI to your 
classpath.  The current classpath supports the following names: [Asserting, 
Direct, Lucene70]
at 
__randomizedtesting.SeedInfo.seed([4D6570221AA8D6A6:931007465FCA2317]:0)
at 
org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:219)
at org.apache.solr.core.SolrCore.execute(SolrCore.java:2541)
at 
org.apache.solr.servlet.DirectSolrConnection.request(DirectSolrConnection.java:125)
at org.apache.solr.util.TestHarness.update(TestHarness.java:284)
at 
org.apache.solr.util.BaseTestHarness.checkUpdateStatus(BaseTestHarness.java:274)
at 
org.apache.solr.util.BaseTestHarness.validateUpdate(BaseTestHarness.java:244)
at org.apache.solr.SolrTestCaseJ4.checkUpdateU(SolrTestCaseJ4.java:864)
at org.apache.solr.SolrTestCaseJ4.assertU(SolrTestCaseJ4.java:843)
at org.apache.solr.SolrTestCaseJ4.assertU(SolrTestCaseJ4.java:837)
at 
org.apache.solr.core.TestCodecSupport.testMixedCompressionMode(TestCodecSupport.java:140)
at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:564)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1737)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:934)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:970)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:984)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:943)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:829)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:879)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:890)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 

[JENKINS] Lucene-Solr-repro - Build # 1286 - Unstable

2018-08-23 Thread Apache Jenkins Server
Build: https://builds.apache.org/job/Lucene-Solr-repro/1286/

[...truncated 28 lines...]
[repro] Jenkins log URL: 
https://builds.apache.org/job/Lucene-Solr-NightlyTests-7.x/299/consoleText

[repro] Revision: b5e79d0db0a884c454f8b6002cbabda5e82ee391

[repro] Ant options: -Dtests.multiplier=2 
-Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-7.x/test-data/enwiki.random.lines.txt
[repro] Repro line:  ant test  -Dtestcase=SoftAutoCommitTest 
-Dtests.method=testHardCommitWithinAndSoftCommitMaxTimeRapidAdds 
-Dtests.seed=52D521B628CD1D84 -Dtests.multiplier=2 -Dtests.nightly=true 
-Dtests.slow=true 
-Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-7.x/test-data/enwiki.random.lines.txt
 -Dtests.locale=en-NZ -Dtests.timezone=America/Anguilla -Dtests.asserts=true 
-Dtests.file.encoding=ISO-8859-1

[repro] Repro line:  ant test  -Dtestcase=SearchRateTriggerIntegrationTest 
-Dtests.method=testBelowSearchRate -Dtests.seed=52D521B628CD1D84 
-Dtests.multiplier=2 -Dtests.nightly=true -Dtests.slow=true 
-Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-7.x/test-data/enwiki.random.lines.txt
 -Dtests.locale=fr -Dtests.timezone=Turkey -Dtests.asserts=true 
-Dtests.file.encoding=ISO-8859-1

[repro] Repro line:  ant test  -Dtestcase=HdfsBasicDistributedZkTest 
-Dtests.method=test -Dtests.seed=52D521B628CD1D84 -Dtests.multiplier=2 
-Dtests.nightly=true -Dtests.slow=true 
-Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-7.x/test-data/enwiki.random.lines.txt
 -Dtests.locale=fi -Dtests.timezone=Pacific/Wallis -Dtests.asserts=true 
-Dtests.file.encoding=ISO-8859-1

[repro] Repro line:  ant test  -Dtestcase=HdfsBasicDistributedZkTest 
-Dtests.seed=52D521B628CD1D84 -Dtests.multiplier=2 -Dtests.nightly=true 
-Dtests.slow=true 
-Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-7.x/test-data/enwiki.random.lines.txt
 -Dtests.locale=fi -Dtests.timezone=Pacific/Wallis -Dtests.asserts=true 
-Dtests.file.encoding=ISO-8859-1

[repro] Repro line:  ant test  -Dtestcase=TestPolicy 
-Dtests.method=testWithCollectionMoveReplica -Dtests.seed=37C8C527A51666F1 
-Dtests.multiplier=2 -Dtests.nightly=true -Dtests.slow=true 
-Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-7.x/test-data/enwiki.random.lines.txt
 -Dtests.locale=ja -Dtests.timezone=Europe/Saratov -Dtests.asserts=true 
-Dtests.file.encoding=UTF-8

[repro] Repro line:  ant test  -Dtestcase=TestPolicy 
-Dtests.method=testComputePlanAfterNodeAdded -Dtests.seed=37C8C527A51666F1 
-Dtests.multiplier=2 -Dtests.nightly=true -Dtests.slow=true 
-Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-7.x/test-data/enwiki.random.lines.txt
 -Dtests.locale=ja -Dtests.timezone=Europe/Saratov -Dtests.asserts=true 
-Dtests.file.encoding=UTF-8

[repro] Repro line:  ant test  -Dtestcase=TestPolicy 
-Dtests.method=testCoresSuggestions -Dtests.seed=37C8C527A51666F1 
-Dtests.multiplier=2 -Dtests.nightly=true -Dtests.slow=true 
-Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-7.x/test-data/enwiki.random.lines.txt
 -Dtests.locale=ja -Dtests.timezone=Europe/Saratov -Dtests.asserts=true 
-Dtests.file.encoding=UTF-8

[repro] Repro line:  ant test  -Dtestcase=TestPolicy 
-Dtests.method=testDiskSpaceHint -Dtests.seed=37C8C527A51666F1 
-Dtests.multiplier=2 -Dtests.nightly=true -Dtests.slow=true 
-Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-7.x/test-data/enwiki.random.lines.txt
 -Dtests.locale=ja -Dtests.timezone=Europe/Saratov -Dtests.asserts=true 
-Dtests.file.encoding=UTF-8

[repro] Repro line:  ant test  -Dtestcase=TestPolicy 
-Dtests.method=testMultiReplicaPlacement -Dtests.seed=37C8C527A51666F1 
-Dtests.multiplier=2 -Dtests.nightly=true -Dtests.slow=true 
-Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-7.x/test-data/enwiki.random.lines.txt
 -Dtests.locale=ja -Dtests.timezone=Europe/Saratov -Dtests.asserts=true 
-Dtests.file.encoding=UTF-8

[repro] Repro line:  ant test  -Dtestcase=TestPolicy 
-Dtests.method=testConditionsSort -Dtests.seed=37C8C527A51666F1 
-Dtests.multiplier=2 -Dtests.nightly=true -Dtests.slow=true 
-Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-7.x/test-data/enwiki.random.lines.txt
 -Dtests.locale=ja -Dtests.timezone=Europe/Saratov -Dtests.asserts=true 
-Dtests.file.encoding=UTF-8

[repro] Repro line:  ant test  -Dtestcase=TestPolicy 
-Dtests.method=testPortSuggestions -Dtests.seed=37C8C527A51666F1 
-Dtests.multiplier=2 -Dtests.nightly=true -Dtests.slow=true 
-Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-7.x/test-data/enwiki.random.lines.txt
 -Dtests.locale=ja -Dtests.timezone=Europe/Saratov -Dtests.asserts=true 
-Dtests.file.encoding=UTF-8

[repro] Repro line:  ant test  

[jira] [Updated] (LUCENE-8465) Remove legacy auto-prefix logic from IntersectTermsEnum

2018-08-23 Thread Adrien Grand (JIRA)


 [ 
https://issues.apache.org/jira/browse/LUCENE-8465?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Adrien Grand updated LUCENE-8465:
-
Attachment: LUCENE-8465.patch

> Remove legacy auto-prefix logic from IntersectTermsEnum
> ---
>
> Key: LUCENE-8465
> URL: https://issues.apache.org/jira/browse/LUCENE-8465
> Project: Lucene - Core
>  Issue Type: Task
>Reporter: Adrien Grand
>Priority: Minor
> Attachments: LUCENE-8465.patch
>
>
> We forgot to remove some logic related to auto-prefix terms from 
> IntersectTermsEnum.






[jira] [Commented] (LUCENE-8465) Remove legacy auto-prefix logic from IntersectTermsEnum

2018-08-23 Thread Adrien Grand (JIRA)


[ 
https://issues.apache.org/jira/browse/LUCENE-8465?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16590389#comment-16590389
 ] 

Adrien Grand commented on LUCENE-8465:
--

Here is a patch.

> Remove legacy auto-prefix logic from IntersectTermsEnum
> ---
>
> Key: LUCENE-8465
> URL: https://issues.apache.org/jira/browse/LUCENE-8465
> Project: Lucene - Core
>  Issue Type: Task
>Reporter: Adrien Grand
>Priority: Minor
> Attachments: LUCENE-8465.patch
>
>
> We forgot to remove some logic related to auto-prefix terms from 
> IntersectTermsEnum.






[jira] [Created] (LUCENE-8465) Remove legacy auto-prefix logic from IntersectTermsEnum

2018-08-23 Thread Adrien Grand (JIRA)
Adrien Grand created LUCENE-8465:


 Summary: Remove legacy auto-prefix logic from IntersectTermsEnum
 Key: LUCENE-8465
 URL: https://issues.apache.org/jira/browse/LUCENE-8465
 Project: Lucene - Core
  Issue Type: Task
Reporter: Adrien Grand


We forgot to remove some logic related to auto-prefix terms from 
IntersectTermsEnum.






[jira] [Resolved] (LUCENE-8461) Add a Lucene80Codec

2018-08-23 Thread Adrien Grand (JIRA)


 [ 
https://issues.apache.org/jira/browse/LUCENE-8461?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Adrien Grand resolved LUCENE-8461.
--
   Resolution: Fixed
Fix Version/s: master (8.0)

> Add a Lucene80Codec
> ---
>
> Key: LUCENE-8461
> URL: https://issues.apache.org/jira/browse/LUCENE-8461
> Project: Lucene - Core
>  Issue Type: Improvement
>Reporter: Adrien Grand
>Priority: Minor
> Fix For: master (8.0)
>
> Attachments: LUCENE-8461.patch
>
>
> Even though things would work if we kept using the current Lucene70Codec, I'd 
> like to create a new Lucene80Codec in order to make reasoning about what code 
> can be removed easier when we remove support for Lucene 7.x, and also to 
> highlight the fact that as of 8.0 postings record impacts in the skip lists.






[JENKINS] Lucene-Solr-Tests-master - Build # 2735 - Still Unstable

2018-08-23 Thread Apache Jenkins Server
Build: https://builds.apache.org/job/Lucene-Solr-Tests-master/2735/

2 tests failed.
FAILED:  
org.apache.solr.cloud.autoscaling.SearchRateTriggerIntegrationTest.testBelowSearchRate

Error Message:
[Op{action=DELETEREPLICA, hints={COLL_SHARD=[{   
"first":"belowRate_collection",   "second":"shard1"}], REPLICA=[core_node6]}}, 
Op{action=DELETEREPLICA, hints={COLL_SHARD=[{   "first":"belowRate_collection", 
  "second":"shard1"}], REPLICA=[core_node8]}}, Op{action=DELETEREPLICA, 
hints={COLL_SHARD=[{   "first":"belowRate_collection",   "second":"shard1"}], 
REPLICA=[core_node3]}}, Op{action=NONE, 
hints={SRC_NODE=[127.0.0.1:41528_solr]}}, Op{action=NONE, 
hints={SRC_NODE=[127.0.0.1:44071_solr]}}, Op{action=NONE, 
hints={SRC_NODE=[127.0.0.1:34396_solr]}}, Op{action=NONE, 
hints={SRC_NODE=[127.0.0.1:45318_solr]}}, Op{action=NONE, 
hints={SRC_NODE=[127.0.0.1:44972_solr]}}] expected:<7> but was:<8>

Stack Trace:
java.lang.AssertionError: [Op{action=DELETEREPLICA, hints={COLL_SHARD=[{
  "first":"belowRate_collection",
  "second":"shard1"}], REPLICA=[core_node6]}}, Op{action=DELETEREPLICA, 
hints={COLL_SHARD=[{
  "first":"belowRate_collection",
  "second":"shard1"}], REPLICA=[core_node8]}}, Op{action=DELETEREPLICA, 
hints={COLL_SHARD=[{
  "first":"belowRate_collection",
  "second":"shard1"}], REPLICA=[core_node3]}}, Op{action=NONE, 
hints={SRC_NODE=[127.0.0.1:41528_solr]}}, Op{action=NONE, 
hints={SRC_NODE=[127.0.0.1:44071_solr]}}, Op{action=NONE, 
hints={SRC_NODE=[127.0.0.1:34396_solr]}}, Op{action=NONE, 
hints={SRC_NODE=[127.0.0.1:45318_solr]}}, Op{action=NONE, 
hints={SRC_NODE=[127.0.0.1:44972_solr]}}] expected:<7> but was:<8>
at 
__randomizedtesting.SeedInfo.seed([91FBED301929B020:FB564B5041C13533]:0)
at org.junit.Assert.fail(Assert.java:93)
at org.junit.Assert.failNotEquals(Assert.java:647)
at org.junit.Assert.assertEquals(Assert.java:128)
at org.junit.Assert.assertEquals(Assert.java:472)
at 
org.apache.solr.cloud.autoscaling.SearchRateTriggerIntegrationTest.testBelowSearchRate(SearchRateTriggerIntegrationTest.java:398)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1737)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:934)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:970)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:984)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:943)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:829)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:879)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:890)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
at 

[jira] [Commented] (LUCENE-8461) Add a Lucene80Codec

2018-08-23 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/LUCENE-8461?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16590298#comment-16590298
 ] 

ASF subversion and git services commented on LUCENE-8461:
-

Commit 025350ea12f648b8f5864a0ba6ef85ddff577a2a in lucene-solr's branch 
refs/heads/master from [~jpountz]
[ https://git-wip-us.apache.org/repos/asf?p=lucene-solr.git;h=025350e ]

LUCENE-8461: Add Lucene80Codec.


> Add a Lucene80Codec
> ---
>
> Key: LUCENE-8461
> URL: https://issues.apache.org/jira/browse/LUCENE-8461
> Project: Lucene - Core
>  Issue Type: Improvement
>Reporter: Adrien Grand
>Priority: Minor
> Attachments: LUCENE-8461.patch
>
>
> Even though things would work if we kept using the current Lucene70Codec, I'd 
> like to create a new Lucene80Codec in order to make reasoning about what code 
> can be removed easier when we remove support for Lucene 7.x, and also to 
> highlight the fact that as of 8.0 postings record impacts in the skip lists.






[jira] [Commented] (SOLR-12519) Support Deeply Nested Docs In Child Documents Transformer

2018-08-23 Thread David Smiley (JIRA)


[ 
https://issues.apache.org/jira/browse/SOLR-12519?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16590297#comment-16590297
 ] 

David Smiley commented on SOLR-12519:
-

Next time I suggest pushing a new commit to the PR so that it's easy for me to 
see what you changed. I used a diff tool to compare your latest patch file with 
the one I uploaded before. BTW your latest patch includes tons of changes 
unrelated to this issue.

Thinking out loud, I _think_ I like the check for the rootDocId needing to 
be in the parents' bit filter. Was this an actual problem shown by the test? 
Could this happen "normally"? I suspect this could only happen if the user/app 
isn't following the rules of nested docs – like they updated a child doc by 
itself, or something like that. If I'm right, I think we should throw a helpful 
exception in this scenario to alert them they are doing something wrong.  Or 
perhaps this may happen if you do a query for docs that are not exclusively 
root docs, and you use the child transformer.  Yeah... but is that actually a 
problem?  I think we should be able to support that just fine.  In this case 
you want the child docs below the matching child doc being transformed, not 
its parents.  Can we explicitly test/support this?  I think we should.

I think we can avoid the addedChildDocs flag by checking if 
"pendingParentPathsToChildren" is empty; no? Can you add a test triggering this 
scenario? I'm aware the _example_ tests caught it but those are more sanity 
checks on our shipped configs and/or SolrJ interaction, and not meant to be the 
canonical tests for specific Solr features.
{quote}I am having trouble finding the test seed
{quote}
No prob. I think this is the easiest way: Click the "test results" link in the 
table. On the page that shows in Jenkins, the failed tests show at the top as 
links. Click the link for the failing test. On the page that shows, you'll see 
a stack trace. The stack trace will nearly always contain the seed near the top 
like so: 
{{__randomizedtesting.SeedInfo.seed([BC02A7658A1C547C:CFD8B8FF0604237A]:0)}}

> Support Deeply Nested Docs In Child Documents Transformer
> -
>
> Key: SOLR-12519
> URL: https://issues.apache.org/jira/browse/SOLR-12519
> Project: Solr
>  Issue Type: Sub-task
>  Security Level: Public(Default Security Level. Issues are Public) 
>Reporter: mosh
>Priority: Major
> Attachments: SOLR-12519-fix-solrj-tests.patch, 
> SOLR-12519-no-commit.patch, SOLR-12519.patch
>
>  Time Spent: 23h 10m
>  Remaining Estimate: 0h
>
> As discussed in SOLR-12298, to make use of the meta-data fields in 
> SOLR-12441, there needs to be a smarter child document transformer, which 
> provides the ability to rebuild the original nested documents' structure.
>  In addition, I also propose the transformer will also have the ability to 
> bring only some of the original hierarchy, to prevent unnecessary block join 
> queries. e.g.
> {code}  {"a": "b", "c": [ {"e": "f"}, {"e": "g"} , {"h": "i"} ]} {code}
>  In case my query is for all the children of "a:b", which contain the key "e" 
> in them, the query will be broken into two parts:
>  1. The parent query "a:b"
>  2. The child query "e:*".
> If the only children flag is on, the transformer will return the following 
> documents:
>  {code}[ {"e": "f"}, {"e": "g"} ]{code}
> In case the flag was not turned on (perhaps the default state), the whole 
> document hierarchy will be returned, containing only the matching children:
> {code}{"a": "b", "c": [ {"e": "f"}, {"e": "g"} ]}{code}
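The proposed "matching children only" behavior can be sketched as a simple filter over a nested document. This is a toy illustration with invented names (not the transformer's actual code), modeling the child query "e:*" as "the child contains the key e":

```java
import java.util.Arrays;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class ChildFilterSketch {
    // Return a copy of the parent doc whose "c" children are restricted to
    // those that contain the given key (our stand-in for the child query).
    static Map<String, Object> filterChildren(Map<String, Object> parent, String childKey) {
        Map<String, Object> out = new LinkedHashMap<>(parent);
        Object children = out.get("c");
        if (children instanceof List) {
            List<?> kept = ((List<?>) children).stream()
                    .filter(ch -> ch instanceof Map && ((Map<?, ?>) ch).containsKey(childKey))
                    .collect(Collectors.toList());
            out.put("c", kept);
        }
        return out;
    }

    public static void main(String[] args) {
        Map<String, Object> doc = new LinkedHashMap<>();
        doc.put("a", "b");
        doc.put("c", Arrays.asList(
                Map.of("e", "f"), Map.of("e", "g"), Map.of("h", "i")));
        System.out.println(filterChildren(doc, "e")); // prints {a=b, c=[{e=f}, {e=g}]}
    }
}
```

With the "only children" flag, only the kept list would be returned; without it, the enclosing hierarchy is preserved as shown.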






[jira] [Commented] (SOLR-12121) JWT Authentication plugin

2018-08-23 Thread JIRA


[ 
https://issues.apache.org/jira/browse/SOLR-12121?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16590261#comment-16590261
 ] 

Jan Høydahl commented on SOLR-12121:


So the next question then is whether and how the Principal will be passed on 
from the original {{HttpServletRequest}} so that it can be picked up from the 
{{process(HttpRequest request, HttpContext context)}} callback in the 
interceptor, in a different thread?

> JWT Authentication plugin
> -
>
> Key: SOLR-12121
> URL: https://issues.apache.org/jira/browse/SOLR-12121
> Project: Solr
>  Issue Type: New Feature
>  Security Level: Public(Default Security Level. Issues are Public) 
>  Components: Authentication
>Reporter: Jan Høydahl
>Assignee: Jan Høydahl
>Priority: Major
> Fix For: master (8.0), 7.5
>
>  Time Spent: 10m
>  Remaining Estimate: 0h
>
> A new Authentication plugin that will accept a [Json Web 
> Token|https://en.wikipedia.org/wiki/JSON_Web_Token] (JWT) in the 
> Authorization header and validate it by checking the cryptographic signature. 
> The plugin will not perform the authentication itself but assert that the 
> user was authenticated by the service that issued the JWT.
> JWT defines a number of standard claims; the user principal can be fetched 
> from the {{sub}} (subject) claim and passed on to Solr. The plugin will 
> always check the {{exp}} (expiry) claim and optionally enforce checks on the 
> {{iss}} (issuer) and {{aud}} (audience) claims.
> The first version of the plugin will only support RSA signing keys and will 
> support fetching the public key of the issuer through a [Json Web 
> Key|https://tools.ietf.org/html/rfc7517] (JWK) file, either from an https 
> URL or from a local file.
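As a rough illustration of the claim checks described above, here is a toy sketch (not the plugin's code; the class and method names are invented, the JSON handling is deliberately naive, and a real implementation must verify the RSA signature before trusting any claim):

```java
import java.nio.charset.StandardCharsets;
import java.time.Instant;
import java.util.Base64;

class JwtClaimsSketch {

  // Returns the Base64url-decoded payload (the second dot-separated segment).
  static String decodePayload(String jwt) {
    String[] parts = jwt.split("\\.");
    if (parts.length < 2) throw new IllegalArgumentException("Not a JWT");
    return new String(Base64.getUrlDecoder().decode(parts[1]), StandardCharsets.UTF_8);
  }

  // Naive lookup of a numeric claim in a flat JSON payload (a real plugin
  // would use a proper JSON/JWT library).
  static long numericClaim(String payloadJson, String name) {
    int i = payloadJson.indexOf("\"" + name + "\":");
    if (i < 0) throw new IllegalArgumentException("Missing claim: " + name);
    int start = i + name.length() + 3; // skip past "name":
    int end = start;
    while (end < payloadJson.length() && Character.isDigit(payloadJson.charAt(end))) {
      end++;
    }
    return Long.parseLong(payloadJson.substring(start, end));
  }

  // The mandatory expiry check: the token is valid only while now < exp.
  static boolean notExpired(String payloadJson, Instant now) {
    return now.getEpochSecond() < numericClaim(payloadJson, "exp");
  }
}
```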






[JENKINS-EA] Lucene-Solr-7.x-Linux (64bit/jdk-11-ea+25) - Build # 2615 - Still Unstable!

2018-08-23 Thread Policeman Jenkins Server
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-7.x-Linux/2615/
Java: 64bit/jdk-11-ea+25 -XX:-UseCompressedOops -XX:+UseParallelGC

18 tests failed.
FAILED:  
junit.framework.TestSuite.org.apache.solr.handler.component.CustomHighlightComponentTest

Error Message:
Could not find collection:collection82

Stack Trace:
java.lang.AssertionError: Could not find collection:collection82
at __randomizedtesting.SeedInfo.seed([CBE5F5CCE049EC6E]:0)
at org.junit.Assert.fail(Assert.java:93)
at org.junit.Assert.assertTrue(Assert.java:43)
at org.junit.Assert.assertNotNull(Assert.java:526)
at 
org.apache.solr.cloud.AbstractDistribZkTestBase.waitForRecoveriesToFinish(AbstractDistribZkTestBase.java:155)
at 
org.apache.solr.handler.component.CustomHighlightComponentTest.setupCluster(CustomHighlightComponentTest.java:125)
at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1737)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:874)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:890)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at java.base/java.lang.Thread.run(Thread.java:834)


FAILED:  
junit.framework.TestSuite.org.apache.solr.handler.component.CustomHighlightComponentTest

Error Message:
Could not find collection:collection82

Stack Trace:
java.lang.AssertionError: Could not find collection:collection82
at __randomizedtesting.SeedInfo.seed([CBE5F5CCE049EC6E]:0)
at org.junit.Assert.fail(Assert.java:93)
at org.junit.Assert.assertTrue(Assert.java:43)
at org.junit.Assert.assertNotNull(Assert.java:526)
at 
org.apache.solr.cloud.AbstractDistribZkTestBase.waitForRecoveriesToFinish(AbstractDistribZkTestBase.java:155)
at 
org.apache.solr.handler.component.CustomHighlightComponentTest.setupCluster(CustomHighlightComponentTest.java:125)
at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1737)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:874)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:890)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 

[jira] [Commented] (SOLR-12519) Support Deeply Nested Docs In Child Documents Transformer

2018-08-23 Thread mosh (JIRA)


[ 
https://issues.apache.org/jira/browse/SOLR-12519?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16590216#comment-16590216
 ] 

mosh commented on SOLR-12519:
-

{quote}
||Reason||Tests||
|Failed junit tests|solr.client.solrj.embedded.SolrExampleStreamingBinaryTest|
{quote}

I am having trouble finding the test seed, and without providing one this test 
seems to pass.
Forgive my lack of knowledge: where might I find the test seed that was used 
in the failed test run?

> Support Deeply Nested Docs In Child Documents Transformer
> -
>
> Key: SOLR-12519
> URL: https://issues.apache.org/jira/browse/SOLR-12519
> Project: Solr
>  Issue Type: Sub-task
>  Security Level: Public(Default Security Level. Issues are Public) 
>Reporter: mosh
>Priority: Major
> Attachments: SOLR-12519-fix-solrj-tests.patch, 
> SOLR-12519-no-commit.patch, SOLR-12519.patch
>
>  Time Spent: 23h 10m
>  Remaining Estimate: 0h
>
> As discussed in SOLR-12298, to make use of the meta-data fields in 
> SOLR-12441, there needs to be a smarter child document transformer, which 
> provides the ability to rebuild the original nested documents' structure.
>  In addition, I propose the transformer also have the ability to return 
> only part of the original hierarchy, to prevent unnecessary block join 
> queries, e.g.
> {code}  {"a": "b", "c": [ {"e": "f"}, {"e": "g"} , {"h": "i"} ]} {code}
>  In case my query is for all the children of "a:b" which contain the key 
> "e", the query will be broken into two parts:
>  1. The parent query "a:b"
>  2. The child query "e:*".
> If the only children flag is on, the transformer will return the following 
> documents:
>  {code}[ {"e": "f"}, {"e": "g"} ]{code}
> In case the flag was not turned on (perhaps the default state), the whole 
> document hierarchy will be returned, containing only the matching children:
> {code}{"a": "b", "c": [ {"e": "f"}, {"e": "g"} ]}{code}
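The "only matching children" behaviour described above can be sketched with plain maps (a toy model, not the transformer's real API; the class and method names are invented and the "child query" is reduced to key presence):

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

class ChildFilterSketch {

  // Keep only the children (stored under "c") that contain the given key,
  // leaving the rest of the parent document untouched.
  @SuppressWarnings("unchecked")
  static Map<String, Object> keepMatchingChildren(Map<String, Object> parent, String childKey) {
    Map<String, Object> out = new LinkedHashMap<>(parent);
    List<Map<String, Object>> children = (List<Map<String, Object>>) out.get("c");
    if (children != null) {
      out.put("c", children.stream()
          .filter(child -> child.containsKey(childKey))
          .collect(Collectors.toList()));
    }
    return out;
  }
}
```

Run against the example document above, this keeps the two children containing "e" and drops {"h": "i"}.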






[jira] [Commented] (SOLR-12590) Improve Solr resource loader coverage in the ref guide

2018-08-23 Thread Cassandra Targett (JIRA)


[ 
https://issues.apache.org/jira/browse/SOLR-12590?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16590208#comment-16590208
 ] 

Cassandra Targett commented on SOLR-12590:
--

I took a look at the new patch - +1 overall. IMO it's ready to commit whenever 
you're ready. Thanks.

> Improve Solr resource loader coverage in the ref guide
> --
>
> Key: SOLR-12590
> URL: https://issues.apache.org/jira/browse/SOLR-12590
> Project: Solr
>  Issue Type: Task
>  Security Level: Public(Default Security Level. Issues are Public) 
>  Components: documentation
>Reporter: Steve Rowe
>Assignee: Steve Rowe
>Priority: Major
> Attachments: SOLR-12590.patch, SOLR-12590.patch
>
>
> In SolrCloud, storing large resources (e.g. binary machine learned models) on 
> the local filesystem should be a viable alternative to increasing ZooKeeper's 
> max file size limit (1MB), but there are undocumented complications.






[jira] [Commented] (SOLR-12519) Support Deeply Nested Docs In Child Documents Transformer

2018-08-23 Thread Lucene/Solr QA (JIRA)


[ 
https://issues.apache.org/jira/browse/SOLR-12519?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16590156#comment-16590156
 ] 

Lucene/Solr QA commented on SOLR-12519:
---

| (x) *{color:red}-1 overall{color}* |
\\
\\
|| Vote || Subsystem || Runtime || Comment ||
|| || || || {color:brown} Prechecks {color} ||
| {color:green}+1{color} | {color:green} test4tests {color} | {color:green}  0m 
 0s{color} | {color:green} The patch appears to include 22 new or modified test 
files. {color} |
|| || || || {color:brown} master Compile Tests {color} ||
| {color:green}+1{color} | {color:green} compile {color} | {color:green}  2m 
16s{color} | {color:green} master passed {color} |
|| || || || {color:brown} Patch Compile Tests {color} ||
| {color:green}+1{color} | {color:green} compile {color} | {color:green}  2m  
3s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} javac {color} | {color:green}  2m  
3s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} Release audit (RAT) {color} | 
{color:green}  0m 36s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} Check forbidden APIs {color} | 
{color:green}  0m 17s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} Validate source patterns {color} | 
{color:green}  0m 17s{color} | {color:green} the patch passed {color} |
|| || || || {color:brown} Other Tests {color} ||
| {color:green}+1{color} | {color:green} unit {color} | {color:green}  0m 
53s{color} | {color:green} facet in the patch passed. {color} |
| {color:red}-1{color} | {color:red} unit {color} | {color:red}  0m  5s{color} 
| {color:red} tools in the patch failed. {color} |
| {color:green}+1{color} | {color:green} unit {color} | {color:green}  0m 
46s{color} | {color:green} dataimporthandler in the patch passed. {color} |
| {color:green}+1{color} | {color:green} unit {color} | {color:green}  0m 
28s{color} | {color:green} dataimporthandler-extras in the patch passed. 
{color} |
| {color:green}+1{color} | {color:green} unit {color} | {color:green} 50m 
31s{color} | {color:green} core in the patch passed. {color} |
| {color:red}-1{color} | {color:red} unit {color} | {color:red}  2m 36s{color} 
| {color:red} solrj in the patch failed. {color} |
| {color:black}{color} | {color:black} {color} | {color:black} 62m 58s{color} | 
{color:black} {color} |
\\
\\
|| Reason || Tests ||
| Failed junit tests | 
solr.client.solrj.embedded.SolrExampleStreamingBinaryTest |
\\
\\
|| Subsystem || Report/Notes ||
| JIRA Issue | SOLR-12519 |
| JIRA Patch URL | 
https://issues.apache.org/jira/secure/attachment/12936767/SOLR-12519-fix-solrj-tests.patch
 |
| Optional Tests |  compile  javac  unit  ratsources  checkforbiddenapis  
validatesourcepatterns  |
| uname | Linux lucene1-us-west 4.4.0-130-generic #156~14.04.1-Ubuntu SMP Thu 
Jun 14 13:51:47 UTC 2018 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | ant |
| Personality | 
/home/jenkins/jenkins-slave/workspace/PreCommit-SOLR-Build/sourcedir/dev-tools/test-patch/lucene-solr-yetus-personality.sh
 |
| git revision | master / 8cde127 |
| ant | version: Apache Ant(TM) version 1.9.3 compiled on July 24 2018 |
| Default Java | 1.8.0_172 |
| unit | 
https://builds.apache.org/job/PreCommit-SOLR-Build/170/artifact/out/patch-unit-lucene_tools.txt
 |
| unit | 
https://builds.apache.org/job/PreCommit-SOLR-Build/170/artifact/out/patch-unit-solr_solrj.txt
 |
|  Test Results | 
https://builds.apache.org/job/PreCommit-SOLR-Build/170/testReport/ |
| modules | C: lucene/facet lucene/tools solr solr/contrib/dataimporthandler 
solr/contrib/dataimporthandler-extras solr/core solr/solrj U: . |
| Console output | 
https://builds.apache.org/job/PreCommit-SOLR-Build/170/console |
| Powered by | Apache Yetus 0.7.0   http://yetus.apache.org |


This message was automatically generated.



> Support Deeply Nested Docs In Child Documents Transformer
> -
>
> Key: SOLR-12519
> URL: https://issues.apache.org/jira/browse/SOLR-12519
> Project: Solr
>  Issue Type: Sub-task
>  Security Level: Public(Default Security Level. Issues are Public) 
>Reporter: mosh
>Priority: Major
> Attachments: SOLR-12519-fix-solrj-tests.patch, 
> SOLR-12519-no-commit.patch, SOLR-12519.patch
>
>  Time Spent: 23h 10m
>  Remaining Estimate: 0h
>
> As discussed in SOLR-12298, to make use of the meta-data fields in 
> SOLR-12441, there needs to be a smarter child document transformer, which 
> provides the ability to rebuild the original nested documents' structure.
>  In addition, I propose the transformer also have the ability to return 
> only part of the original hierarchy, to prevent unnecessary block join 
> queries, e.g.
> {code}  {"a": "b", "c": [ {"e": 

[JENKINS-EA] Lucene-Solr-master-Linux (64bit/jdk-11-ea+25) - Build # 22729 - Unstable!

2018-08-23 Thread Policeman Jenkins Server
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-master-Linux/22729/
Java: 64bit/jdk-11-ea+25 -XX:-UseCompressedOops -XX:+UseSerialGC

13 tests failed.
FAILED:  
org.apache.solr.cloud.api.collections.ShardSplitTest.testSplitMixedReplicaTypesLink

Error Message:
unexpected shard state expected: but was:

Stack Trace:
java.lang.AssertionError: unexpected shard state expected: but 
was:
at 
__randomizedtesting.SeedInfo.seed([2FBBAEF51BA1:13D567ACBE796755]:0)
at org.junit.Assert.fail(Assert.java:93)
at org.junit.Assert.failNotEquals(Assert.java:647)
at org.junit.Assert.assertEquals(Assert.java:128)
at 
org.apache.solr.cloud.api.collections.ShardSplitTest.verifyShard(ShardSplitTest.java:372)
at 
org.apache.solr.cloud.api.collections.ShardSplitTest.doSplitMixedReplicaTypes(ShardSplitTest.java:364)
at 
org.apache.solr.cloud.api.collections.ShardSplitTest.testSplitMixedReplicaTypesLink(ShardSplitTest.java:336)
at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1737)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:934)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:970)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:984)
at 
org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsFixedStatement.callStatement(BaseDistributedSearchTestCase.java:1008)
at 
org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsStatement.evaluate(BaseDistributedSearchTestCase.java:983)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:943)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:829)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:879)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:890)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 

[JENKINS] Lucene-Solr-Tests-7.x - Build # 811 - Unstable

2018-08-23 Thread Apache Jenkins Server
Build: https://builds.apache.org/job/Lucene-Solr-Tests-7.x/811/

1 tests failed.
FAILED:  
org.apache.solr.cloud.autoscaling.IndexSizeTriggerTest.testSplitIntegration

Error Message:
events: [CapturedEvent{timestamp=17493394761634003, stage=STARTED, 
actionName='null', event={   "id":"3e262462cc9a01T9g9kp5o4ft6co36vge365dc49",   
"source":"index_size_trigger",   "eventTime":17493386274314753,   
"eventType":"INDEXSIZE",   "properties":{ "__start__":1, 
"aboveSize":{"testSplitIntegration_collection":["{\"core_node1\":{\n
\"core\":\"testSplitIntegration_collection_shard1_replica_n1\",\n
\"shard\":\"shard1\",\n
\"collection\":\"testSplitIntegration_collection\",\n
\"node_name\":\"127.0.0.1:10007_solr\",\n\"type\":\"NRT\",\n
\"leader\":\"true\",\n\"SEARCHER.searcher.maxDoc\":14,\n
\"SEARCHER.searcher.deletedDocs\":0,\n\"INDEX.sizeInBytes\":17240,\n
\"SEARCHER.searcher.numDocs\":14,\n\"__bytes__\":17240,\n
\"__docs__\":14,\n\"violationType\":\"aboveDocs\",\n
\"state\":\"active\",\n\"INDEX.sizeInGB\":1.605600118637085E-5}}"]}, 
"belowSize":{}, "_enqueue_time_":17493392418913303, 
"requestedOps":["Op{action=SPLITSHARD, hints={COLL_SHARD=[{\n  
\"first\":\"testSplitIntegration_collection\",\n  
\"second\":\"shard1\"}]}}"]}}, context={}, config={   
"trigger":"index_size_trigger",   "stage":[ "STARTED", "ABORTED", 
"SUCCEEDED", "FAILED"],   "beforeAction":[ "compute_plan", 
"execute_plan"],   "afterAction":[ "compute_plan", "execute_plan"],   
"class":"org.apache.solr.cloud.autoscaling.IndexSizeTriggerTest$CapturingTriggerListener"},
 message='null'}, CapturedEvent{timestamp=17493395423000103, 
stage=BEFORE_ACTION, actionName='compute_plan', event={   
"id":"3e262462cc9a01T9g9kp5o4ft6co36vge365dc49",   
"source":"index_size_trigger",   "eventTime":17493386274314753,   
"eventType":"INDEXSIZE",   "properties":{ "__start__":1, 
"aboveSize":{"testSplitIntegration_collection":["{\"core_node1\":{\n
\"core\":\"testSplitIntegration_collection_shard1_replica_n1\",\n
\"shard\":\"shard1\",\n
\"collection\":\"testSplitIntegration_collection\",\n
\"node_name\":\"127.0.0.1:10007_solr\",\n\"type\":\"NRT\",\n
\"leader\":\"true\",\n\"SEARCHER.searcher.maxDoc\":14,\n
\"SEARCHER.searcher.deletedDocs\":0,\n\"INDEX.sizeInBytes\":17240,\n
\"SEARCHER.searcher.numDocs\":14,\n\"__bytes__\":17240,\n
\"__docs__\":14,\n\"violationType\":\"aboveDocs\",\n
\"state\":\"active\",\n\"INDEX.sizeInGB\":1.605600118637085E-5}}"]}, 
"belowSize":{}, "_enqueue_time_":17493392418913303, 
"requestedOps":["Op{action=SPLITSHARD, hints={COLL_SHARD=[{\n  
\"first\":\"testSplitIntegration_collection\",\n  
\"second\":\"shard1\"}]}}"]}}, 
context={properties.BEFORE_ACTION=[compute_plan], source=index_size_trigger}, 
config={   "trigger":"index_size_trigger",   "stage":[ "STARTED", 
"ABORTED", "SUCCEEDED", "FAILED"],   "beforeAction":[ 
"compute_plan", "execute_plan"],   "afterAction":[ "compute_plan", 
"execute_plan"],   
"class":"org.apache.solr.cloud.autoscaling.IndexSizeTriggerTest$CapturingTriggerListener"},
 message='null'}, CapturedEvent{timestamp=17493395524960753, 
stage=AFTER_ACTION, actionName='compute_plan', event={   
"id":"3e262462cc9a01T9g9kp5o4ft6co36vge365dc49",   
"source":"index_size_trigger",   "eventTime":17493386274314753,   
"eventType":"INDEXSIZE",   "properties":{ "__start__":1, 
"aboveSize":{"testSplitIntegration_collection":["{\"core_node1\":{\n
\"core\":\"testSplitIntegration_collection_shard1_replica_n1\",\n
\"shard\":\"shard1\",\n
\"collection\":\"testSplitIntegration_collection\",\n
\"node_name\":\"127.0.0.1:10007_solr\",\n\"type\":\"NRT\",\n
\"leader\":\"true\",\n\"SEARCHER.searcher.maxDoc\":14,\n
\"SEARCHER.searcher.deletedDocs\":0,\n\"INDEX.sizeInBytes\":17240,\n
\"SEARCHER.searcher.numDocs\":14,\n\"__bytes__\":17240,\n
\"__docs__\":14,\n\"violationType\":\"aboveDocs\",\n
\"state\":\"active\",\n\"INDEX.sizeInGB\":1.605600118637085E-5}}"]}, 
"belowSize":{}, "_enqueue_time_":17493392418913303, 
"requestedOps":["Op{action=SPLITSHARD, hints={COLL_SHARD=[{\n  
\"first\":\"testSplitIntegration_collection\",\n  
\"second\":\"shard1\"}]}}"]}}, 
context={properties.operations=[{class=org.apache.solr.client.solrj.request.CollectionAdminRequest$SplitShard,
 method=GET, params.action=SPLITSHARD, 
params.collection=testSplitIntegration_collection, params.shard=shard1}], 
properties.BEFORE_ACTION=[compute_plan], source=index_size_trigger, 
properties.AFTER_ACTION=[compute_plan]}, config={   
"trigger":"index_size_trigger",   "stage":[ "STARTED", "ABORTED", 
"SUCCEEDED", "FAILED"],   "beforeAction":[ "compute_plan", 
"execute_plan"],   "afterAction":[ "compute_plan", "execute_plan"],   

[JENKINS] Lucene-Solr-7.x-Linux (32bit/jdk1.8.0_172) - Build # 2614 - Still Unstable!

2018-08-23 Thread Policeman Jenkins Server
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-7.x-Linux/2614/
Java: 32bit/jdk1.8.0_172 -server -XX:+UseSerialGC

1 tests failed.
FAILED:  org.apache.solr.cloud.MoveReplicaHDFSTest.testFailedMove

Error Message:
No live SolrServers available to handle this 
request:[https://127.0.0.1:42079/solr/MoveReplicaHDFSTest_failed_coll_true, 
https://127.0.0.1:35745/solr/MoveReplicaHDFSTest_failed_coll_true]

Stack Trace:
org.apache.solr.client.solrj.SolrServerException: No live SolrServers available 
to handle this 
request:[https://127.0.0.1:42079/solr/MoveReplicaHDFSTest_failed_coll_true, 
https://127.0.0.1:35745/solr/MoveReplicaHDFSTest_failed_coll_true]
at 
__randomizedtesting.SeedInfo.seed([81460426B93DD857:2B8BD7D40EEE0D87]:0)
at 
org.apache.solr.client.solrj.impl.LBHttpSolrClient.request(LBHttpSolrClient.java:462)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.sendRequest(CloudSolrClient.java:1109)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:886)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:996)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.request(CloudSolrClient.java:819)
at 
org.apache.solr.client.solrj.SolrRequest.process(SolrRequest.java:194)
at org.apache.solr.client.solrj.SolrClient.query(SolrClient.java:942)
at 
org.apache.solr.cloud.MoveReplicaTest.testFailedMove(MoveReplicaTest.java:289)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1737)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:934)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:970)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:984)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:943)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:829)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:879)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:890)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 

[jira] [Commented] (SOLR-12572) Reuse fieldvalues computed while sorting at writing in ExportWriter

2018-08-23 Thread Amrit Sarkar (JIRA)


[ 
https://issues.apache.org/jira/browse/SOLR-12572?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16590013#comment-16590013
 ] 

Amrit Sarkar commented on SOLR-12572:
-

I did some benchmarks comparing export performance between version {{7.4}}, the 
{{master}} branch, and {{master + SOLR-12572 patch}}: 
https://docs.google.com/spreadsheets/d/1ZBCOybUPr9UVG0GCAJQTPYnKhohKirIlXduynFf2Whs/edit?usp=sharing.

We can see decent performance improvements, which will benefit streaming 
expressions as well.

> Reuse fieldvalues computed while sorting at writing in ExportWriter
> ---
>
> Key: SOLR-12572
> URL: https://issues.apache.org/jira/browse/SOLR-12572
> Project: Solr
>  Issue Type: Improvement
>  Security Level: Public(Default Security Level. Issues are Public) 
>  Components: streaming expressions
>Reporter: Amrit Sarkar
>Assignee: Varun Thacker
>Priority: Minor
> Attachments: SOLR-12572.patch, SOLR-12572.patch, SOLR-12572.patch, 
> SOLR-12572.patch, SOLR-12572.patch, SOLR-12572.patch, SOLR-12572.patch
>
>
> While exporting result through "/export" handler,
> {code:java}
> http://localhost:8983/solr/core_name/export?q=my-query=severity+desc,timestamp+desc=severity,timestamp,msg
> {code}
> Doc-values are sought for all the {{sort}} fields defined (in this example 
> 'severity', 'timestamp'). When we stream out docs we again make doc-value 
> seeks against the {{fl}} fields ('severity', 'timestamp', 'msg').
> In most common use-cases we have {{fl}} = {{sort}} fields, or at least the 
> sort fields are a subset of the {{fl}} fields, so if we can *pre-collect* 
> the values while sorting, we can reduce the doc-value seeks, potentially 
> bringing a *speed improvement*.
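The pre-collect idea can be shown schematically (an invented sketch, not the ExportWriter code; doc-value seeks are faked with a map lookup and a counter): values read during the sort pass are cached per document and reused at write time, so only {{fl}} fields that are not sort fields trigger a second lookup.

```java
import java.util.HashMap;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

class ExportReuseSketch {

  static int seeks = 0; // counts simulated doc-value seeks

  // Stand-in for a doc-value seek; here just a map lookup.
  static Object docValueSeek(Map<String, Object> doc, String field) {
    seeks++;
    return doc.get(field);
  }

  // The sort pass seeks and caches the sort-field values; the write pass
  // reuses the cache and only seeks fl fields that were not sort fields.
  static Map<String, Object> export(Map<String, Object> doc,
                                    List<String> sortFields, List<String> flFields) {
    Map<String, Object> cached = new HashMap<>();
    for (String f : sortFields) {
      cached.put(f, docValueSeek(doc, f));
    }
    Map<String, Object> row = new LinkedHashMap<>();
    for (String f : flFields) {
      row.put(f, cached.containsKey(f) ? cached.get(f) : docValueSeek(doc, f));
    }
    return row;
  }
}
```

With sort = severity,timestamp and fl = severity,timestamp,msg this performs 3 seeks per document instead of 5.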






[jira] [Created] (LUCENE-8464) Implement ConstantScoreScorer#setMinCompetitiveScore

2018-08-23 Thread Adrien Grand (JIRA)
Adrien Grand created LUCENE-8464:


 Summary: Implement ConstantScoreScorer#setMinCompetitiveScore
 Key: LUCENE-8464
 URL: https://issues.apache.org/jira/browse/LUCENE-8464
 Project: Lucene - Core
  Issue Type: Improvement
Reporter: Adrien Grand


We should make it so the iterator returns NO_MORE_DOCS after 
setMinCompetitiveScore is called with a value that is greater than the constant 
score.
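A minimal model of the proposed behaviour (an invented class, not the real Lucene Scorer API): once the minimum competitive score exceeds the constant score, no document this scorer produces can be competitive, so iteration can end immediately.

```java
class ConstantScoreSketch {

  static final int NO_MORE_DOCS = Integer.MAX_VALUE;

  private final float score;  // the constant score of every match
  private final int[] docs;   // matching doc ids, in order
  private int idx = -1;
  private boolean exhausted = false;

  ConstantScoreSketch(float score, int[] docs) {
    this.score = score;
    this.docs = docs;
  }

  void setMinCompetitiveScore(float min) {
    if (min > score) {
      exhausted = true; // nothing this scorer produces can compete any more
    }
  }

  int nextDoc() {
    if (exhausted || ++idx >= docs.length) {
      return NO_MORE_DOCS;
    }
    return docs[idx];
  }
}
```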






[jira] [Created] (LUCENE-8463) Early-terminate queries sorted by SortField.DOC

2018-08-23 Thread Adrien Grand (JIRA)
Adrien Grand created LUCENE-8463:


 Summary: Early-terminate queries sorted by SortField.DOC
 Key: LUCENE-8463
 URL: https://issues.apache.org/jira/browse/LUCENE-8463
 Project: Lucene - Core
  Issue Type: Improvement
Reporter: Adrien Grand


Currently TopFieldCollector only early-terminates when the search sort is a 
prefix of the index sort, but it could also early-terminate when sorting by doc 
id.
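A toy model of the idea above (not the TopFieldCollector API; the class is invented): segments deliver documents in increasing doc id order, so when the requested sort is by doc id the first numHits documents collected are already the top hits and collection can stop early.

```java
import java.util.ArrayList;
import java.util.List;

class DocSortCollectorSketch {

  final int numHits;
  final List<Integer> hits = new ArrayList<>();
  boolean terminated = false;

  DocSortCollectorSketch(int numHits) {
    this.numHits = numHits;
  }

  // Returns false once enough hits are collected: the early-termination
  // signal, since later docs have larger ids and cannot rank higher.
  boolean collect(int docId) {
    if (terminated) return false;
    hits.add(docId);
    if (hits.size() >= numHits) terminated = true;
    return !terminated;
  }
}
```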






[GitHub] lucene-solr pull request #440: Lucene 6989 v2

2018-08-23 Thread firewolf2010
Github user firewolf2010 closed the pull request at:

https://github.com/apache/lucene-solr/pull/440





[GitHub] lucene-solr pull request #440: Lucene 6989 v2

2018-08-23 Thread firewolf2010
GitHub user firewolf2010 opened a pull request:

https://github.com/apache/lucene-solr/pull/440

Lucene 6989 v2



You can merge this pull request into a Git repository by running:

$ git pull https://github.com/apache/lucene-solr LUCENE-6989-v2

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/lucene-solr/pull/440.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #440


commit 8a2c71d747456bbef30775f8c966e4c02c48f5bc
Author: Uwe Schindler 
Date:   2016-12-14T17:52:38Z

LUCENE-6989: Unmapping byte buffers: Preview version of the Java 9 b148++ 
patch

commit 71d3a5039dbcca6066cc7f6e3ac2f1564afd12b0
Author: Uwe Schindler 
Date:   2016-12-14T19:17:23Z

LUCENE-6989: Fix annotation of hack

commit b8fe1ff83fa1243172b2812bd976a6011f671914
Author: Uwe Schindler 
Date:   2016-12-14T19:22:42Z

LUCENE-6989: Rename variable

commit ffc957fdb3c21d110ab23392ed91e74cfc1f169d
Author: Uwe Schindler 
Date:   2016-12-16T21:09:54Z

LUCENE-6989: Refactor code and add documentation

commit 64c6f359949b62fe981255516ba2286c0adcc190
Author: Uwe Schindler 
Date:   2016-12-16T21:38:30Z

LUCENE-6989: Comments and final cleanup







[jira] [Commented] (SOLR-12692) Add hints/warnings for the ZK Status Admin UI

2018-08-23 Thread JIRA


[ 
https://issues.apache.org/jira/browse/SOLR-12692?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16589887#comment-16589887
 ] 

Jan Høydahl commented on SOLR-12692:


All good suggestions. I don't know if you played around with the other 
warnings, but you can test e.g. by specifying only two of the ZK nodes in 
{{-z}} when starting Solr - you will then see complaints about quorum size, 
mismatches, etc. I guess you could extend those checks to also look at the 
other values you suggest.

> Add hints/warnings for the ZK Status Admin UI
> -
>
> Key: SOLR-12692
> URL: https://issues.apache.org/jira/browse/SOLR-12692
> Project: Solr
>  Issue Type: Improvement
>  Security Level: Public(Default Security Level. Issues are Public) 
>  Components: Admin UI
>Reporter: Varun Thacker
>Priority: Minor
> Attachments: zk_ensemble.png
>
>
> Firstly, I love the new UI pages (ZK Status and Nodes). Thanks [~janhoy] 
> for all the great work!
> I set up a 3-node ZK ensemble to play around with the UI, and I'm attaching 
> a screenshot for reference.
>  
> Here are a few suggestions I had:
>  # Let’s show Approximate Size in human-readable form. We can use 
> RamUsageEstimator#humanReadableUnits to calculate it
>  # Show a warning symbol when the ensemble is standalone
>  # If maxSessionTimeout < Solr's ZK_CLIENT_TIMEOUT, then ZK will only honor 
> up to the maxSessionTimeout value for the Solr->ZK connection. We could mark 
> that as a warning.
>  # If maxClientCnxns < live_nodes, show this in red? Each Solr node connects 
> to all ZK nodes, so if the number of nodes in the cluster is high, one 
> should also increase maxClientCnxns.
>  
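For suggestion #1, a rough, self-contained sketch of what human-readable formatting could look like (similar in spirit to RamUsageEstimator#humanReadableUnits, but not that implementation):

```java
// Self-contained sketch: format a byte count in human-readable units.
// Illustrative only; not Lucene's RamUsageEstimator#humanReadableUnits.
public class HumanBytes {
    static String humanReadable(long bytes) {
        if (bytes < 1024) return bytes + " bytes";
        String[] units = {"KB", "MB", "GB", "TB"};
        double v = bytes;
        int u = -1;
        while (v >= 1024 && u < units.length - 1) {
            v /= 1024;   // step up one unit per division
            u++;
        }
        return String.format(java.util.Locale.ROOT, "%.1f %s", v, units[u]);
    }

    public static void main(String[] args) {
        System.out.println(humanReadable(123L));           // 123 bytes
        System.out.println(humanReadable(1536L));          // 1.5 KB
        System.out.println(humanReadable(3221225472L));    // 3.0 GB
    }
}
```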






[JENKINS] Lucene-Solr-NightlyTests-7.x - Build # 299 - Unstable

2018-08-23 Thread Apache Jenkins Server
Build: https://builds.apache.org/job/Lucene-Solr-NightlyTests-7.x/299/

51 tests failed.
FAILED:  
org.apache.solr.cloud.autoscaling.SearchRateTriggerIntegrationTest.testBelowSearchRate

Error Message:
[Op{action=DELETEREPLICA, hints={COLL_SHARD=[{   
"first":"belowRate_collection",   "second":"shard1"}], REPLICA=[core_node4]}}, 
Op{action=DELETEREPLICA, hints={COLL_SHARD=[{   "first":"belowRate_collection", 
  "second":"shard1"}], REPLICA=[core_node6]}}, Op{action=DELETEREPLICA, 
hints={COLL_SHARD=[{   "first":"belowRate_collection",   "second":"shard1"}], 
REPLICA=[core_node10]}}, Op{action=NONE, 
hints={SRC_NODE=[127.0.0.1:38695_solr]}}, Op{action=NONE, 
hints={SRC_NODE=[127.0.0.1:38483_solr]}}, Op{action=NONE, 
hints={SRC_NODE=[127.0.0.1:38297_solr]}}, Op{action=NONE, 
hints={SRC_NODE=[127.0.0.1:40322_solr]}}, Op{action=NONE, 
hints={SRC_NODE=[127.0.0.1:34750_solr]}}] expected:<7> but was:<8>

Stack Trace:
java.lang.AssertionError: [Op{action=DELETEREPLICA, hints={COLL_SHARD=[{
  "first":"belowRate_collection",
  "second":"shard1"}], REPLICA=[core_node4]}}, Op{action=DELETEREPLICA, 
hints={COLL_SHARD=[{
  "first":"belowRate_collection",
  "second":"shard1"}], REPLICA=[core_node6]}}, Op{action=DELETEREPLICA, 
hints={COLL_SHARD=[{
  "first":"belowRate_collection",
  "second":"shard1"}], REPLICA=[core_node10]}}, Op{action=NONE, 
hints={SRC_NODE=[127.0.0.1:38695_solr]}}, Op{action=NONE, 
hints={SRC_NODE=[127.0.0.1:38483_solr]}}, Op{action=NONE, 
hints={SRC_NODE=[127.0.0.1:38297_solr]}}, Op{action=NONE, 
hints={SRC_NODE=[127.0.0.1:40322_solr]}}, Op{action=NONE, 
hints={SRC_NODE=[127.0.0.1:34750_solr]}}] expected:<7> but was:<8>
at 
__randomizedtesting.SeedInfo.seed([52D521B628CD1D84:387887D670259897]:0)
at org.junit.Assert.fail(Assert.java:93)
at org.junit.Assert.failNotEquals(Assert.java:647)
at org.junit.Assert.assertEquals(Assert.java:128)
at org.junit.Assert.assertEquals(Assert.java:472)
at 
org.apache.solr.cloud.autoscaling.SearchRateTriggerIntegrationTest.testBelowSearchRate(SearchRateTriggerIntegrationTest.java:398)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1737)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:934)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:970)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:984)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:943)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:829)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:879)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:890)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
at 

[jira] [Commented] (SOLR-12685) RTG should return the whole block if schema is nested

2018-08-23 Thread mosh (JIRA)


[ 
https://issues.apache.org/jira/browse/SOLR-12685?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16589802#comment-16589802
 ] 

mosh commented on SOLR-12685:
-

{quote}in RTG we need the realtime searcher{quote}
Oh right I guess I sort of missed that part...
{quote} I'm also dubious you needed to change the method signature in RTG to 
take SolrQueryRequest; we'll see.{quote}
I guess if there is no need for req#getSearcher, I do not foresee a need to 
pass req as a parameter.

> RTG should return the whole block if schema is nested
> -
>
> Key: SOLR-12685
> URL: https://issues.apache.org/jira/browse/SOLR-12685
> Project: Solr
>  Issue Type: Sub-task
>  Security Level: Public(Default Security Level. Issues are Public) 
>Reporter: mosh
>Priority: Major
> Attachments: SOLR-12638-no-commit.patch
>
>
> Currently, Solr's RealTimeGet component returns the document if provided a 
> docId when consulting the index. For atomic updates of child documents, RTG 
> should return the whole block when dealing with a nested schema.






[jira] [Comment Edited] (SOLR-12685) RTG should return the whole block if schema is nested

2018-08-23 Thread mosh (JIRA)


[ 
https://issues.apache.org/jira/browse/SOLR-12685?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16589802#comment-16589802
 ] 

mosh edited comment on SOLR-12685 at 8/23/18 7:04 AM:
--

{quote}in RTG we need the realtime searcher
{quote}
Oh right I guess I sort of missed that part...
{quote}I'm also dubious you needed to change the method signature in RTG to 
take SolrQueryRequest; we'll see.
{quote}
I guess if there is no need for req.getSearcher(), I do not foresee a need to 
pass req as a parameter.


was (Author: moshebla):
{quote}in RTG we need the realtime searcher{quote}
Oh right I guess I sort of missed that part...
{quote} I'm also dubious you needed to change the method signature in RTG to 
take SolrQueryRequest; we'll see.{quote}
I guess if there is no need for req#getSearcher, I do not foresee a need to 
pass req as a parameter.

> RTG should return the whole block if schema is nested
> -
>
> Key: SOLR-12685
> URL: https://issues.apache.org/jira/browse/SOLR-12685
> Project: Solr
>  Issue Type: Sub-task
>  Security Level: Public(Default Security Level. Issues are Public) 
>Reporter: mosh
>Priority: Major
> Attachments: SOLR-12638-no-commit.patch
>
>
> Currently, Solr's RealTimeGet component returns the document if provided a 
> docId when consulting the index. For atomic updates of child documents, RTG 
> should return the whole block when dealing with a nested schema.






[JENKINS] Lucene-Solr-7.x-Linux (64bit/jdk-10) - Build # 2613 - Unstable!

2018-08-23 Thread Policeman Jenkins Server
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-7.x-Linux/2613/
Java: 64bit/jdk-10 -XX:+UseCompressedOops -XX:+UseG1GC

1 tests failed.
FAILED:  junit.framework.TestSuite.org.apache.solr.cloud.LeaderTragicEventTest

Error Message:
ObjectTracker found 1 object(s) that were not released!!! 
[MockDirectoryWrapper] 
org.apache.solr.common.util.ObjectReleaseTracker$ObjectTrackerException: 
org.apache.lucene.store.MockDirectoryWrapper  at 
org.apache.solr.common.util.ObjectReleaseTracker.track(ObjectReleaseTracker.java:42)
  at 
org.apache.solr.core.CachingDirectoryFactory.get(CachingDirectoryFactory.java:348)
  at org.apache.solr.update.SolrIndexWriter.create(SolrIndexWriter.java:95)  at 
org.apache.solr.core.SolrCore.initIndex(SolrCore.java:768)  at 
org.apache.solr.core.SolrCore.(SolrCore.java:960)  at 
org.apache.solr.core.SolrCore.(SolrCore.java:869)  at 
org.apache.solr.core.CoreContainer.createFromDescriptor(CoreContainer.java:1138)
  at org.apache.solr.core.CoreContainer.create(CoreContainer.java:1048)  at 
org.apache.solr.handler.admin.CoreAdminOperation.lambda$static$0(CoreAdminOperation.java:92)
  at 
org.apache.solr.handler.admin.CoreAdminOperation.execute(CoreAdminOperation.java:360)
  at 
org.apache.solr.handler.admin.CoreAdminHandler$CallInfo.call(CoreAdminHandler.java:395)
  at 
org.apache.solr.handler.admin.CoreAdminHandler.handleRequestBody(CoreAdminHandler.java:180)
  at 
org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:199)
  at org.apache.solr.servlet.HttpSolrCall.handleAdmin(HttpSolrCall.java:734)  
at 
org.apache.solr.servlet.HttpSolrCall.handleAdminRequest(HttpSolrCall.java:715)  
at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:496)  at 
org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:377)
  at 
org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:323)
  at 
org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1642)
  at 
org.apache.solr.client.solrj.embedded.JettySolrRunner$DebugFilter.doFilter(JettySolrRunner.java:139)
  at 
org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1642)
  at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:533) 
 at 
org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:255)
  at 
org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1595)
  at 
org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:255)
  at 
org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1317)
  at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:203)
  at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:473)  
at 
org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1564)
  at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:201)
  at 
org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1219)
  at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:144)  
at 
org.eclipse.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:674)  
at 
org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:132) 
 at org.eclipse.jetty.server.Server.handle(Server.java:531)  at 
org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:352)  at 
org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:260)  at 
org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:281)
  at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:102)  at 
org.eclipse.jetty.io.ssl.SslConnection.onFillable(SslConnection.java:291)  at 
org.eclipse.jetty.io.ssl.SslConnection$3.succeeded(SslConnection.java:151)  at 
org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:102)  at 
org.eclipse.jetty.io.ChannelEndPoint$2.run(ChannelEndPoint.java:118)  at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:333)
  at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:310)
  at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:168)
  at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:126)
  at 
org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:366)
  at 
org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:762)
  at 
org.eclipse.jetty.util.thread.QueuedThreadPool$2.run(QueuedThreadPool.java:680) 
 at java.base/java.lang.Thread.run(Thread.java:844)  

Stack Trace:
java.lang.AssertionError: ObjectTracker found 1 object(s) that were not 
released!!! [MockDirectoryWrapper]
org.apache.solr.common.util.ObjectReleaseTracker$ObjectTrackerException: 
org.apache.lucene.store.MockDirectoryWrapper
at 

[jira] [Comment Edited] (SOLR-12519) Support Deeply Nested Docs In Child Documents Transformer

2018-08-23 Thread mosh (JIRA)


[ 
https://issues.apache.org/jira/browse/SOLR-12519?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16589744#comment-16589744
 ] 

mosh edited comment on SOLR-12519 at 8/23/18 6:19 AM:
--

{quote}Can you investigate [~moshebla]?
{quote}
After further investigation it seems like those tests were failing because 
there was no check whether no children matched the childFilter or whether the 
parent matched the parentsFilter, causing an AssertionError. I added a couple 
more conditions to fix this in the patch file 
[SOLR-12519-fix-solrj-tests.patch|https://issues.apache.org/jira/secure/attachment/12936767/SOLR-12519-fix-solrj-tests.patch],
 which I just uploaded.


was (Author: moshebla):
{quote}Can you investigate [~moshebla]?{quote}

After further investigation it seems like those tests were failing because 
there was no check whether no children matched the childFilter or whether the 
parent matched the parentFilter, causing an assertionError. I added a couple 
more conditions to fix this in the new patch I uploaded.

> Support Deeply Nested Docs In Child Documents Transformer
> -
>
> Key: SOLR-12519
> URL: https://issues.apache.org/jira/browse/SOLR-12519
> Project: Solr
>  Issue Type: Sub-task
>  Security Level: Public(Default Security Level. Issues are Public) 
>Reporter: mosh
>Priority: Major
> Attachments: SOLR-12519-fix-solrj-tests.patch, 
> SOLR-12519-no-commit.patch, SOLR-12519.patch
>
>  Time Spent: 23h 10m
>  Remaining Estimate: 0h
>
> As discussed in SOLR-12298, to make use of the meta-data fields in 
> SOLR-12441, there needs to be a smarter child document transformer, which 
> provides the ability to rebuild the original nested documents' structure.
>  In addition, I propose that the transformer also have the ability to 
> return only some of the original hierarchy, to prevent unnecessary block 
> join queries, e.g.
> {code}  {"a": "b", "c": [ {"e": "f"}, {"e": "g"} , {"h": "i"} ]} {code}
>  In case my query is for all the children of "a:b" which contain the key 
> "e" in them, the query will be broken into two parts:
>  1. The parent query "a:b"
>  2. The child query "e:*".
> If the only-children flag is on, the transformer will return the following 
> documents:
>  {code}[ {"e": "f"}, {"e": "g"} ]{code}
> In case the flag was not turned on (perhaps the default state), the whole 
> document hierarchy will be returned, containing only the matching children:
> {code}{"a": "b", "c": [ {"e": "f"}, {"e": "g"} ]}{code}
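The "only matching children" behavior described above can be illustrated with a stand-alone sketch, using plain Java maps as stand-ins for Solr documents (this is not the actual child document transformer):

```java
import java.util.*;

// Illustrative sketch of filtering children by a child query ("e:*").
// Plain maps stand in for Solr documents; not the real transformer code.
public class ChildFilterSketch {
    // Return the children under "c" that contain the given key.
    @SuppressWarnings("unchecked")
    static List<Map<String, Object>> matchingChildren(Map<String, Object> doc, String key) {
        List<Map<String, Object>> out = new ArrayList<>();
        for (Map<String, Object> child : (List<Map<String, Object>>) doc.get("c")) {
            if (child.containsKey(key)) out.add(child);
        }
        return out;
    }

    public static void main(String[] args) {
        Map<String, Object> doc = new LinkedHashMap<>();
        doc.put("a", "b");
        doc.put("c", Arrays.asList(
                Map.of("e", "f"), Map.of("e", "g"), Map.of("h", "i")));
        // "Only children" flag on: return just the matching children.
        System.out.println(matchingChildren(doc, "e")); // [{e=f}, {e=g}]
    }
}
```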






[jira] [Comment Edited] (SOLR-12519) Support Deeply Nested Docs In Child Documents Transformer

2018-08-23 Thread mosh (JIRA)


[ 
https://issues.apache.org/jira/browse/SOLR-12519?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16589744#comment-16589744
 ] 

mosh edited comment on SOLR-12519 at 8/23/18 6:19 AM:
--

{quote}Can you investigate [~moshebla]?
{quote}
After further investigation it seems like those tests were failing because 
there was no check whether no children matched the childFilter or whether the 
parent matched the parentsFilter, causing an AssertionError.
I added a couple more conditions to fix this in the patch file 
[SOLR-12519-fix-solrj-tests.patch|https://issues.apache.org/jira/secure/attachment/12936767/SOLR-12519-fix-solrj-tests.patch],
 which I just uploaded.


was (Author: moshebla):
{quote}Can you investigate [~moshebla]?
{quote}
After further investigation it seems like those tests were failing because 
there was no check whether no children matched the childFilter or whether the 
parent matched the parentsFilter, causing an assertionError. I added a couple 
more conditions to fix this in the patch file 
[SOLR-12519-fix-solrj-tests.patch|https://issues.apache.org/jira/secure/attachment/12936767/SOLR-12519-fix-solrj-tests.patch],
 which I just uploaded.

> Support Deeply Nested Docs In Child Documents Transformer
> -
>
> Key: SOLR-12519
> URL: https://issues.apache.org/jira/browse/SOLR-12519
> Project: Solr
>  Issue Type: Sub-task
>  Security Level: Public(Default Security Level. Issues are Public) 
>Reporter: mosh
>Priority: Major
> Attachments: SOLR-12519-fix-solrj-tests.patch, 
> SOLR-12519-no-commit.patch, SOLR-12519.patch
>
>  Time Spent: 23h 10m
>  Remaining Estimate: 0h
>
> As discussed in SOLR-12298, to make use of the meta-data fields in 
> SOLR-12441, there needs to be a smarter child document transformer, which 
> provides the ability to rebuild the original nested documents' structure.
>  In addition, I propose that the transformer also have the ability to 
> return only some of the original hierarchy, to prevent unnecessary block 
> join queries, e.g.
> {code}  {"a": "b", "c": [ {"e": "f"}, {"e": "g"} , {"h": "i"} ]} {code}
>  In case my query is for all the children of "a:b" which contain the key 
> "e" in them, the query will be broken into two parts:
>  1. The parent query "a:b"
>  2. The child query "e:*".
> If the only-children flag is on, the transformer will return the following 
> documents:
>  {code}[ {"e": "f"}, {"e": "g"} ]{code}
> In case the flag was not turned on (perhaps the default state), the whole 
> document hierarchy will be returned, containing only the matching children:
> {code}{"a": "b", "c": [ {"e": "f"}, {"e": "g"} ]}{code}






[jira] [Updated] (SOLR-12519) Support Deeply Nested Docs In Child Documents Transformer

2018-08-23 Thread mosh (JIRA)


 [ 
https://issues.apache.org/jira/browse/SOLR-12519?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

mosh updated SOLR-12519:

Attachment: SOLR-12519-fix-solrj-tests.patch

> Support Deeply Nested Docs In Child Documents Transformer
> -
>
> Key: SOLR-12519
> URL: https://issues.apache.org/jira/browse/SOLR-12519
> Project: Solr
>  Issue Type: Sub-task
>  Security Level: Public(Default Security Level. Issues are Public) 
>Reporter: mosh
>Priority: Major
> Attachments: SOLR-12519-fix-solrj-tests.patch, 
> SOLR-12519-no-commit.patch, SOLR-12519.patch
>
>  Time Spent: 23h 10m
>  Remaining Estimate: 0h
>
> As discussed in SOLR-12298, to make use of the meta-data fields in 
> SOLR-12441, there needs to be a smarter child document transformer, which 
> provides the ability to rebuild the original nested documents' structure.
>  In addition, I propose that the transformer also have the ability to 
> return only some of the original hierarchy, to prevent unnecessary block 
> join queries, e.g.
> {code}  {"a": "b", "c": [ {"e": "f"}, {"e": "g"} , {"h": "i"} ]} {code}
>  In case my query is for all the children of "a:b" which contain the key 
> "e" in them, the query will be broken into two parts:
>  1. The parent query "a:b"
>  2. The child query "e:*".
> If the only-children flag is on, the transformer will return the following 
> documents:
>  {code}[ {"e": "f"}, {"e": "g"} ]{code}
> In case the flag was not turned on (perhaps the default state), the whole 
> document hierarchy will be returned, containing only the matching children:
> {code}{"a": "b", "c": [ {"e": "f"}, {"e": "g"} ]}{code}






[jira] [Commented] (SOLR-12519) Support Deeply Nested Docs In Child Documents Transformer

2018-08-23 Thread mosh (JIRA)


[ 
https://issues.apache.org/jira/browse/SOLR-12519?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16589744#comment-16589744
 ] 

mosh commented on SOLR-12519:
-

{quote}Can you investigate [~moshebla]?{quote}

After further investigation it seems like those tests were failing because 
there was no check whether no children matched the childFilter or whether the 
parent matched the parentFilter, causing an AssertionError. I added a couple 
more conditions to fix this in the new patch I uploaded.

> Support Deeply Nested Docs In Child Documents Transformer
> -
>
> Key: SOLR-12519
> URL: https://issues.apache.org/jira/browse/SOLR-12519
> Project: Solr
>  Issue Type: Sub-task
>  Security Level: Public(Default Security Level. Issues are Public) 
>Reporter: mosh
>Priority: Major
> Attachments: SOLR-12519-no-commit.patch, SOLR-12519.patch
>
>  Time Spent: 23h 10m
>  Remaining Estimate: 0h
>
> As discussed in SOLR-12298, to make use of the meta-data fields in 
> SOLR-12441, there needs to be a smarter child document transformer, which 
> provides the ability to rebuild the original nested documents' structure.
>  In addition, I propose that the transformer also have the ability to 
> return only some of the original hierarchy, to prevent unnecessary block 
> join queries, e.g.
> {code}  {"a": "b", "c": [ {"e": "f"}, {"e": "g"} , {"h": "i"} ]} {code}
>  In case my query is for all the children of "a:b" which contain the key 
> "e" in them, the query will be broken into two parts:
>  1. The parent query "a:b"
>  2. The child query "e:*".
> If the only-children flag is on, the transformer will return the following 
> documents:
>  {code}[ {"e": "f"}, {"e": "g"} ]{code}
> In case the flag was not turned on (perhaps the default state), the whole 
> document hierarchy will be returned, containing only the matching children:
> {code}{"a": "b", "c": [ {"e": "f"}, {"e": "g"} ]}{code}






[JENKINS] Lucene-Solr-Tests-master - Build # 2734 - Unstable

2018-08-23 Thread Apache Jenkins Server
Build: https://builds.apache.org/job/Lucene-Solr-Tests-master/2734/

2 tests failed.
FAILED:  
org.apache.solr.cloud.TestSkipOverseerOperations.testSkipLeaderOperations

Error Message:
Expected 2x1 for collection: collection1 null Live Nodes: 
[127.0.0.1:37905_solr, 127.0.0.1:41497_solr, 127.0.0.1:43536_solr] Last 
available state: null

Stack Trace:
java.lang.AssertionError: Expected 2x1 for collection: collection1
null
Live Nodes: [127.0.0.1:37905_solr, 127.0.0.1:41497_solr, 127.0.0.1:43536_solr]
Last available state: null
at 
__randomizedtesting.SeedInfo.seed([BEDABE3BE9073990:4E306DC7362449FA]:0)
at org.junit.Assert.fail(Assert.java:93)
at 
org.apache.solr.cloud.SolrCloudTestCase.waitForState(SolrCloudTestCase.java:280)
at 
org.apache.solr.cloud.TestSkipOverseerOperations.testSkipLeaderOperations(TestSkipOverseerOperations.java:69)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1737)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:934)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:970)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:984)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:943)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:829)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:879)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:890)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at java.lang.Thread.run(Thread.java:748)


FAILED: