[JENKINS-EA] Lucene-Solr-master-Linux (64bit/jdk-12-ea+12) - Build # 23278 - Still Unstable!
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-master-Linux/23278/
Java: 64bit/jdk-12-ea+12 -XX:+UseCompressedOops -XX:+UseConcMarkSweepGC

118 tests failed.

FAILED: junit.framework.TestSuite.org.apache.solr.analytics.facet.QueryFacetTest

Error Message:
ObjectTracker found 8 object(s) that were not released!!! [ZkStateReader, SolrZkClient, ZkStateReader, ZkStateReader, ZkStateReader, SolrZkClient, SolrZkClient, SolrZkClient]

org.apache.solr.common.util.ObjectReleaseTracker$ObjectTrackerException: org.apache.solr.common.cloud.ZkStateReader
  at org.apache.solr.common.util.ObjectReleaseTracker.track(ObjectReleaseTracker.java:42)
  at org.apache.solr.common.cloud.ZkStateReader.<init>(ZkStateReader.java:328)
  at org.apache.solr.client.solrj.impl.ZkClientClusterStateProvider.connect(ZkClientClusterStateProvider.java:160)
  at org.apache.solr.client.solrj.impl.CloudSolrClient.getZkStateReader(CloudSolrClient.java:342)
  at org.apache.solr.client.solrj.impl.SolrClientCloudManager.<init>(SolrClientCloudManager.java:69)
  at org.apache.solr.cloud.ZkController.getSolrCloudManager(ZkController.java:708)
  at org.apache.solr.core.CoreContainer.createMetricsHistoryHandler(CoreContainer.java:765)
  at org.apache.solr.core.CoreContainer.load(CoreContainer.java:586)
  at org.apache.solr.servlet.SolrDispatchFilter.createCoreContainer(SolrDispatchFilter.java:253)
  at org.apache.solr.servlet.SolrDispatchFilter.init(SolrDispatchFilter.java:173)
  at org.eclipse.jetty.servlet.FilterHolder.initialize(FilterHolder.java:139)
  at org.eclipse.jetty.servlet.ServletHandler.initialize(ServletHandler.java:741)
  at org.eclipse.jetty.servlet.ServletHandler.updateMappings(ServletHandler.java:1477)
  at org.eclipse.jetty.servlet.ServletHandler.setFilterMappings(ServletHandler.java:1542)
  at org.eclipse.jetty.servlet.ServletHandler.addFilterMapping(ServletHandler.java:1186)
  at org.eclipse.jetty.servlet.ServletHandler.addFilterWithMapping(ServletHandler.java:1023)
  at org.eclipse.jetty.servlet.ServletContextHandler.addFilter(ServletContextHandler.java:473)
  at org.apache.solr.client.solrj.embedded.JettySolrRunner$1.lifeCycleStarted(JettySolrRunner.java:344)
  at org.eclipse.jetty.util.component.AbstractLifeCycle.setStarted(AbstractLifeCycle.java:179)
  at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:69)
  at org.apache.solr.client.solrj.embedded.JettySolrRunner.retryOnPortBindFailure(JettySolrRunner.java:507)
  at org.apache.solr.client.solrj.embedded.JettySolrRunner.start(JettySolrRunner.java:446)
  at org.apache.solr.client.solrj.embedded.JettySolrRunner.start(JettySolrRunner.java:414)
  at org.apache.solr.cloud.MiniSolrCloudCluster.startJettySolrRunner(MiniSolrCloudCluster.java:443)
  at org.apache.solr.cloud.MiniSolrCloudCluster.lambda$new$0(MiniSolrCloudCluster.java:272)
  at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
  at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:209)
  at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
  at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
  at java.base/java.lang.Thread.run(Thread.java:835)

org.apache.solr.common.util.ObjectReleaseTracker$ObjectTrackerException: org.apache.solr.common.cloud.SolrZkClient
  at org.apache.solr.common.util.ObjectReleaseTracker.track(ObjectReleaseTracker.java:42)
  at org.apache.solr.common.cloud.SolrZkClient.<init>(SolrZkClient.java:203)
  at org.apache.solr.common.cloud.SolrZkClient.<init>(SolrZkClient.java:126)
  at org.apache.solr.common.cloud.SolrZkClient.<init>(SolrZkClient.java:116)
  at org.apache.solr.common.cloud.ZkStateReader.<init>(ZkStateReader.java:306)
  at org.apache.solr.client.solrj.impl.ZkClientClusterStateProvider.connect(ZkClientClusterStateProvider.java:160)
  at org.apache.solr.client.solrj.impl.CloudSolrClient.getZkStateReader(CloudSolrClient.java:342)
  at org.apache.solr.client.solrj.impl.SolrClientCloudManager.<init>(SolrClientCloudManager.java:69)
  at org.apache.solr.cloud.ZkController.getSolrCloudManager(ZkController.java:708)
  at org.apache.solr.cloud.Overseer.getSolrCloudManager(Overseer.java:590)
  at org.apache.solr.cloud.api.collections.OverseerCollectionMessageHandler.<init>(OverseerCollectionMessageHandler.java:238)
  at org.apache.solr.cloud.OverseerCollectionConfigSetProcessor.getOverseerMessageHandlerSelector(OverseerCollectionConfigSetProcessor.java:87)
  at org.apache.solr.cloud.OverseerCollectionConfigSetProcessor.<init>(OverseerCollectionConfigSetProcessor.java:70)
  at org.apache.solr.cloud.OverseerCollectionConfigSetProcessor.<init>(OverseerCollectionConfigSetProcessor.java:41)
  at org.apache.solr.cloud.Overseer.start(Overseer.java:561)
  at org.apache.solr.cloud.OverseerElectionContext.runLeaderProcess(ElectionContext.java:739)
  at org.apache.solr.cloud.LeaderElector.runIamLeaderProcess(LeaderElector.java:171)
  at
[JENKINS] Lucene-Solr-SmokeRelease-master - Build # 1196 - Failure
Build: https://builds.apache.org/job/Lucene-Solr-SmokeRelease-master/1196/

No tests ran.

Build Log:
[...truncated 23440 lines...]
[asciidoctor:convert] asciidoctor: ERROR: about-this-guide.adoc: line 1: invalid part, must have at least one section (e.g., chapter, appendix, etc.)
[asciidoctor:convert] asciidoctor: ERROR: solr-glossary.adoc: line 1: invalid part, must have at least one section (e.g., chapter, appendix, etc.)
     [java] Processed 2443 links (1994 relative) to 3215 anchors in 248 files
     [echo] Validated Links & Anchors via: /x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/solr/build/solr-ref-guide/bare-bones-html/

-dist-changes:
     [copy] Copying 4 files to /x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/solr/package/changes

package:

-unpack-solr-tgz:

-ensure-solr-tgz-exists:
    [mkdir] Created dir: /x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/solr/build/solr.tgz.unpacked
    [untar] Expanding: /x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/solr/package/solr-8.0.0.tgz into /x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/solr/build/solr.tgz.unpacked

generate-maven-artifacts:

resolve:

resolve:

ivy-availability-check:
[loadresource] Do not set property disallowed.ivy.jars.list as its length is 0.

-ivy-fail-disallowed-ivy-version:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = /x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/lucene/top-level-ivy-settings.xml

resolve:

ivy-availability-check:
[loadresource] Do not set property disallowed.ivy.jars.list as its length is 0.

-ivy-fail-disallowed-ivy-version:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = /x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/lucene/top-level-ivy-settings.xml

resolve:

resolve:

ivy-availability-check:
[loadresource] Do not set property disallowed.ivy.jars.list as its length is 0.
[jira] [Commented] (LUCENE-8575) Improve toString() in SegmentInfo
[ https://issues.apache.org/jira/browse/LUCENE-8575?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16704322#comment-16704322 ]

Lucene/Solr QA commented on LUCENE-8575:

| (x) *{color:red}-1 overall{color}* |
\\ \\
|| Vote || Subsystem || Runtime || Comment ||
|| || || || {color:brown} Prechecks {color} ||
| {color:red}-1{color} | {color:red} test4tests {color} | {color:red} 0m 0s{color} | {color:red} The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. {color} |
|| || || || {color:brown} master Compile Tests {color} ||
| {color:green}+1{color} | {color:green} compile {color} | {color:green} 0m 53s{color} | {color:green} master passed {color} |
|| || || || {color:brown} Patch Compile Tests {color} ||
| {color:green}+1{color} | {color:green} compile {color} | {color:green} 0m 24s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} javac {color} | {color:green} 0m 24s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} Release audit (RAT) {color} | {color:green} 0m 23s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} Check forbidden APIs {color} | {color:green} 0m 19s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} Validate source patterns {color} | {color:green} 0m 19s{color} | {color:green} the patch passed {color} |
|| || || || {color:brown} Other Tests {color} ||
| {color:green}+1{color} | {color:green} unit {color} | {color:green} 27m 52s{color} | {color:green} core in the patch passed. {color} |
| {color:green}+1{color} | {color:green} unit {color} | {color:green} 3m 50s{color} | {color:green} test-framework in the patch passed. {color} |
| {color:black}{color} | {color:black} {color} | {color:black} 37m 41s{color} | {color:black} {color} |
\\ \\
|| Subsystem || Report/Notes ||
| JIRA Issue | LUCENE-8575 |
| JIRA Patch URL | https://issues.apache.org/jira/secure/attachment/12949850/LUCENE-8575.patch |
| Optional Tests | compile javac unit ratsources checkforbiddenapis validatesourcepatterns |
| uname | Linux lucene2-us-west.apache.org 4.4.0-112-generic #135-Ubuntu SMP Fri Jan 19 11:48:36 UTC 2018 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | ant |
| Personality | /home/jenkins/jenkins-slave/workspace/PreCommit-LUCENE-Build/sourcedir/dev-tools/test-patch/lucene-solr-yetus-personality.sh |
| git revision | master / 75b1831 |
| ant | version: Apache Ant(TM) version 1.9.6 compiled on July 20 2018 |
| Default Java | 1.8.0_191 |
| Test Results | https://builds.apache.org/job/PreCommit-LUCENE-Build/128/testReport/ |
| modules | C: lucene/core lucene/test-framework U: lucene |
| Console output | https://builds.apache.org/job/PreCommit-LUCENE-Build/128/console |
| Powered by | Apache Yetus 0.7.0 http://yetus.apache.org |

This message was automatically generated.

> Improve toString() in SegmentInfo
> ---------------------------------
>
>                 Key: LUCENE-8575
>                 URL: https://issues.apache.org/jira/browse/LUCENE-8575
>             Project: Lucene - Core
>          Issue Type: Improvement
>          Components: core/index
>            Reporter: Namgyu Kim
>            Priority: Major
>         Attachments: LUCENE-8575.patch, LUCENE-8575.patch
>
> I saw the following code in the SegmentInfo class.
> {code:java}
> // TODO: we could append toString of attributes() here?
> {code}
> Of course, we can.
>
> So I wrote code for that part.
> {code:java}
> public String toString(int delCount) {
>   StringBuilder s = new StringBuilder();
>   s.append(name).append('(').append(version == null ? "?" : version).append(')').append(':');
>   char cfs = getUseCompoundFile() ? 'c' : 'C';
>   s.append(cfs);
>   s.append(maxDoc);
>   if (delCount != 0) {
>     s.append('/').append(delCount);
>   }
>   if (indexSort != null) {
>     s.append(":[indexSort=");
>     s.append(indexSort);
>     s.append(']');
>   }
>   // New Code
>   if (!diagnostics.isEmpty()) {
>     s.append(":[diagnostics=");
>     for (Map.Entry<String, String> entry : diagnostics.entrySet())
>       s.append("<").append(entry.getKey()).append(",").append(entry.getValue()).append(">,");
>     s.setLength(s.length() - 1);
>     s.append(']');
>   }
>   // New Code
>   if (!attributes.isEmpty()) {
>     s.append(":[attributes=");
>     for (Map.Entry<String, String> entry : attributes.entrySet())
>       s.append("<").append(entry.getKey()).append(",").append(entry.getValue()).append(">,");
>     s.setLength(s.length() - 1);
>     s.append(']');
>   }
>   return s.toString();
> }
> {code}

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail:
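The core of the quoted patch is the way it serializes the diagnostics and attributes maps as `:[label=<k,v>,<k,v>]`. Here is a minimal, standalone sketch of just that formatting step, with a hypothetical `appendMap` helper (this is not Lucene's actual `SegmentInfo` code, only an illustration of the append logic):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class ToStringSketch {
    // Mirrors the patch's map formatting: ":[label=<k1,v1>,<k2,v2>]",
    // or the empty string when the map has no entries.
    static String appendMap(String label, Map<String, String> map) {
        StringBuilder s = new StringBuilder();
        if (!map.isEmpty()) {
            s.append(":[").append(label).append('=');
            for (Map.Entry<String, String> entry : map.entrySet()) {
                s.append('<').append(entry.getKey()).append(',')
                 .append(entry.getValue()).append('>').append(',');
            }
            s.setLength(s.length() - 1); // drop the trailing comma, as the patch does
            s.append(']');
        }
        return s.toString();
    }

    public static void main(String[] args) {
        Map<String, String> diagnostics = new LinkedHashMap<>();
        diagnostics.put("source", "flush");
        diagnostics.put("os", "Linux");
        System.out.println(appendMap("diagnostics", diagnostics));
        // prints: :[diagnostics=<source,flush>,<os,Linux>]
    }
}
```

Note the `setLength(s.length() - 1)` trick relies on the enclosing `isEmpty()` guard: with at least one entry there is always a trailing comma to remove.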
[jira] [Commented] (LUCENE-8542) Provide the LeafSlice to CollectorManager.newCollector to save memory on small index slices
[ https://issues.apache.org/jira/browse/LUCENE-8542?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16704304#comment-16704304 ]

Lucene/Solr QA commented on LUCENE-8542:

| (/) *{color:green}+1 overall{color}* |
\\ \\
|| Vote || Subsystem || Runtime || Comment ||
|| || || || {color:brown} Prechecks {color} ||
| {color:green}+1{color} | {color:green} test4tests {color} | {color:green} 0m 0s{color} | {color:green} The patch appears to include 1 new or modified test files. {color} |
|| || || || {color:brown} master Compile Tests {color} ||
| {color:green}+1{color} | {color:green} compile {color} | {color:green} 1m 38s{color} | {color:green} master passed {color} |
|| || || || {color:brown} Patch Compile Tests {color} ||
| {color:green}+1{color} | {color:green} compile {color} | {color:green} 0m 51s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} javac {color} | {color:green} 0m 51s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} Release audit (RAT) {color} | {color:green} 0m 30s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} Check forbidden APIs {color} | {color:green} 0m 20s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} Validate source patterns {color} | {color:green} 0m 20s{color} | {color:green} the patch passed {color} |
|| || || || {color:brown} Other Tests {color} ||
| {color:green}+1{color} | {color:green} unit {color} | {color:green} 30m 12s{color} | {color:green} core in the patch passed. {color} |
| {color:green}+1{color} | {color:green} unit {color} | {color:green} 3m 10s{color} | {color:green} facet in the patch passed. {color} |
| {color:black}{color} | {color:black} {color} | {color:black} 40m 30s{color} | {color:black} {color} |
\\ \\
|| Subsystem || Report/Notes ||
| JIRA Issue | LUCENE-8542 |
| JIRA Patch URL | https://issues.apache.org/jira/secure/attachment/12950019/LUCENE-8542.patch |
| Optional Tests | compile javac unit ratsources checkforbiddenapis validatesourcepatterns |
| uname | Linux lucene2-us-west.apache.org 4.4.0-112-generic #135-Ubuntu SMP Fri Jan 19 11:48:36 UTC 2018 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | ant |
| Personality | /home/jenkins/jenkins-slave/workspace/PreCommit-LUCENE-Build/sourcedir/dev-tools/test-patch/lucene-solr-yetus-personality.sh |
| git revision | master / 75b1831 |
| ant | version: Apache Ant(TM) version 1.9.6 compiled on July 20 2018 |
| Default Java | 1.8.0_191 |
| Test Results | https://builds.apache.org/job/PreCommit-LUCENE-Build/127/testReport/ |
| modules | C: lucene lucene/core lucene/facet U: lucene |
| Console output | https://builds.apache.org/job/PreCommit-LUCENE-Build/127/console |
| Powered by | Apache Yetus 0.7.0 http://yetus.apache.org |

This message was automatically generated.

> Provide the LeafSlice to CollectorManager.newCollector to save memory on small index slices
> -------------------------------------------------------------------------------------------
>
>                 Key: LUCENE-8542
>                 URL: https://issues.apache.org/jira/browse/LUCENE-8542
>             Project: Lucene - Core
>          Issue Type: Improvement
>          Components: core/search
>            Reporter: Christoph Kaser
>            Priority: Minor
>         Attachments: LUCENE-8542.patch
>
> I have an index consisting of 44 million documents spread across 60 segments. When I run a query against this index with a huge number of results requested (e.g. 5 million), the query uses more than 5 GB of heap if the IndexSearcher was configured to use an ExecutorService.
> (I know this kind of query is fairly unusual and it would be better to use paging and searchAfter, but our architecture does not allow this at the moment.)
> The reason for the huge memory requirement is that the search [will create a TopScoreDocCollector for each segment|https://github.com/apache/lucene-solr/blob/master/lucene/core/src/java/org/apache/lucene/search/IndexSearcher.java#L404], each one with numHits = 5 million. This is fine for the large segments, but many of those segments are fairly small and only contain several thousand documents. This wastes a huge amount of memory for queries with large values of numHits on indices with many segments.
> Therefore, I propose to change the CollectorManager interface in the following way:
> * change the method newCollector to accept a parameter LeafSlice that can be used to determine the total count of documents in the LeafSlice
> * Maybe, in order to remain backwards compatible, it would be possible to introduce this as a new method with a default implementation that calls the old method - otherwise, it probably has to wait for Lucene 8?
> * This can then be used to cap numHits for each
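The backwards-compatible variant proposed above (a new `newCollector(LeafSlice)` with a default implementation that falls back to the old zero-arg method) could be sketched roughly as below. `LeafSlice` here is a simplified stand-in, not Lucene's real class, and the "collector" is reduced to its allocated capacity just to show the per-slice capping idea:

```java
import java.util.Arrays;

public class CollectorManagerSketch {
    // Stand-in for Lucene's LeafSlice: just enough to expose a doc count.
    static class LeafSlice {
        final int[] leafDocCounts;
        LeafSlice(int... leafDocCounts) { this.leafDocCounts = leafDocCounts; }
        int totalDocCount() { return Arrays.stream(leafDocCounts).sum(); }
    }

    interface CollectorManager<C> {
        C newCollector();                         // existing method
        default C newCollector(LeafSlice slice) { // proposed overload
            return newCollector();                // old behavior if not overridden
        }
    }

    // A manager that caps the collector capacity by the slice's doc count,
    // so a 3000-doc slice never allocates room for 5 million hits.
    static class CappedManager implements CollectorManager<Integer> {
        final int numHits;
        CappedManager(int numHits) { this.numHits = numHits; }
        @Override public Integer newCollector() { return numHits; }
        @Override public Integer newCollector(LeafSlice slice) {
            return Math.min(numHits, slice.totalDocCount());
        }
    }

    public static void main(String[] args) {
        CappedManager m = new CappedManager(5_000_000);
        System.out.println(m.newCollector(new LeafSlice(1000, 2000)));
        // prints: 3000
    }
}
```

Because the overload has a default body, existing `CollectorManager` implementations keep compiling unchanged; only managers that care about slice size need to override it.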
[JENKINS] Lucene-Solr-7.x-Linux (64bit/jdk-9.0.4) - Build # 3159 - Still Unstable!
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-7.x-Linux/3159/
Java: 64bit/jdk-9.0.4 -XX:-UseCompressedOops -XX:+UseSerialGC

5 tests failed.

FAILED: org.apache.solr.cloud.TestTlogReplica.testRecovery

Error Message:
Can not find doc 7 in https://127.0.0.1:42935/solr

Stack Trace:
java.lang.AssertionError: Can not find doc 7 in https://127.0.0.1:42935/solr
  at __randomizedtesting.SeedInfo.seed([FD68426C345E6399:3C983BC0190EA93E]:0)
  at org.junit.Assert.fail(Assert.java:93)
  at org.junit.Assert.assertTrue(Assert.java:43)
  at org.junit.Assert.assertNotNull(Assert.java:526)
  at org.apache.solr.cloud.TestTlogReplica.checkRTG(TestTlogReplica.java:902)
  at org.apache.solr.cloud.TestTlogReplica.testRecovery(TestTlogReplica.java:567)
  at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
  at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  at java.base/java.lang.reflect.Method.invoke(Method.java:564)
  at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1742)
  at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:935)
  at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:971)
  at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:985)
  at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
  at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
  at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
  at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
  at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
  at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
  at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
  at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
  at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
  at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
  at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:944)
  at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:830)
  at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:880)
  at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:891)
  at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
  at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
  at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
  at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
  at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
  at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
  at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
  at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
  at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
  at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
  at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
  at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
  at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54)
  at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
  at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
  at java.base/java.lang.Thread.run(Thread.java:844)

FAILED: org.apache.solr.cloud.TestTlogReplica.testRecovery

Error Message:
Can not find doc 7 in https://127.0.0.1:43821/solr

Stack Trace:
[JENKINS] Lucene-Solr-master-MacOSX (64bit/jdk-9) - Build # 4955 - Unstable!
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-master-MacOSX/4955/
Java: 64bit/jdk-9 -XX:-UseCompressedOops -XX:+UseSerialGC

88 tests failed.

FAILED: junit.framework.TestSuite.org.apache.solr.schema.TestICUCollationFieldOptions

Error Message:
2 threads leaked from SUITE scope at org.apache.solr.schema.TestICUCollationFieldOptions:
1) Thread[id=18, name=SolrRrdBackendFactory-7-thread-1, state=TIMED_WAITING, group=TGRP-TestICUCollationFieldOptions]
  at java.base@9/jdk.internal.misc.Unsafe.park(Native Method)
  at java.base@9/java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:234)
  at java.base@9/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2104)
  at java.base@9/java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:1131)
  at java.base@9/java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:848)
  at java.base@9/java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1092)
  at java.base@9/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1152)
  at java.base@9/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:641)
  at java.base@9/java.lang.Thread.run(Thread.java:844)
2) Thread[id=19, name=MetricsHistoryHandler-8-thread-1, state=TIMED_WAITING, group=TGRP-TestICUCollationFieldOptions]
  at java.base@9/jdk.internal.misc.Unsafe.park(Native Method)
  at java.base@9/java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:234)
  at java.base@9/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2104)
  at java.base@9/java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:1131)
  at java.base@9/java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:848)
  at java.base@9/java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1092)
  at java.base@9/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1152)
  at java.base@9/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:641)
  at java.base@9/java.lang.Thread.run(Thread.java:844)

Stack Trace:
com.carrotsearch.randomizedtesting.ThreadLeakError: 2 threads leaked from SUITE scope at org.apache.solr.schema.TestICUCollationFieldOptions:
1) Thread[id=18, name=SolrRrdBackendFactory-7-thread-1, state=TIMED_WAITING, group=TGRP-TestICUCollationFieldOptions]
  at java.base@9/jdk.internal.misc.Unsafe.park(Native Method)
  at java.base@9/java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:234)
  at java.base@9/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2104)
  at java.base@9/java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:1131)
  at java.base@9/java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:848)
  at java.base@9/java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1092)
  at java.base@9/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1152)
  at java.base@9/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:641)
  at java.base@9/java.lang.Thread.run(Thread.java:844)
2) Thread[id=19, name=MetricsHistoryHandler-8-thread-1, state=TIMED_WAITING, group=TGRP-TestICUCollationFieldOptions]
  at java.base@9/jdk.internal.misc.Unsafe.park(Native Method)
  at java.base@9/java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:234)
  at java.base@9/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2104)
  at java.base@9/java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:1131)
  at java.base@9/java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:848)
  at java.base@9/java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1092)
  at java.base@9/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1152)
  at java.base@9/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:641)
  at java.base@9/java.lang.Thread.run(Thread.java:844)
  at __randomizedtesting.SeedInfo.seed([88F115D5295FC7AA]:0)

FAILED: junit.framework.TestSuite.org.apache.solr.schema.TestICUCollationFieldOptions

Error Message:
There are still zombie threads that
[JENKINS] Lucene-Solr-repro - Build # 2047 - Unstable
Build: https://builds.apache.org/job/Lucene-Solr-repro/2047/

[...truncated 29 lines...]
[repro] Jenkins log URL: https://builds.apache.org/job/Lucene-Solr-NightlyTests-7.x/390/consoleText
[repro] Revision: 5e2db9eca3bcb897673855ef9392735ab6c64186
[repro] Ant options: -Dtests.multiplier=2 -Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-7.x/test-data/enwiki.random.lines.txt
[repro] Repro line: ant test -Dtestcase=RollingRestartTest -Dtests.method=test -Dtests.seed=538E2CEB11D6E6D3 -Dtests.multiplier=2 -Dtests.nightly=true -Dtests.slow=true -Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-7.x/test-data/enwiki.random.lines.txt -Dtests.locale=lv -Dtests.timezone=US/Aleutian -Dtests.asserts=true -Dtests.file.encoding=US-ASCII
[repro] Repro line: ant test -Dtestcase=IndexSizeTriggerTest -Dtests.method=testSplitIntegration -Dtests.seed=538E2CEB11D6E6D3 -Dtests.multiplier=2 -Dtests.nightly=true -Dtests.slow=true -Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-7.x/test-data/enwiki.random.lines.txt -Dtests.locale=sr-Latn-BA -Dtests.timezone=US/Pacific-New -Dtests.asserts=true -Dtests.file.encoding=US-ASCII
[repro] Repro line: ant test -Dtestcase=LIROnShardRestartTest -Dtests.method=testAllReplicasInLIR -Dtests.seed=538E2CEB11D6E6D3 -Dtests.multiplier=2 -Dtests.nightly=true -Dtests.slow=true -Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-7.x/test-data/enwiki.random.lines.txt -Dtests.locale=es-CR -Dtests.timezone=America/Phoenix -Dtests.asserts=true -Dtests.file.encoding=US-ASCII
[repro] Repro line: ant test -Dtestcase=RestartWhileUpdatingTest -Dtests.method=test -Dtests.seed=538E2CEB11D6E6D3 -Dtests.multiplier=2 -Dtests.nightly=true -Dtests.slow=true -Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-7.x/test-data/enwiki.random.lines.txt -Dtests.locale=sr-CS -Dtests.timezone=Europe/Bucharest -Dtests.asserts=true -Dtests.file.encoding=US-ASCII
[repro] Repro line: ant test -Dtestcase=RestartWhileUpdatingTest -Dtests.seed=538E2CEB11D6E6D3 -Dtests.multiplier=2 -Dtests.nightly=true -Dtests.slow=true -Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-7.x/test-data/enwiki.random.lines.txt -Dtests.locale=sr-CS -Dtests.timezone=Europe/Bucharest -Dtests.asserts=true -Dtests.file.encoding=US-ASCII
[repro] Repro line: ant test -Dtestcase=CloudSolrClientTest -Dtests.method=testRouting -Dtests.seed=EFD828841379AA4 -Dtests.multiplier=2 -Dtests.nightly=true -Dtests.slow=true -Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-7.x/test-data/enwiki.random.lines.txt -Dtests.locale=vi-VN -Dtests.timezone=EAT -Dtests.asserts=true -Dtests.file.encoding=US-ASCII
[repro] Repro line: ant test -Dtestcase=CloudSolrClientTest -Dtests.method=testVersionsAreReturned -Dtests.seed=EFD828841379AA4 -Dtests.multiplier=2 -Dtests.nightly=true -Dtests.slow=true -Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-7.x/test-data/enwiki.random.lines.txt -Dtests.locale=vi-VN -Dtests.timezone=EAT -Dtests.asserts=true -Dtests.file.encoding=US-ASCII
[repro] git rev-parse --abbrev-ref HEAD
[repro] git rev-parse HEAD
[repro] Initial local git branch/revision: 75b183196798232aa6f2dcb117f309119053
[repro] git fetch
[repro] git checkout 5e2db9eca3bcb897673855ef9392735ab6c64186
[...truncated 2 lines...]
[repro] git merge --ff-only
[...truncated 1 lines...]
[repro] ant clean
[...truncated 6 lines...]
[repro] Test suites by module:
[repro]    solr/core
[repro]       RollingRestartTest
[repro]       LIROnShardRestartTest
[repro]       RestartWhileUpdatingTest
[repro]       IndexSizeTriggerTest
[repro]    solr/solrj
[repro]       CloudSolrClientTest
[repro] ant compile-test
[...truncated 3587 lines...]
[repro] ant test-nocompile -Dtests.dups=5 -Dtests.maxfailures=20 -Dtests.class="*.RollingRestartTest|*.LIROnShardRestartTest|*.RestartWhileUpdatingTest|*.IndexSizeTriggerTest" -Dtests.showOutput=onerror -Dtests.multiplier=2 -Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-7.x/test-data/enwiki.random.lines.txt -Dtests.seed=538E2CEB11D6E6D3 -Dtests.multiplier=2 -Dtests.nightly=true -Dtests.slow=true -Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-7.x/test-data/enwiki.random.lines.txt -Dtests.locale=lv -Dtests.timezone=US/Aleutian -Dtests.asserts=true -Dtests.file.encoding=US-ASCII
[...truncated 122772 lines...]
[repro] Setting last failure code to 256
[repro] ant compile-test
[...truncated 454 lines...]
[repro] ant test-nocompile -Dtests.dups=5 -Dtests.maxfailures=5 -Dtests.class="*.CloudSolrClientTest" -Dtests.showOutput=onerror -Dtests.multiplier=2
[JENKINS] Lucene-Solr-BadApples-Tests-7.x - Build # 228 - Still Unstable
Build: https://builds.apache.org/job/Lucene-Solr-BadApples-Tests-7.x/228/ 7 tests failed. FAILED: org.apache.solr.cloud.BasicDistributedZkTest.test Error Message: Timeout occured while waiting response from server at: http://127.0.0.1:40708/collection1 Stack Trace: org.apache.solr.client.solrj.SolrServerException: Timeout occured while waiting response from server at: http://127.0.0.1:40708/collection1 at __randomizedtesting.SeedInfo.seed([2B85C6E3BCB5E12D:A3D1F93912498CD5]:0) at org.apache.solr.client.solrj.impl.HttpSolrClient.executeMethod(HttpSolrClient.java:654) at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:255) at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:244) at org.apache.solr.client.solrj.SolrRequest.process(SolrRequest.java:194) at org.apache.solr.client.solrj.SolrRequest.process(SolrRequest.java:211) at org.apache.solr.BaseDistributedSearchTestCase.add(BaseDistributedSearchTestCase.java:509) at org.apache.solr.BaseDistributedSearchTestCase.indexDoc(BaseDistributedSearchTestCase.java:498) at org.apache.solr.cloud.BasicDistributedZkTest.test(BasicDistributedZkTest.java:176) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1742) at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:935) at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:971) at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:985) at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsFixedStatement.callStatement(BaseDistributedSearchTestCase.java:1010) at 
org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsStatement.evaluate(BaseDistributedSearchTestCase.java:985) at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57) at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45) at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48) at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64) at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368) at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817) at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468) at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:944) at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:830) at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:880) at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:891) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at 
org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at
[JENKINS] Lucene-Solr-http2-Linux (64bit/jdk1.8.0_172) - Build # 13 - Still Failing!
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-http2-Linux/13/ Java: 64bit/jdk1.8.0_172 -XX:-UseCompressedOops -XX:+UseG1GC All tests passed Build Log: [...truncated 25440 lines...] check-licenses: [echo] License check under: /home/jenkins/workspace/Lucene-Solr-http2-Linux/lucene [licenses] MISSING sha1 checksum file for: /home/jenkins/workspace/Lucene-Solr-http2-Linux/lucene/replicator/lib/jetty-continuation-9.4.14.v20181114.jar [licenses] EXPECTED sha1 checksum file : /home/jenkins/workspace/Lucene-Solr-http2-Linux/lucene/licenses/jetty-continuation-9.4.14.v20181114.jar.sha1 [licenses] MISSING sha1 checksum file for: /home/jenkins/workspace/Lucene-Solr-http2-Linux/lucene/replicator/lib/jetty-http-9.4.14.v20181114.jar [licenses] EXPECTED sha1 checksum file : /home/jenkins/workspace/Lucene-Solr-http2-Linux/lucene/licenses/jetty-http-9.4.14.v20181114.jar.sha1 [licenses] MISSING sha1 checksum file for: /home/jenkins/workspace/Lucene-Solr-http2-Linux/lucene/replicator/lib/jetty-io-9.4.14.v20181114.jar [licenses] EXPECTED sha1 checksum file : /home/jenkins/workspace/Lucene-Solr-http2-Linux/lucene/licenses/jetty-io-9.4.14.v20181114.jar.sha1 [licenses] MISSING sha1 checksum file for: /home/jenkins/workspace/Lucene-Solr-http2-Linux/lucene/replicator/lib/jetty-server-9.4.14.v20181114.jar [licenses] EXPECTED sha1 checksum file : /home/jenkins/workspace/Lucene-Solr-http2-Linux/lucene/licenses/jetty-server-9.4.14.v20181114.jar.sha1 [licenses] MISSING sha1 checksum file for: /home/jenkins/workspace/Lucene-Solr-http2-Linux/lucene/replicator/lib/jetty-servlet-9.4.14.v20181114.jar [licenses] EXPECTED sha1 checksum file : /home/jenkins/workspace/Lucene-Solr-http2-Linux/lucene/licenses/jetty-servlet-9.4.14.v20181114.jar.sha1 [...truncated 3 lines...] 
BUILD FAILED /home/jenkins/workspace/Lucene-Solr-http2-Linux/build.xml:633: The following error occurred while executing this line: /home/jenkins/workspace/Lucene-Solr-http2-Linux/build.xml:117: The following error occurred while executing this line: /home/jenkins/workspace/Lucene-Solr-http2-Linux/lucene/build.xml:90: The following error occurred while executing this line: /home/jenkins/workspace/Lucene-Solr-http2-Linux/lucene/tools/custom-tasks.xml:62: License check failed. Check the logs. If you recently modified ivy-versions.properties or any module's ivy.xml, make sure you run "ant clean-jars jar-checksums" before running precommit. Total time: 85 minutes 51 seconds Build step 'Invoke Ant' marked build as failure Archiving artifacts Setting ANT_1_8_2_HOME=/var/lib/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2 [WARNINGS] Skipping publisher since build result is FAILURE Recording test results Setting ANT_1_8_2_HOME=/var/lib/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2 Email was triggered for: Failure - Any Sending email for trigger: Failure - Any Setting ANT_1_8_2_HOME=/var/lib/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2 Setting ANT_1_8_2_HOME=/var/lib/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2 Setting ANT_1_8_2_HOME=/var/lib/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2 Setting ANT_1_8_2_HOME=/var/lib/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2 - To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org For additional commands, e-mail: dev-h...@lucene.apache.org
[jira] [Commented] (SOLR-12209) add Paging Streaming Expression
[ https://issues.apache.org/jira/browse/SOLR-12209?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16704157#comment-16704157 ] Lucene/Solr QA commented on SOLR-12209: --- | (/) *{color:green}+1 overall{color}* | \\ \\ || Vote || Subsystem || Runtime || Comment || || || || || {color:brown} Prechecks {color} || | {color:green}+1{color} | {color:green} test4tests {color} | {color:green} 0m 0s{color} | {color:green} The patch appears to include 2 new or modified test files. {color} | || || || || {color:brown} master Compile Tests {color} || | {color:green}+1{color} | {color:green} compile {color} | {color:green} 1m 21s{color} | {color:green} master passed {color} | || || || || {color:brown} Patch Compile Tests {color} || | {color:green}+1{color} | {color:green} compile {color} | {color:green} 1m 13s{color} | {color:green} the patch passed {color} | | {color:green}+1{color} | {color:green} javac {color} | {color:green} 1m 13s{color} | {color:green} the patch passed {color} | | {color:green}+1{color} | {color:green} Release audit (RAT) {color} | {color:green} 1m 13s{color} | {color:green} the patch passed {color} | | {color:green}+1{color} | {color:green} Check forbidden APIs {color} | {color:green} 1m 13s{color} | {color:green} the patch passed {color} | | {color:green}+1{color} | {color:green} Validate source patterns {color} | {color:green} 1m 13s{color} | {color:green} the patch passed {color} | || || || || {color:brown} Other Tests {color} || | {color:green}+1{color} | {color:green} unit {color} | {color:green} 4m 17s{color} | {color:green} solrj in the patch passed. 
{color} | | {color:black}{color} | {color:black} {color} | {color:black} 9m 40s{color} | {color:black} {color} | \\ \\ || Subsystem || Report/Notes || | JIRA Issue | SOLR-12209 | | JIRA Patch URL | https://issues.apache.org/jira/secure/attachment/12950010/SOLR-12209.patch | | Optional Tests | compile javac unit ratsources checkforbiddenapis validatesourcepatterns | | uname | Linux lucene1-us-west 4.4.0-137-generic #163~14.04.1-Ubuntu SMP Mon Sep 24 17:14:57 UTC 2018 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | ant | | Personality | /home/jenkins/jenkins-slave/workspace/PreCommit-SOLR-Build/sourcedir/dev-tools/test-patch/lucene-solr-yetus-personality.sh | | git revision | master / 75b1831 | | ant | version: Apache Ant(TM) version 1.9.3 compiled on July 24 2018 | | Default Java | 1.8.0_191 | | Test Results | https://builds.apache.org/job/PreCommit-SOLR-Build/236/testReport/ | | modules | C: solr/solrj U: solr/solrj | | Console output | https://builds.apache.org/job/PreCommit-SOLR-Build/236/console | | Powered by | Apache Yetus 0.7.0 http://yetus.apache.org | This message was automatically generated. > add Paging Streaming Expression > --- > > Key: SOLR-12209 > URL: https://issues.apache.org/jira/browse/SOLR-12209 > Project: Solr > Issue Type: New Feature > Security Level: Public(Default Security Level. Issues are Public) > Components: streaming expressions >Reporter: mosh >Priority: Major > Attachments: SOLR-12209.patch, SOLR-12209.patch, SOLR-12209.patch, > SOLR-12209.patch > > > Currently the closest streaming expression that allows some sort of > pagination is top. > I propose we add a new streaming expression, which is based on the > RankedStream class to add offset to the stream. currently it can only be done > in code by reading the stream until the desired offset is reached. 
> The new expression will be used as such: > {{paging(rows=3, search(collection1, q="*:*", qt="/export", > fl="id,a_s,a_i,a_f", sort="a_f desc, a_i desc"), sort="a_f asc, a_i asc", > start=100)}} > This will offset the returned stream by 100 documents. > > [~joel.bernstein] what do you think? -- This message was sent by Atlassian JIRA (v7.6.3#76005)
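The workaround the issue description mentions ("reading the stream until the desired offset is reached") amounts to a skip-then-take loop on the client. The sketch below illustrates that pattern with a plain `Iterator` standing in for a SolrJ `TupleStream`; the class and method names are illustrative, not part of Solr or of the proposed patch.

```java
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;
import java.util.stream.IntStream;

// Client-side paging over an already-sorted stream: read and discard the
// first `start` items, then keep the next `rows`. This is the manual
// workaround that a paging() streaming expression would replace.
public class SkipTake {
    static <T> List<T> page(Iterator<T> stream, int start, int rows) {
        List<T> out = new ArrayList<>();
        int seen = 0;
        while (stream.hasNext() && out.size() < rows) {
            T t = stream.next();
            if (seen++ >= start) {   // everything before `start` is read, then dropped
                out.add(t);
            }
        }
        return out;
    }

    public static void main(String[] args) {
        Iterator<Integer> docs = IntStream.range(0, 10).iterator();
        System.out.println(page(docs, 3, 2));  // prints [3, 4]
    }
}
```

Note the cost this highlights: the offset documents are still transferred and deserialized, which is why pushing the offset into the expression itself is attractive.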
[JENKINS] Lucene-Solr-http2-Windows (64bit/jdk1.8.0_172) - Build # 6 - Still Failing!
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-http2-Windows/6/ Java: 64bit/jdk1.8.0_172 -XX:-UseCompressedOops -XX:+UseParallelGC 4 tests failed. FAILED: org.apache.solr.cloud.MissingSegmentRecoveryTest.testLeaderRecovery Error Message: IOException occured when talking to server at: https://127.0.0.1:63624/solr Stack Trace: org.apache.solr.client.solrj.SolrServerException: IOException occured when talking to server at: https://127.0.0.1:63624/solr at __randomizedtesting.SeedInfo.seed([29A2D5CE81E865E5:79F74DCDD8C9D3F8]:0) at org.apache.solr.client.solrj.impl.HttpSolrClient.executeMethod(HttpSolrClient.java:657) at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:255) at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:244) at org.apache.solr.client.solrj.impl.LBSolrClient.doRequest(LBSolrClient.java:367) at org.apache.solr.client.solrj.impl.LBSolrClient.request(LBSolrClient.java:295) at org.apache.solr.client.solrj.impl.LBHttpSolrClient.request(LBHttpSolrClient.java:213) at org.apache.solr.client.solrj.impl.CloudSolrClient.sendRequest(CloudSolrClient.java:1107) at org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:884) at org.apache.solr.client.solrj.impl.CloudSolrClient.request(CloudSolrClient.java:817) at org.apache.solr.client.solrj.SolrRequest.process(SolrRequest.java:196) at org.apache.solr.client.solrj.SolrRequest.process(SolrRequest.java:213) at org.apache.solr.cloud.MissingSegmentRecoveryTest.teardown(MissingSegmentRecoveryTest.java:83) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1742) at 
com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:993) at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57) at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45) at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48) at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64) at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368) at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817) at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468) at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:944) at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:830) at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:880) at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:891) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41) at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53) at
[JENKINS] Lucene-Solr-SmokeRelease-7.6 - Build # 16 - Still Failing
Build: https://builds.apache.org/job/Lucene-Solr-SmokeRelease-7.6/16/ No tests ran. Build Log: [...truncated 23437 lines...] [asciidoctor:convert] asciidoctor: ERROR: about-this-guide.adoc: line 1: invalid part, must have at least one section (e.g., chapter, appendix, etc.) [asciidoctor:convert] asciidoctor: ERROR: solr-glossary.adoc: line 1: invalid part, must have at least one section (e.g., chapter, appendix, etc.) [java] Processed 2443 links (1994 relative) to 3203 anchors in 247 files [echo] Validated Links & Anchors via: /x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-7.6/solr/build/solr-ref-guide/bare-bones-html/ -dist-changes: [copy] Copying 4 files to /x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-7.6/solr/package/changes package: -unpack-solr-tgz: -ensure-solr-tgz-exists: [mkdir] Created dir: /x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-7.6/solr/build/solr.tgz.unpacked [untar] Expanding: /x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-7.6/solr/package/solr-7.6.0.tgz into /x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-7.6/solr/build/solr.tgz.unpacked generate-maven-artifacts: resolve: resolve: ivy-availability-check: [loadresource] Do not set property disallowed.ivy.jars.list as its length is 0. -ivy-fail-disallowed-ivy-version: ivy-fail: ivy-configure: [ivy:configure] :: loading settings :: file = /x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-7.6/lucene/top-level-ivy-settings.xml resolve: ivy-availability-check: [loadresource] Do not set property disallowed.ivy.jars.list as its length is 0. -ivy-fail-disallowed-ivy-version: ivy-fail: ivy-configure: [ivy:configure] :: loading settings :: file = /x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-7.6/lucene/top-level-ivy-settings.xml resolve: resolve: ivy-availability-check: [loadresource] Do not set property disallowed.ivy.jars.list as its length is 0. 
-ivy-fail-disallowed-ivy-version: ivy-fail: ivy-configure: [ivy:configure] :: loading settings :: file = /x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-7.6/lucene/top-level-ivy-settings.xml resolve: ivy-availability-check: [loadresource] Do not set property disallowed.ivy.jars.list as its length is 0. -ivy-fail-disallowed-ivy-version: ivy-fail: ivy-configure: [ivy:configure] :: loading settings :: file = /x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-7.6/lucene/top-level-ivy-settings.xml resolve: ivy-availability-check: [loadresource] Do not set property disallowed.ivy.jars.list as its length is 0. -ivy-fail-disallowed-ivy-version: ivy-fail: ivy-configure: [ivy:configure] :: loading settings :: file = /x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-7.6/lucene/top-level-ivy-settings.xml resolve: ivy-availability-check: [loadresource] Do not set property disallowed.ivy.jars.list as its length is 0. -ivy-fail-disallowed-ivy-version: ivy-fail: ivy-configure: [ivy:configure] :: loading settings :: file = /x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-7.6/lucene/top-level-ivy-settings.xml resolve: ivy-availability-check: [loadresource] Do not set property disallowed.ivy.jars.list as its length is 0. -ivy-fail-disallowed-ivy-version: ivy-fail: ivy-configure: [ivy:configure] :: loading settings :: file = /x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-7.6/lucene/top-level-ivy-settings.xml resolve: ivy-availability-check: [loadresource] Do not set property disallowed.ivy.jars.list as its length is 0. -ivy-fail-disallowed-ivy-version: ivy-fail: ivy-configure: [ivy:configure] :: loading settings :: file = /x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-7.6/lucene/top-level-ivy-settings.xml resolve: ivy-availability-check: [loadresource] Do not set property disallowed.ivy.jars.list as its length is 0. 
-ivy-fail-disallowed-ivy-version: ivy-fail: ivy-configure: [ivy:configure] :: loading settings :: file = /x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-7.6/lucene/top-level-ivy-settings.xml resolve: ivy-availability-check: [loadresource] Do not set property disallowed.ivy.jars.list as its length is 0. -ivy-fail-disallowed-ivy-version: ivy-fail: ivy-configure: [ivy:configure] :: loading settings :: file = /x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-7.6/lucene/top-level-ivy-settings.xml resolve: ivy-availability-check: [loadresource] Do not set property disallowed.ivy.jars.list as its length is 0. -ivy-fail-disallowed-ivy-version: ivy-fail: ivy-configure: [ivy:configure] :: loading settings :: file = /x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-7.6/lucene/top-level-ivy-settings.xml resolve: ivy-availability-check: [loadresource] Do not set property disallowed.ivy.jars.list as its length is 0. -ivy-fail-disallowed-ivy-version: ivy-fail: ivy-configure: [ivy:configure] :: loading settings ::
[JENKINS] Lucene-Solr-master-Linux (64bit/jdk-10.0.1) - Build # 23277 - Unstable!
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-master-Linux/23277/ Java: 64bit/jdk-10.0.1 -XX:-UseCompressedOops -XX:+UseConcMarkSweepGC 124 tests failed. FAILED: junit.framework.TestSuite.org.apache.solr.analysis.TestFoldingMultitermExtrasQuery Error Message: 2 threads leaked from SUITE scope at org.apache.solr.analysis.TestFoldingMultitermExtrasQuery: 1) Thread[id=21, name=MetricsHistoryHandler-8-thread-1, state=TIMED_WAITING, group=TGRP-TestFoldingMultitermExtrasQuery] at java.base@10.0.1/jdk.internal.misc.Unsafe.park(Native Method) at java.base@10.0.1/java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:234) at java.base@10.0.1/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2117) at java.base@10.0.1/java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:1182) at java.base@10.0.1/java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:899) at java.base@10.0.1/java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1061) at java.base@10.0.1/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1121) at java.base@10.0.1/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) at java.base@10.0.1/java.lang.Thread.run(Thread.java:844)2) Thread[id=20, name=SolrRrdBackendFactory-7-thread-1, state=TIMED_WAITING, group=TGRP-TestFoldingMultitermExtrasQuery] at java.base@10.0.1/jdk.internal.misc.Unsafe.park(Native Method) at java.base@10.0.1/java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:234) at java.base@10.0.1/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2117) at java.base@10.0.1/java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:1182) at 
java.base@10.0.1/java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:899) at java.base@10.0.1/java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1061) at java.base@10.0.1/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1121) at java.base@10.0.1/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) at java.base@10.0.1/java.lang.Thread.run(Thread.java:844) Stack Trace: com.carrotsearch.randomizedtesting.ThreadLeakError: 2 threads leaked from SUITE scope at org.apache.solr.analysis.TestFoldingMultitermExtrasQuery: 1) Thread[id=21, name=MetricsHistoryHandler-8-thread-1, state=TIMED_WAITING, group=TGRP-TestFoldingMultitermExtrasQuery] at java.base@10.0.1/jdk.internal.misc.Unsafe.park(Native Method) at java.base@10.0.1/java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:234) at java.base@10.0.1/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2117) at java.base@10.0.1/java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:1182) at java.base@10.0.1/java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:899) at java.base@10.0.1/java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1061) at java.base@10.0.1/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1121) at java.base@10.0.1/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) at java.base@10.0.1/java.lang.Thread.run(Thread.java:844) 2) Thread[id=20, name=SolrRrdBackendFactory-7-thread-1, state=TIMED_WAITING, group=TGRP-TestFoldingMultitermExtrasQuery] at java.base@10.0.1/jdk.internal.misc.Unsafe.park(Native Method) at java.base@10.0.1/java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:234) at 
java.base@10.0.1/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2117) at java.base@10.0.1/java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:1182) at java.base@10.0.1/java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:899) at java.base@10.0.1/java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1061) at java.base@10.0.1/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1121) at java.base@10.0.1/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) at
[jira] [Updated] (SOLR-13024) ValueSourceAugmenter - avoid creating new FunctionValues per doc
[ https://issues.apache.org/jira/browse/SOLR-13024?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Yonik Seeley updated SOLR-13024: Summary: ValueSourceAugmenter - avoid creating new FunctionValues per doc (was: ValueSourceAugmenter ) > ValueSourceAugmenter - avoid creating new FunctionValues per doc > - > > Key: SOLR-13024 > URL: https://issues.apache.org/jira/browse/SOLR-13024 > Project: Solr > Issue Type: Improvement > Security Level: Public(Default Security Level. Issues are Public) > Components: search >Affects Versions: 7.0 >Reporter: Yonik Seeley >Priority: Major > > The cutover to iterators in LUCENE-7407 caused ValueSourceAugmenter (which > handles functions in the "fl" param alongside other fields) to re-retrieve > FunctionValues for every document. > Caching could cut that in half, but we should really retrieve a window at a > time for best performance.
[jira] [Commented] (LUCENE-8578) Can I do a lot of analysis on one field at the time of indexing?
[ https://issues.apache.org/jira/browse/LUCENE-8578?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16704065#comment-16704065 ] YOO JEONGIN commented on LUCENE-8578: - [~jpountz], thank you. I will ask through the channel you provided. Thank you. > Can I do a lot of analysis on one field at the time of indexing? > > > Key: LUCENE-8578 > URL: https://issues.apache.org/jira/browse/LUCENE-8578 > Project: Lucene - Core > Issue Type: Improvement >Reporter: YOO JEONGIN >Priority: Major > > Hello, > I have a question about index schemas. > 1) Can I run multiple analyses on one field? > For example, analyze the 'title' field with multiple tokenizers and > merge the results into a single field. > 2) Multiple fields can be collected into one field using the 'copyField' function. > However, those fields have different data attributes (e.g., category fields, > text fields, etc.), > and I would like to analyze each field differently. > Are these features available in version 7.5? Is there a shortcut for > doing something similar? > Thank you for your advice.
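For reference, the usual Solr-level answer to question (1) is to index the same source text into several differently-analyzed fields via copyField, since each target field applies its own analysis chain. The managed-schema fragment below is a minimal sketch under assumed field and field-type names (`title_exact`, `text_ngram`, etc. are illustrative, not from this thread):

```xml
<!-- One stored source field, plus two extra indexed "views" of the same text. -->
<field name="title"       type="text_general" indexed="true" stored="true"/>
<field name="title_exact" type="string"       indexed="true" stored="false"/>
<field name="title_ngram" type="text_ngram"   indexed="true" stored="false"/>

<!-- copyField duplicates the raw input before analysis; each destination
     field then runs its own tokenizer/filter chain at index time. -->
<copyField source="title" dest="title_exact"/>
<copyField source="title" dest="title_ngram"/>
```

Merging several analyses back into a *single* field (the literal reading of question 1) is not covered by copyField; that typically needs a custom analyzer or multiple query-time fields combined with `qf`.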
[jira] [Commented] (SOLR-13024) ValueSourceAugmenter
[ https://issues.apache.org/jira/browse/SOLR-13024?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16704054#comment-16704054 ] Yonik Seeley commented on SOLR-13024: - The change from LUCENE-7407:
{code}
git show f7aa200d40 ./solr/core/src/java/org/apache/solr/response/transform/ValueSourceAugmenter.java

commit f7aa200d406dbd05a35d6116198302d90b92cb29
Author: Mike McCandless
Date: Wed Sep 21 09:41:41 2016 -0400

    LUCENE-7407: switch doc values usage to an iterator API, based on DocIdSetIterator, instead of random access, freeing codecs for future

diff --git a/solr/core/src/java/org/apache/solr/response/transform/ValueSourceAugmenter.java b/solr/core/src/java/org/apache/solr/response
index 9edf826e2c..c37dd80bfb 100644
--- a/solr/core/src/java/org/apache/solr/response/transform/ValueSourceAugmenter.java
+++ b/solr/core/src/java/org/apache/solr/response/transform/ValueSourceAugmenter.java
@@ -65,7 +65,6 @@ public class ValueSourceAugmenter extends DocTransformer
     try {
       searcher = context.getSearcher();
       readerContexts = searcher.getIndexReader().leaves();
-      docValuesArr = new FunctionValues[readerContexts.size()];
       fcontext = ValueSource.newContext(searcher);
       this.valueSource.createWeight(fcontext, searcher);
     } catch (IOException e) {
@@ -76,7 +75,6 @@ public class ValueSourceAugmenter extends DocTransformer
   Map fcontext;
   SolrIndexSearcher searcher;
   List readerContexts;
-  FunctionValues docValuesArr[];

   @Override
   public void transform(SolrDocument doc, int docid, float score) {
@@ -87,11 +85,7 @@ public class ValueSourceAugmenter extends DocTransformer
     // TODO: calculate this stuff just once across diff functions
     int idx = ReaderUtil.subIndex(docid, readerContexts);
     LeafReaderContext rcontext = readerContexts.get(idx);
-    FunctionValues values = docValuesArr[idx];
-    if (values == null) {
-      docValuesArr[idx] = values = valueSource.getValues(fcontext, rcontext);
-    }
-
+    FunctionValues values = valueSource.getValues(fcontext, rcontext);
     int localId = docid - rcontext.docBase;
     setValue(doc,values.objectVal(localId));
{code}
> ValueSourceAugmenter > - > > Key: SOLR-13024 > URL: https://issues.apache.org/jira/browse/SOLR-13024 > Project: Solr > Issue Type: Improvement > Security Level: Public(Default Security Level. Issues are Public) > Components: search >Affects Versions: 7.0 >Reporter: Yonik Seeley >Priority: Major > > The cutover to iterators in LUCENE-7407 caused ValueSourceAugmenter (which > handles functions in the "fl" param alongside other fields) to re-retrieve > FunctionValues for every document. > Caching could cut that in half, but we should really retrieve a window at a > time for best performance.
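The caching that this diff removed follows a simple pattern: create the expensive per-segment values object lazily, once per leaf, instead of once per document. The standalone sketch below shows that pattern with plain Java in place of Lucene's `ValueSource.getValues`; `PerLeafCache` and its `IntFunction` factory are illustrative stand-ins, not the actual Solr classes or the eventual patch.

```java
import java.util.function.IntFunction;

// Generic illustration of per-leaf lazy caching: the factory (standing in
// for valueSource.getValues(fcontext, rcontext)) runs at most once per
// leaf, no matter how many documents in that leaf are transformed.
public class PerLeafCache<V> {
    private final Object[] cache;           // one slot per leaf/segment
    private final IntFunction<V> factory;   // expensive per-leaf construction
    private int creations = 0;              // demo only: counts factory calls

    public PerLeafCache(int leafCount, IntFunction<V> factory) {
        this.cache = new Object[leafCount];
        this.factory = factory;
    }

    @SuppressWarnings("unchecked")
    public V forLeaf(int leafIdx) {
        V values = (V) cache[leafIdx];
        if (values == null) {               // first document seen in this leaf
            values = factory.apply(leafIdx);
            cache[leafIdx] = values;
            creations++;
        }
        return values;
    }

    public int creations() { return creations; }

    public static void main(String[] args) {
        PerLeafCache<String> cache = new PerLeafCache<>(2, i -> "values-for-leaf-" + i);
        for (int doc = 0; doc < 1000; doc++) {
            cache.forLeaf(doc % 2);         // 1000 per-doc lookups over 2 leaves
        }
        System.out.println(cache.creations());  // prints 2
    }
}
```

The issue summary goes further than this: besides caching, it suggests fetching values a window of documents at a time, which this per-leaf cache alone does not address.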
[jira] [Created] (SOLR-13024) ValueSourceAugmenter
Yonik Seeley created SOLR-13024:
---

Summary: ValueSourceAugmenter
Key: SOLR-13024
URL: https://issues.apache.org/jira/browse/SOLR-13024
Project: Solr
Issue Type: Improvement
Security Level: Public (Default Security Level. Issues are Public)
Components: search
Affects Versions: 7.0
Reporter: Yonik Seeley

The cutover to iterators in LUCENE-7407 caused ValueSourceAugmenter (which handles functions in the "fl" param alongside other fields) to start re-retrieving FunctionValues for every document. Caching could cut that in half, but we should really retrieve a window at a time for best performance.
[JENKINS-EA] Lucene-Solr-master-Windows (64bit/jdk-12-ea+12) - Build # 7640 - Still Unstable!
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-master-Windows/7640/ Java: 64bit/jdk-12-ea+12 -XX:+UseCompressedOops -XX:+UseParallelGC 92 tests failed. FAILED: junit.framework.TestSuite.org.apache.solr.analysis.TestFoldingMultitermExtrasQuery Error Message: 2 threads leaked from SUITE scope at org.apache.solr.analysis.TestFoldingMultitermExtrasQuery: 1) Thread[id=62, name=SolrRrdBackendFactory-7-thread-1, state=TIMED_WAITING, group=TGRP-TestFoldingMultitermExtrasQuery] at java.base@12-ea/jdk.internal.misc.Unsafe.park(Native Method) at java.base@12-ea/java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:234) at java.base@12-ea/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2123) at java.base@12-ea/java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:1182) at java.base@12-ea/java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:899) at java.base@12-ea/java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1054) at java.base@12-ea/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1114) at java.base@12-ea/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base@12-ea/java.lang.Thread.run(Thread.java:835)2) Thread[id=63, name=MetricsHistoryHandler-8-thread-1, state=TIMED_WAITING, group=TGRP-TestFoldingMultitermExtrasQuery] at java.base@12-ea/jdk.internal.misc.Unsafe.park(Native Method) at java.base@12-ea/java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:234) at java.base@12-ea/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2123) at java.base@12-ea/java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:1182) at 
java.base@12-ea/java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:899) at java.base@12-ea/java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1054) at java.base@12-ea/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1114) at java.base@12-ea/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base@12-ea/java.lang.Thread.run(Thread.java:835) Stack Trace: com.carrotsearch.randomizedtesting.ThreadLeakError: 2 threads leaked from SUITE scope at org.apache.solr.analysis.TestFoldingMultitermExtrasQuery: 1) Thread[id=62, name=SolrRrdBackendFactory-7-thread-1, state=TIMED_WAITING, group=TGRP-TestFoldingMultitermExtrasQuery] at java.base@12-ea/jdk.internal.misc.Unsafe.park(Native Method) at java.base@12-ea/java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:234) at java.base@12-ea/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2123) at java.base@12-ea/java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:1182) at java.base@12-ea/java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:899) at java.base@12-ea/java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1054) at java.base@12-ea/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1114) at java.base@12-ea/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base@12-ea/java.lang.Thread.run(Thread.java:835) 2) Thread[id=63, name=MetricsHistoryHandler-8-thread-1, state=TIMED_WAITING, group=TGRP-TestFoldingMultitermExtrasQuery] at java.base@12-ea/jdk.internal.misc.Unsafe.park(Native Method) at java.base@12-ea/java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:234) at 
java.base@12-ea/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2123) at java.base@12-ea/java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:1182) at java.base@12-ea/java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:899) at java.base@12-ea/java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1054) at java.base@12-ea/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1114) at java.base@12-ea/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base@12-ea/java.lang.Thread.run(Thread.java:835) at
[JENKINS] Lucene-Solr-7.x-Linux (64bit/jdk-10.0.1) - Build # 3158 - Unstable!
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-7.x-Linux/3158/ Java: 64bit/jdk-10.0.1 -XX:+UseCompressedOops -XX:+UseSerialGC 6 tests failed. FAILED: org.apache.solr.cloud.TestTlogReplica.testRecovery Error Message: Can not find doc 7 in https://127.0.0.1:44749/solr Stack Trace: java.lang.AssertionError: Can not find doc 7 in https://127.0.0.1:44749/solr at __randomizedtesting.SeedInfo.seed([D1957690C8BE1242:10650F3CE5EED8E5]:0) at org.junit.Assert.fail(Assert.java:93) at org.junit.Assert.assertTrue(Assert.java:43) at org.junit.Assert.assertNotNull(Assert.java:526) at org.apache.solr.cloud.TestTlogReplica.checkRTG(TestTlogReplica.java:902) at org.apache.solr.cloud.TestTlogReplica.testRecovery(TestTlogReplica.java:567) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:564) at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1742) at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:935) at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:971) at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:985) at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57) at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45) at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48) at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64) at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368) at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817) at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468) at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:944) at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:830) at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:880) at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:891) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53) at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47) at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64) at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368) at java.base/java.lang.Thread.run(Thread.java:844) FAILED: org.apache.solr.cloud.TestTlogReplica.testRecovery Error Message: Can not find doc 7 in https://127.0.0.1:43869/solr Stack Trace:
[JENKINS] Lucene-Solr-Tests-http2 - Build # 3 - Still Failing
Build: https://builds.apache.org/job/Lucene-Solr-Tests-http2/3/ 1 tests failed. FAILED: org.apache.solr.client.solrj.TestLBHttp2SolrClient.testReliability Error Message: Idle timeout expired: 504/500 ms Stack Trace: org.apache.solr.client.solrj.SolrServerException: Idle timeout expired: 504/500 ms at __randomizedtesting.SeedInfo.seed([887C13E6EF6067C0:49B4CEA04E06B669]:0) at org.apache.solr.client.solrj.impl.Http2SolrClient.request(Http2SolrClient.java:401) at org.apache.solr.client.solrj.impl.Http2SolrClient.request(Http2SolrClient.java:718) at org.apache.solr.client.solrj.impl.LBSolrClient.request(LBSolrClient.java:614) at org.apache.solr.client.solrj.impl.LBSolrClient.request(LBSolrClient.java:590) at org.apache.solr.client.solrj.SolrRequest.process(SolrRequest.java:196) at org.apache.solr.client.solrj.SolrClient.query(SolrClient.java:991) at org.apache.solr.client.solrj.SolrClient.query(SolrClient.java:1006) at org.apache.solr.client.solrj.TestLBHttp2SolrClient.testReliability(TestLBHttp2SolrClient.java:221) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1742) at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:935) at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:971) at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:985) at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57) at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49) at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45) at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48) at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64) at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368) at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817) at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468) at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:944) at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:830) at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:880) at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:891) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53) at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47) at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64) at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54) at
[JENKINS] Lucene-Solr-7.x-Solaris (64bit/jdk1.8.0) - Build # 937 - Unstable!
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-7.x-Solaris/937/ Java: 64bit/jdk1.8.0 -XX:+UseCompressedOops -XX:+UseG1GC 1 tests failed. FAILED: org.apache.solr.client.solrj.io.stream.StreamDecoratorTest.testParallelCommitStream Error Message: expected:<5> but was:<3> Stack Trace: java.lang.AssertionError: expected:<5> but was:<3> at __randomizedtesting.SeedInfo.seed([134EEC3B2F4B46DE:33A48E3BB30AAB92]:0) at org.junit.Assert.fail(Assert.java:93) at org.junit.Assert.failNotEquals(Assert.java:647) at org.junit.Assert.assertEquals(Assert.java:128) at org.junit.Assert.assertEquals(Assert.java:472) at org.junit.Assert.assertEquals(Assert.java:456) at org.apache.solr.client.solrj.io.stream.StreamDecoratorTest.testParallelCommitStream(StreamDecoratorTest.java:3037) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1742) at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:935) at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:971) at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:985) at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57) at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45) at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48) at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64) at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368) at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817) at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468) at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:944) at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:830) at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:880) at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:891) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at 
org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53) at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47) at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64) at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368) at java.lang.Thread.run(Thread.java:748) Build Log: [...truncated 16244 lines...] [junit4] Suite:
[jira] [Commented] (LUCENE-8579) Don't run badapples when building a release
[ https://issues.apache.org/jira/browse/LUCENE-8579?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16703909#comment-16703909 ] Uwe Schindler commented on LUCENE-8579:
---
+1

> Don't run badapples when building a release
> ---
>
> Key: LUCENE-8579
> URL: https://issues.apache.org/jira/browse/LUCENE-8579
> Project: Lucene - Core
> Issue Type: Improvement
> Reporter: Adrien Grand
> Priority: Major
>
> Nick pinged me because his attempt to build a release failed because of Solr
> test failures. When looking at those, I noticed that a number of them were
> known flaky tests that are already tagged as bad apples, e.g.
> org.apache.solr.cloud.TestStressInPlaceUpdates.stressTest.
> We should disable badapples in the script used to build a release, i.e. something
> like:
> {code:python}
> diff --git a/dev-tools/scripts/buildAndPushRelease.py b/dev-tools/scripts/buildAndPushRelease.py
> index 5a8f5cc..b557da2 100644
> --- a/dev-tools/scripts/buildAndPushRelease.py
> +++ b/dev-tools/scripts/buildAndPushRelease.py
> @@ -105,7 +105,7 @@ def prepare(root, version, gpgKeyID, gpgPassword):
>    print('  Check DOAP files')
>    checkDOAPfiles(version)
>
> -  print('  ant clean test validate documentation-lint')
> +  print('  ant -Dtests.badapples=false clean test validate documentation-lint')
> -  run('ant clean test validate documentation-lint')
> +  run('ant -Dtests.badapples=false clean test validate documentation-lint')
>
>    open('rev.txt', mode='wb').write(rev.encode('UTF-8'))
> {code}
[JENKINS-EA] Lucene-Solr-7.6-Linux (64bit/jdk-12-ea+12) - Build # 42 - Still Unstable!
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-7.6-Linux/42/ Java: 64bit/jdk-12-ea+12 -XX:-UseCompressedOops -XX:+UseConcMarkSweepGC 23 tests failed. FAILED: org.apache.solr.cloud.TestMiniSolrCloudClusterSSL.testSslWithInvalidPeerName Error Message: Could not find collection:second_collection Stack Trace: java.lang.AssertionError: Could not find collection:second_collection at __randomizedtesting.SeedInfo.seed([FA7F8A9985F91EEC:ADCECF224505E1FD]:0) at org.junit.Assert.fail(Assert.java:93) at org.junit.Assert.assertTrue(Assert.java:43) at org.junit.Assert.assertNotNull(Assert.java:526) at org.apache.solr.cloud.AbstractDistribZkTestBase.waitForRecoveriesToFinish(AbstractDistribZkTestBase.java:155) at org.apache.solr.cloud.TestMiniSolrCloudClusterSSL.checkCreateCollection(TestMiniSolrCloudClusterSSL.java:263) at org.apache.solr.cloud.TestMiniSolrCloudClusterSSL.checkClusterWithCollectionCreations(TestMiniSolrCloudClusterSSL.java:249) at org.apache.solr.cloud.TestMiniSolrCloudClusterSSL.testSslWithInvalidPeerName(TestMiniSolrCloudClusterSSL.java:185) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1742) at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:935) at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:971) at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:985) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57) at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45) at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48) at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64) at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368) at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817) at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468) at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:944) at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:830) at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:880) at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:891) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41) at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53) at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47) at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64) at
[jira] [Commented] (LUCENE-8579) Don't run badapples when building a release
[ https://issues.apache.org/jira/browse/LUCENE-8579?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16703853#comment-16703853 ] Nicholas Knize commented on LUCENE-8579:
---
+1

> Don't run badapples when building a release
> ---
>
> Key: LUCENE-8579
> URL: https://issues.apache.org/jira/browse/LUCENE-8579
> Project: Lucene - Core
> Issue Type: Improvement
> Reporter: Adrien Grand
> Priority: Major
>
> Nick pinged me because his attempt to build a release failed because of Solr
> test failures. When looking at those, I noticed that a number of them were
> known flaky tests that are already tagged as bad apples, e.g.
> org.apache.solr.cloud.TestStressInPlaceUpdates.stressTest.
> We should disable badapples in the script used to build a release, i.e. something
> like:
> {code:python}
> diff --git a/dev-tools/scripts/buildAndPushRelease.py b/dev-tools/scripts/buildAndPushRelease.py
> index 5a8f5cc..b557da2 100644
> --- a/dev-tools/scripts/buildAndPushRelease.py
> +++ b/dev-tools/scripts/buildAndPushRelease.py
> @@ -105,7 +105,7 @@ def prepare(root, version, gpgKeyID, gpgPassword):
>    print('  Check DOAP files')
>    checkDOAPfiles(version)
>
> -  print('  ant clean test validate documentation-lint')
> +  print('  ant -Dtests.badapples=false clean test validate documentation-lint')
> -  run('ant clean test validate documentation-lint')
> +  run('ant -Dtests.badapples=false clean test validate documentation-lint')
>
>    open('rev.txt', mode='wb').write(rev.encode('UTF-8'))
> {code}
[jira] [Commented] (LUCENE-8575) Improve toString() in SegmentInfo
[ https://issues.apache.org/jira/browse/LUCENE-8575?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16703836#comment-16703836 ] Adrien Grand commented on LUCENE-8575:
--
Thanks [~danmuzi], it looks good to me! I'll merge tomorrow.

> Improve toString() in SegmentInfo
> -
>
> Key: LUCENE-8575
> URL: https://issues.apache.org/jira/browse/LUCENE-8575
> Project: Lucene - Core
> Issue Type: Improvement
> Components: core/index
> Reporter: Namgyu Kim
> Priority: Major
> Attachments: LUCENE-8575.patch, LUCENE-8575.patch
>
>
> I saw the following code in the SegmentInfo class.
> {code:java}
> // TODO: we could append toString of attributes() here?
> {code}
> Of course, we can.
>
> So I wrote a code for that part.
> {code:java}
> public String toString(int delCount) {
>   StringBuilder s = new StringBuilder();
>   s.append(name).append('(').append(version == null ? "?" : version).append(')').append(':');
>   char cfs = getUseCompoundFile() ? 'c' : 'C';
>   s.append(cfs);
>   s.append(maxDoc);
>   if (delCount != 0) {
>     s.append('/').append(delCount);
>   }
>   if (indexSort != null) {
>     s.append(":[indexSort=");
>     s.append(indexSort);
>     s.append(']');
>   }
>   // New Code
>   if (!diagnostics.isEmpty()) {
>     s.append(":[diagnostics=");
>     for (Map.Entry<String,String> entry : diagnostics.entrySet())
>       s.append("<").append(entry.getKey()).append(",").append(entry.getValue()).append(">,");
>     s.setLength(s.length() - 1);
>     s.append(']');
>   }
>   // New Code
>   if (!attributes.isEmpty()) {
>     s.append(":[attributes=");
>     for (Map.Entry<String,String> entry : attributes.entrySet())
>       s.append("<").append(entry.getKey()).append(",").append(entry.getValue()).append(">,");
>     s.setLength(s.length() - 1);
>     s.append(']');
>   }
>   return s.toString();
> }
> {code}
[jira] [Commented] (LUCENE-7745) Explore GPU acceleration
[ https://issues.apache.org/jira/browse/LUCENE-7745?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16703835#comment-16703835 ] Adrien Grand commented on LUCENE-7745:
--
bq. How critical is the inverted index to the user experience?
Completely: almost all queries run on the inverted index. Unlike other datastores that run queries via linear scans and let you speed things up by building indices, Lucene only supports querying via an index.

bq. What happens if the inverted index is sped up?
Then most queries get a speed-up too.

bq. How many AWS instances would usually be used for searching through a ~140GB inverted index
Hard to tell; it depends on your search load, how expensive queries are, etc.

> Explore GPU acceleration
> -
>
> Key: LUCENE-7745
> URL: https://issues.apache.org/jira/browse/LUCENE-7745
> Project: Lucene - Core
> Issue Type: Improvement
> Reporter: Ishan Chattopadhyaya
> Assignee: Ishan Chattopadhyaya
> Priority: Major
> Labels: gsoc2017, mentor
> Attachments: TermDisjunctionQuery.java, gpu-benchmarks.png
>
> There are parts of Lucene that can potentially be sped up if computations
> were to be offloaded from the CPU to the GPU(s). With commodity GPUs having as
> much as 12GB of high-bandwidth RAM, we might be able to leverage GPUs to
> speed up parts of Lucene (indexing, search).
> First that comes to mind is spatial filtering, which is traditionally known
> to be a good candidate for GPU-based speedup (esp. when complex polygons are
> involved). In the past, Mike McCandless has mentioned that "both initial
> indexing and merging are CPU/IO intensive, but they are very amenable to
> soaking up the hardware's concurrency."
> I'm opening this issue as an exploratory task, suitable for a GSoC project. I
> volunteer to mentor any GSoC student willing to work on it this summer.
[JENKINS] Lucene-Solr-NightlyTests-7.6 - Build # 14 - Still Unstable
Build: https://builds.apache.org/job/Lucene-Solr-NightlyTests-7.6/14/ 4 tests failed. FAILED: org.apache.solr.cloud.autoscaling.IndexSizeTriggerTest.testSplitIntegration Error Message: did not finish processing in time Stack Trace: java.lang.AssertionError: did not finish processing in time at __randomizedtesting.SeedInfo.seed([6AD9937FED14CE69:53572A3FC2EB0797]:0) at org.junit.Assert.fail(Assert.java:93) at org.junit.Assert.assertTrue(Assert.java:43) at org.apache.solr.cloud.autoscaling.IndexSizeTriggerTest.testSplitIntegration(IndexSizeTriggerTest.java:320) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1742) at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:935) at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:971) at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:985) at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57) at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45) at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48) at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64) at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368) at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817) at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468) at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:944) at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:830) at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:880) at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:891) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53) at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47) at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64) at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368) at java.lang.Thread.run(Thread.java:748) FAILED: org.apache.solr.cloud.hdfs.HdfsRestartWhileUpdatingTest.test Error Message: There are still nodes recoverying - waited for 320 seconds Stack Trace: java.lang.AssertionError: There are still nodes recoverying - waited for 320 seconds at
[jira] [Commented] (LUCENE-8542) Provide the LeafSlice to CollectorManager.newCollector to save memory on small index slices
[ https://issues.apache.org/jira/browse/LUCENE-8542?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16703818#comment-16703818 ] Adrien Grand commented on LUCENE-8542: -- This doesn't look like a great solution to your problem? If I were you, I'd probably rather fork TopScoreDocCollector to dynamically grow the heap. > Provide the LeafSlice to CollectorManager.newCollector to save memory on > small index slices > --- > > Key: LUCENE-8542 > URL: https://issues.apache.org/jira/browse/LUCENE-8542 > Project: Lucene - Core > Issue Type: Improvement > Components: core/search >Reporter: Christoph Kaser >Priority: Minor > Attachments: LUCENE-8542.patch > > > I have an index consisting of 44 million documents spread across 60 segments. > When I run a query against this index with a huge number of results requested > (e.g. 5 million), this query uses more than 5 GB of heap if the IndexSearcher > was configured to use an ExecutorService. > (I know this kind of query is fairly unusual and it would be better to use > paging and searchAfter, but our architecture does not allow this at the > moment.) > The reason for the huge memory requirement is that the search [will create a > TopScoreDocCollector for each > segment|https://github.com/apache/lucene-solr/blob/master/lucene/core/src/java/org/apache/lucene/search/IndexSearcher.java#L404], > each one with numHits = 5 million. This is fine for the large segments, but > many of those segments are fairly small and only contain several thousand > documents. This wastes a huge amount of memory for queries with large values > of numHits on indices with many segments. 
> Therefore, I propose to change the CollectorManager - interface in the > following way: > * change the method newCollector to accept a parameter LeafSlice that can be > used to determine the total count of documents in the LeafSlice > * Maybe, in order to remain backwards compatible, it would be possible to > introduce this as a new method with a default implementation that calls the > old method - otherwise, it probably has to wait for Lucene 8? > * This can then be used to cap numHits for each TopScoreDocCollector to the > leafslice-size. > If this is something that would make sense for you, I can try to provide a > patch. -- This message was sent by Atlassian JIRA (v7.6.3#76005) - To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org For additional commands, e-mail: dev-h...@lucene.apache.org
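[Editorial note] The capping idea proposed above can be sketched as a tiny self-contained example. All names here are hypothetical stand-ins (not the attached patch, and not Lucene API): a real change would pass `IndexSearcher.LeafSlice` into `CollectorManager.newCollector` so the manager can size each collector by the slice's document count.

```java
// Self-contained sketch of the proposal in LUCENE-8542 (assumed names;
// no Lucene dependency): cap a per-slice collector's numHits at the
// number of documents the slice can possibly match.
public final class SliceCapDemo {

  // Stand-in for IndexSearcher.LeafSlice: the maxDoc of each segment it holds.
  static final class Slice {
    final int[] leafMaxDocs;
    Slice(int... leafMaxDocs) { this.leafMaxDocs = leafMaxDocs; }

    int docCount() {
      int total = 0;
      for (int d : leafMaxDocs) total += d;
      return total;
    }
  }

  // A slice of small segments can match at most docCount() hits, so its
  // collector never needs a priority queue larger than that.
  static int cappedNumHits(int requestedNumHits, Slice slice) {
    return Math.min(requestedNumHits, slice.docCount());
  }

  public static void main(String[] args) {
    Slice small = new Slice(2_000, 3_500);   // two tiny segments
    Slice large = new Slice(10_000_000);     // one big segment
    // Requesting 5 million hits: the small slice allocates room for
    // 5,500 entries instead of 5,000,000; the large slice is unchanged.
    System.out.println(cappedNumHits(5_000_000, small)); // prints 5500
    System.out.println(cappedNumHits(5_000_000, large)); // prints 5000000
  }
}
```

As the issue notes, backwards compatibility could be preserved by adding the slice-aware method with a default implementation that delegates to the existing zero-argument `newCollector`.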
[jira] [Updated] (LUCENE-8579) Don't run badapples when building a release
[ https://issues.apache.org/jira/browse/LUCENE-8579?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Adrien Grand updated LUCENE-8579: - Description: Nick pinged me because his attempt to build a release failed because of Solr test failures. When looking at those, I noticed that a number of them were known flaky test that are already tagged as bad apples, eg. org.apache.solr.cloud.TestStressInPlaceUpdates.stressTest. We should disable badapples in the script to build a release, ie. something like {code:python} diff --git a/dev-tools/scripts/buildAndPushRelease.py b/dev-tools/scripts/buildAndPushRelease.py index 5a8f5cc..b557da2 100644 --- a/dev-tools/scripts/buildAndPushRelease.py +++ b/dev-tools/scripts/buildAndPushRelease.py @@ -105,7 +105,7 @@ def prepare(root, version, gpgKeyID, gpgPassword): print(' Check DOAP files') checkDOAPfiles(version) - print(' ant clean test validate documentation-lint') + print(' ant -Dtests.badapples=false clean test validate documentation-lint') - run('ant clean test validate documentation-lint') + run('ant -Dtests.badapples=false clean test validate documentation-lint') open('rev.txt', mode='wb').write(rev.encode('UTF-8')) {code} was: Nick pinged me because his attempt to build a release failed because of Solr test failures. When looking at those, I noticed that a number of them were known flaky test that are already tagged as bad apples, eg. org.apache.solr.cloud.TestStressInPlaceUpdates.stressTest. We should disable badapples in the script to build a release, ie. 
something like {code:python} diff --git a/dev-tools/scripts/buildAndPushRelease.py b/dev-tools/scripts/buildAndPushRelease.py index 5a8f5cc..b557da2 100644 --- a/dev-tools/scripts/buildAndPushRelease.py +++ b/dev-tools/scripts/buildAndPushRelease.py @@ -105,7 +105,7 @@ def prepare(root, version, gpgKeyID, gpgPassword): print(' Check DOAP files') checkDOAPfiles(version) - print(' ant clean test validate documentation-lint') + print(' ant -Dtests.badapples=false clean test validate documentation-lint') run('ant clean test validate documentation-lint') open('rev.txt', mode='wb').write(rev.encode('UTF-8')) {code} > Don't run badapples when building a release > --- > > Key: LUCENE-8579 > URL: https://issues.apache.org/jira/browse/LUCENE-8579 > Project: Lucene - Core > Issue Type: Improvement >Reporter: Adrien Grand >Priority: Major > > Nick pinged me because his attempt to build a release failed because of Solr > test failures. When looking at those, I noticed that a number of them were > known flaky test that are already tagged as bad apples, eg. > org.apache.solr.cloud.TestStressInPlaceUpdates.stressTest. > We should disable badapples in the script to build a release, ie. 
something > like > {code:python} > diff --git a/dev-tools/scripts/buildAndPushRelease.py > b/dev-tools/scripts/buildAndPushRelease.py > index 5a8f5cc..b557da2 100644 > --- a/dev-tools/scripts/buildAndPushRelease.py > +++ b/dev-tools/scripts/buildAndPushRelease.py > @@ -105,7 +105,7 @@ def prepare(root, version, gpgKeyID, gpgPassword): >print(' Check DOAP files') >checkDOAPfiles(version) > > - print(' ant clean test validate documentation-lint') > + print(' ant -Dtests.badapples=false clean test validate > documentation-lint') > - run('ant clean test validate documentation-lint') > + run('ant -Dtests.badapples=false clean test validate documentation-lint') > >open('rev.txt', mode='wb').write(rev.encode('UTF-8')) > {code} -- This message was sent by Atlassian JIRA (v7.6.3#76005) - To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org For additional commands, e-mail: dev-h...@lucene.apache.org
[jira] [Commented] (LUCENE-8579) Don't run badapples when building a release
[ https://issues.apache.org/jira/browse/LUCENE-8579?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16703794#comment-16703794 ] Jim Ferenczi commented on LUCENE-8579: -- +1 > Don't run badapples when building a release > --- > > Key: LUCENE-8579 > URL: https://issues.apache.org/jira/browse/LUCENE-8579 > Project: Lucene - Core > Issue Type: Improvement >Reporter: Adrien Grand >Priority: Major > > Nick pinged me because his attempt to build a release failed because of Solr > test failures. When looking at those, I noticed that a number of them were > known flaky test that are already tagged as bad apples, eg. > org.apache.solr.cloud.TestStressInPlaceUpdates.stressTest. > We should disable badapples in the script to build a release, ie. something > like > {code:python} > diff --git a/dev-tools/scripts/buildAndPushRelease.py > b/dev-tools/scripts/buildAndPushRelease.py > index 5a8f5cc..b557da2 100644 > --- a/dev-tools/scripts/buildAndPushRelease.py > +++ b/dev-tools/scripts/buildAndPushRelease.py > @@ -105,7 +105,7 @@ def prepare(root, version, gpgKeyID, gpgPassword): >print(' Check DOAP files') >checkDOAPfiles(version) > > - print(' ant clean test validate documentation-lint') > + print(' ant -Dtests.badapples=false clean test validate > documentation-lint') >run('ant clean test validate documentation-lint') > >open('rev.txt', mode='wb').write(rev.encode('UTF-8')) > {code} -- This message was sent by Atlassian JIRA (v7.6.3#76005) - To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org For additional commands, e-mail: dev-h...@lucene.apache.org
[jira] [Created] (LUCENE-8579) Don't run badapples when building a release
Adrien Grand created LUCENE-8579: Summary: Don't run badapples when building a release Key: LUCENE-8579 URL: https://issues.apache.org/jira/browse/LUCENE-8579 Project: Lucene - Core Issue Type: Improvement Reporter: Adrien Grand Nick pinged me because his attempt to build a release failed because of Solr test failures. When looking at those, I noticed that a number of them were known flaky test that are already tagged as bad apples, eg. org.apache.solr.cloud.TestStressInPlaceUpdates.stressTest. We should disable badapples in the script to build a release, ie. something like {code:python} diff --git a/dev-tools/scripts/buildAndPushRelease.py b/dev-tools/scripts/buildAndPushRelease.py index 5a8f5cc..b557da2 100644 --- a/dev-tools/scripts/buildAndPushRelease.py +++ b/dev-tools/scripts/buildAndPushRelease.py @@ -105,7 +105,7 @@ def prepare(root, version, gpgKeyID, gpgPassword): print(' Check DOAP files') checkDOAPfiles(version) - print(' ant clean test validate documentation-lint') + print(' ant -Dtests.badapples=false clean test validate documentation-lint') run('ant clean test validate documentation-lint') open('rev.txt', mode='wb').write(rev.encode('UTF-8')) {code} -- This message was sent by Atlassian JIRA (v7.6.3#76005) - To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org For additional commands, e-mail: dev-h...@lucene.apache.org
[JENKINS] Lucene-Solr-http2-Solaris (64bit/jdk1.8.0) - Build # 1 - Failure!
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-http2-Solaris/1/ Java: 64bit/jdk1.8.0 -XX:+UseCompressedOops -XX:+UseConcMarkSweepGC 1026 tests failed. FAILED: junit.framework.TestSuite.org.apache.solr.analysis.TestFoldingMultitermExtrasQuery Error Message: no conscrypt_openjdk_jni-sunos-x86_64 in java.library.path Stack Trace: java.lang.UnsatisfiedLinkError: no conscrypt_openjdk_jni-sunos-x86_64 in java.library.path at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1867) at java.lang.Runtime.loadLibrary0(Runtime.java:870) at java.lang.System.loadLibrary(System.java:1122) at org.conscrypt.NativeLibraryUtil.loadLibrary(NativeLibraryUtil.java:54) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.conscrypt.NativeLibraryLoader$1.run(NativeLibraryLoader.java:297) at org.conscrypt.NativeLibraryLoader$1.run(NativeLibraryLoader.java:289) at java.security.AccessController.doPrivileged(Native Method) at org.conscrypt.NativeLibraryLoader.loadLibraryFromHelperClassloader(NativeLibraryLoader.java:289) at org.conscrypt.NativeLibraryLoader.loadLibrary(NativeLibraryLoader.java:262) at org.conscrypt.NativeLibraryLoader.load(NativeLibraryLoader.java:162) at org.conscrypt.NativeLibraryLoader.loadFirstAvailable(NativeLibraryLoader.java:106) at org.conscrypt.NativeCryptoJni.init(NativeCryptoJni.java:50) at org.conscrypt.NativeCrypto.(NativeCrypto.java:65) at org.conscrypt.OpenSSLProvider.(OpenSSLProvider.java:60) at org.conscrypt.OpenSSLProvider.(OpenSSLProvider.java:53) at org.conscrypt.OpenSSLProvider.(OpenSSLProvider.java:49) at org.apache.solr.SolrTestCaseJ4.(SolrTestCaseJ4.java:201) at java.lang.Class.forName0(Native Method) at java.lang.Class.forName(Class.java:348) at 
com.carrotsearch.randomizedtesting.RandomizedRunner$2.run(RandomizedRunner.java:621) Suppressed: java.lang.UnsatisfiedLinkError: no conscrypt_openjdk_jni-sunos-x86_64 in java.library.path at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1867) at java.lang.Runtime.loadLibrary0(Runtime.java:870) at java.lang.System.loadLibrary(System.java:1122) at org.conscrypt.NativeLibraryUtil.loadLibrary(NativeLibraryUtil.java:54) at org.conscrypt.NativeLibraryLoader.loadLibraryFromCurrentClassloader(NativeLibraryLoader.java:318) at org.conscrypt.NativeLibraryLoader.loadLibrary(NativeLibraryLoader.java:273) ... 11 more Suppressed: java.lang.UnsatisfiedLinkError: no conscrypt_openjdk_jni in java.library.path ... 24 more Suppressed: java.lang.UnsatisfiedLinkError: no conscrypt_openjdk_jni in java.library.path at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1867) at java.lang.Runtime.loadLibrary0(Runtime.java:870) at java.lang.System.loadLibrary(System.java:1122) at org.conscrypt.NativeLibraryUtil.loadLibrary(NativeLibraryUtil.java:54) at org.conscrypt.NativeLibraryLoader.loadLibraryFromCurrentClassloader(NativeLibraryLoader.java:318) at org.conscrypt.NativeLibraryLoader.loadLibrary(NativeLibraryLoader.java:273) ... 11 more Suppressed: java.lang.UnsatisfiedLinkError: no conscrypt in java.library.path ... 24 more Suppressed: java.lang.UnsatisfiedLinkError: no conscrypt in java.library.path at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1867) at java.lang.Runtime.loadLibrary0(Runtime.java:870) at java.lang.System.loadLibrary(System.java:1122) at org.conscrypt.NativeLibraryUtil.loadLibrary(NativeLibraryUtil.java:54) at org.conscrypt.NativeLibraryLoader.loadLibraryFromCurrentClassloader(NativeLibraryLoader.java:318) at org.conscrypt.NativeLibraryLoader.loadLibrary(NativeLibraryLoader.java:273) ... 
11 more FAILED: junit.framework.TestSuite.org.apache.solr.schema.TestICUCollationField Error Message: no conscrypt_openjdk_jni-sunos-x86_64 in java.library.path Stack Trace: java.lang.UnsatisfiedLinkError: no conscrypt_openjdk_jni-sunos-x86_64 in java.library.path at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1867) at java.lang.Runtime.loadLibrary0(Runtime.java:870) at java.lang.System.loadLibrary(System.java:1122) at org.conscrypt.NativeLibraryUtil.loadLibrary(NativeLibraryUtil.java:54) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[JENKINS] Lucene-Solr-http2-Linux (64bit/jdk-10.0.1) - Build # 12 - Still Failing!
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-http2-Linux/12/ Java: 64bit/jdk-10.0.1 -XX:-UseCompressedOops -XX:+UseG1GC 2 tests failed. FAILED: org.apache.solr.client.solrj.impl.CloudSolrClientTest.testRouting Error Message: Error from server at http://127.0.0.1:33705/solr/collection1_shard2_replica_n3: Expected mime type application/octet-stream but got text/html. Error 404 Can not find: /solr/collection1_shard2_replica_n3/update HTTP ERROR 404 Problem accessing /solr/collection1_shard2_replica_n3/update. Reason: Can not find: /solr/collection1_shard2_replica_n3/update (Powered by Jetty 9.4.14.v20181114) Stack Trace: org.apache.solr.client.solrj.impl.CloudSolrClient$RouteException: Error from server at http://127.0.0.1:33705/solr/collection1_shard2_replica_n3: Expected mime type application/octet-stream but got text/html. Error 404 Can not find: /solr/collection1_shard2_replica_n3/update HTTP ERROR 404 Problem accessing /solr/collection1_shard2_replica_n3/update. 
Reason: Can not find: /solr/collection1_shard2_replica_n3/update (Powered by Jetty 9.4.14.v20181114) at __randomizedtesting.SeedInfo.seed([B53603AB78E14B11:77813FC37BA1BB69]:0) at org.apache.solr.client.solrj.impl.CloudSolrClient.directUpdate(CloudSolrClient.java:551) at org.apache.solr.client.solrj.impl.CloudSolrClient.sendRequest(CloudSolrClient.java:1016) at org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:884) at org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:949) at org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:949) at org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:949) at org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:949) at org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:949) at org.apache.solr.client.solrj.impl.CloudSolrClient.request(CloudSolrClient.java:817) at org.apache.solr.client.solrj.SolrRequest.process(SolrRequest.java:196) at org.apache.solr.client.solrj.request.UpdateRequest.commit(UpdateRequest.java:237) at org.apache.solr.client.solrj.impl.CloudSolrClientTest.testRouting(CloudSolrClientTest.java:269) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:564) at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1742) at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:935) at 
com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:971) at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:985) at org.junit.rules.ExpectedException$ExpectedExceptionStatement.evaluate(ExpectedException.java:110) at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57) at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45) at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48) at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64) at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368) at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817) at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468) at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:944) at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:830) at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:880) at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:891) at
[jira] [Reopened] (SOLR-13023) No clear guidance on how to delete jar from .system collection
[ https://issues.apache.org/jira/browse/SOLR-13023?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Cassandra Targett reopened SOLR-13023: -- I agree the docs should be improved - reopening this issue. It's also a duplicate of SOLR-12436 (linked now) which has the same solution that also didn't make it in to docs. This other issue points out a wrinkle with the default autocommit settings for the .system collection, so the best command would include doing a commit at the same time as the delete (as the comment here does). > No clear guidance on how to delete jar from .system collection > -- > > Key: SOLR-13023 > URL: https://issues.apache.org/jira/browse/SOLR-13023 > Project: Solr > Issue Type: Bug > Security Level: Public(Default Security Level. Issues are Public) > Components: blobstore, documentation >Affects Versions: 7.5 >Reporter: William >Priority: Major > > There is no clear guidance on how to actually delete a JAR that was placed in > the .system collection. I have literally tried every command under the sun > even the ones provided in support. Can someone please tell me how to actually > do a proper delete. All I can seem to do is just add new ones. 
> > {code:java} > curl -X POST -H 'Content-Type: application/json' > 'https://solr.cloud.statcan.ca/solr/.system/' --data-binary '{"delete": > {"id":"jtds131"}}' > curl -X POST -H 'Content-Type: application/json' > 'https://solr.cloud.statcan.ca/solr/ndm_test/update' --data-binary '{ > "delete": { "id":"jtds131/1" } > }' > curl -X DELETE -H 'Content-Type: application/json' > 'https://solr.cloud.statcan.ca/solr/.system/blob/jtds131' {code} > > I just want to delete any of the following: > > {code:java} > curl https://solr.cloud.statcan.ca/solr/.system/blob\?omitHeader\=true > (docker-for-desktop/default) > { > "response":{"numFound":20,"start":0,"docs":[ > { > "id":"jtds131/7", > "md5":"dd0041fbc51b53613db675520bb1521e", > "blobName":"jtds131", > "version":7, > "timestamp":"2018-11-29T15:16:57.079Z", > "size":35}, > { > "id":"jtds131/6", > "md5":"e45bd303ff39f7daa7a57941b1ff0b85", > "blobName":"jtds131", > "version":6, > "timestamp":"2018-11-29T15:13:20.375Z", > "size":34}, > { > "id":"jtds131/5", > "md5":"4f2e52b6058950bf92a61ff64f747091", > "blobName":"jtds131", > "version":5, > "timestamp":"2018-11-29T15:12:17.120Z", > "size":36}, > { > "id":"jtds131/4", > "md5":"36cf4127af583553605f69c728bff016", > "blobName":"jtds131", > "version":4, > "timestamp":"2018-11-29T15:11:22.136Z", > "size":30}, > { > "id":"jtds131/3", > "md5":"3db86deb7b7f6b1fa1b9fa81b690e68c", > "blobName":"jtds131", > "version":3, > "timestamp":"2018-11-29T15:09:38.075Z", > "size":28}, > { > "id":"ojdbc/3", > "md5":"1ce62f6f367c356671791e4b29506196", > "blobName":"ojdbc", > "version":3, > "timestamp":"2018-11-29T14:45:47.213Z", > "size":24}, > { > "id":"jtds131/2", > "md5":"cac877bfcbd4cc5d4adc04497df81ab", > "blobName":"jtds131", > "version":2, > "timestamp":"2018-11-29T15:08:01.273Z", > "size":25}, > { > "id":"ojdbc/2", > "md5":"a65fc757841c108339744156717b403f", > "blobName":"ojdbc", > "version":2, > "timestamp":"2018-11-29T14:33:22.048Z", > "size":24}, > { > "id":"ndm_test/2", > 
"md5":"83c579317606eca7d06418cf83a2c0d0", > "blobName":"ndm_test", > "version":2, > "timestamp":"2018-11-29T15:45:24.267Z", > "size":26}, > { > "id":"jtds131/1", > "md5":"499eb508a21e312d3bb9aec337eb38b7", > "blobName":"jtds131", > "version":1, > "timestamp":"2018-11-28T20:11:44.533Z", > "size":317816}] > }}{code} -- This message was sent by Atlassian JIRA (v7.6.3#76005) - To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org For additional commands, e-mail: dev-h...@lucene.apache.org
[jira] [Updated] (SOLR-13023) No clear guidance on how to delete jar from .system collection
[ https://issues.apache.org/jira/browse/SOLR-13023?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Cassandra Targett updated SOLR-13023: - Component/s: (was: SolrCloud) documentation blobstore > No clear guidance on how to delete jar from .system collection > -- > > Key: SOLR-13023 > URL: https://issues.apache.org/jira/browse/SOLR-13023 > Project: Solr > Issue Type: Bug > Security Level: Public(Default Security Level. Issues are Public) > Components: blobstore, documentation >Affects Versions: 7.5 >Reporter: William >Priority: Major > > There is no clear guidance on how to actually delete a JAR that was placed in > the .system collection. I have literally tried every command under the sun > even the ones provided in support. Can someone please tell me how to actually > do a proper delete. All I can seem to do is just add new ones. > > {code:java} > curl -X POST -H 'Content-Type: application/json' > 'https://solr.cloud.statcan.ca/solr/.system/' --data-binary '{"delete": > {"id":"jtds131"}}' > curl -X POST -H 'Content-Type: application/json' > 'https://solr.cloud.statcan.ca/solr/ndm_test/update' --data-binary '{ > "delete": { "id":"jtds131/1" } > }' > curl -X DELETE -H 'Content-Type: application/json' > 'https://solr.cloud.statcan.ca/solr/.system/blob/jtds131' {code} > > I just want to delete any of the following: > > {code:java} > curl https://solr.cloud.statcan.ca/solr/.system/blob\?omitHeader\=true > (docker-for-desktop/default) > { > "response":{"numFound":20,"start":0,"docs":[ > { > "id":"jtds131/7", > "md5":"dd0041fbc51b53613db675520bb1521e", > "blobName":"jtds131", > "version":7, > "timestamp":"2018-11-29T15:16:57.079Z", > "size":35}, > { > "id":"jtds131/6", > "md5":"e45bd303ff39f7daa7a57941b1ff0b85", > "blobName":"jtds131", > "version":6, > "timestamp":"2018-11-29T15:13:20.375Z", > "size":34}, > { > "id":"jtds131/5", > "md5":"4f2e52b6058950bf92a61ff64f747091", > "blobName":"jtds131", > "version":5, > "timestamp":"2018-11-29T15:12:17.120Z", 
> "size":36}, > { > "id":"jtds131/4", > "md5":"36cf4127af583553605f69c728bff016", > "blobName":"jtds131", > "version":4, > "timestamp":"2018-11-29T15:11:22.136Z", > "size":30}, > { > "id":"jtds131/3", > "md5":"3db86deb7b7f6b1fa1b9fa81b690e68c", > "blobName":"jtds131", > "version":3, > "timestamp":"2018-11-29T15:09:38.075Z", > "size":28}, > { > "id":"ojdbc/3", > "md5":"1ce62f6f367c356671791e4b29506196", > "blobName":"ojdbc", > "version":3, > "timestamp":"2018-11-29T14:45:47.213Z", > "size":24}, > { > "id":"jtds131/2", > "md5":"cac877bfcbd4cc5d4adc04497df81ab", > "blobName":"jtds131", > "version":2, > "timestamp":"2018-11-29T15:08:01.273Z", > "size":25}, > { > "id":"ojdbc/2", > "md5":"a65fc757841c108339744156717b403f", > "blobName":"ojdbc", > "version":2, > "timestamp":"2018-11-29T14:33:22.048Z", > "size":24}, > { > "id":"ndm_test/2", > "md5":"83c579317606eca7d06418cf83a2c0d0", > "blobName":"ndm_test", > "version":2, > "timestamp":"2018-11-29T15:45:24.267Z", > "size":26}, > { > "id":"jtds131/1", > "md5":"499eb508a21e312d3bb9aec337eb38b7", > "blobName":"jtds131", > "version":1, > "timestamp":"2018-11-28T20:11:44.533Z", > "size":317816}] > }}{code} -- This message was sent by Atlassian JIRA (v7.6.3#76005) - To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org For additional commands, e-mail: dev-h...@lucene.apache.org
[jira] [Commented] (SOLR-13023) No clear guidance on how to delete jar from .system collection
[ https://issues.apache.org/jira/browse/SOLR-13023?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16703675#comment-16703675 ] Gus Heck commented on SOLR-13023: - Maybe don't close this so quick. Possibly the docs should be more explicit? > No clear guidance on how to delete jar from .system collection > -- > > Key: SOLR-13023 > URL: https://issues.apache.org/jira/browse/SOLR-13023 > Project: Solr > Issue Type: Bug > Security Level: Public(Default Security Level. Issues are Public) > Components: SolrCloud >Affects Versions: 7.5 >Reporter: William >Priority: Major > > There is no clear guidance on how to actually delete a JAR that was placed in > the .system collection. I have literally tried every command under the sun > even the ones provided in support. Can someone please tell me how to actually > do a proper delete. All I can seem to do is just add new ones. > > {code:java} > curl -X POST -H 'Content-Type: application/json' > 'https://solr.cloud.statcan.ca/solr/.system/' --data-binary '{"delete": > {"id":"jtds131"}}' > curl -X POST -H 'Content-Type: application/json' > 'https://solr.cloud.statcan.ca/solr/ndm_test/update' --data-binary '{ > "delete": { "id":"jtds131/1" } > }' > curl -X DELETE -H 'Content-Type: application/json' > 'https://solr.cloud.statcan.ca/solr/.system/blob/jtds131' {code} > > I just want to delete any of the following: > > {code:java} > curl https://solr.cloud.statcan.ca/solr/.system/blob\?omitHeader\=true > (docker-for-desktop/default) > { > "response":{"numFound":20,"start":0,"docs":[ > { > "id":"jtds131/7", > "md5":"dd0041fbc51b53613db675520bb1521e", > "blobName":"jtds131", > "version":7, > "timestamp":"2018-11-29T15:16:57.079Z", > "size":35}, > { > "id":"jtds131/6", > "md5":"e45bd303ff39f7daa7a57941b1ff0b85", > "blobName":"jtds131", > "version":6, > "timestamp":"2018-11-29T15:13:20.375Z", > "size":34}, > { > "id":"jtds131/5", > "md5":"4f2e52b6058950bf92a61ff64f747091", > "blobName":"jtds131", > "version":5, > 
"timestamp":"2018-11-29T15:12:17.120Z", > "size":36}, > { > "id":"jtds131/4", > "md5":"36cf4127af583553605f69c728bff016", > "blobName":"jtds131", > "version":4, > "timestamp":"2018-11-29T15:11:22.136Z", > "size":30}, > { > "id":"jtds131/3", > "md5":"3db86deb7b7f6b1fa1b9fa81b690e68c", > "blobName":"jtds131", > "version":3, > "timestamp":"2018-11-29T15:09:38.075Z", > "size":28}, > { > "id":"ojdbc/3", > "md5":"1ce62f6f367c356671791e4b29506196", > "blobName":"ojdbc", > "version":3, > "timestamp":"2018-11-29T14:45:47.213Z", > "size":24}, > { > "id":"jtds131/2", > "md5":"cac877bfcbd4cc5d4adc04497df81ab", > "blobName":"jtds131", > "version":2, > "timestamp":"2018-11-29T15:08:01.273Z", > "size":25}, > { > "id":"ojdbc/2", > "md5":"a65fc757841c108339744156717b403f", > "blobName":"ojdbc", > "version":2, > "timestamp":"2018-11-29T14:33:22.048Z", > "size":24}, > { > "id":"ndm_test/2", > "md5":"83c579317606eca7d06418cf83a2c0d0", > "blobName":"ndm_test", > "version":2, > "timestamp":"2018-11-29T15:45:24.267Z", > "size":26}, > { > "id":"jtds131/1", > "md5":"499eb508a21e312d3bb9aec337eb38b7", > "blobName":"jtds131", > "version":1, > "timestamp":"2018-11-28T20:11:44.533Z", > "size":317816}] > }}{code} -- This message was sent by Atlassian JIRA (v7.6.3#76005) - To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org For additional commands, e-mail: dev-h...@lucene.apache.org
[jira] [Resolved] (SOLR-13023) No clear guidance on how to delete jar from .system collection
[ https://issues.apache.org/jira/browse/SOLR-13023?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] William resolved SOLR-13023. Resolution: Fixed > No clear guidance on how to delete jar from .system collection > -- > > Key: SOLR-13023 > URL: https://issues.apache.org/jira/browse/SOLR-13023 > Project: Solr > Issue Type: Bug > Security Level: Public(Default Security Level. Issues are Public) > Components: SolrCloud >Affects Versions: 7.5 >Reporter: William >Priority: Major > > There is no clear guidance on how to actually delete a JAR that was placed in > the .system collection. I have literally tried every command under the sun > even the ones provided in support. Can someone please tell me how to actually > do a proper delete. All I can seem to do is just add new ones. > > {code:java} > curl -X POST -H 'Content-Type: application/json' > 'https://solr.cloud.statcan.ca/solr/.system/' --data-binary '{"delete": > {"id":"jtds131"}}' > curl -X POST -H 'Content-Type: application/json' > 'https://solr.cloud.statcan.ca/solr/ndm_test/update' --data-binary '{ > "delete": { "id":"jtds131/1" } > }' > curl -X DELETE -H 'Content-Type: application/json' > 'https://solr.cloud.statcan.ca/solr/.system/blob/jtds131' {code} > > I just want to delete any of the following: > > {code:java} > curl https://solr.cloud.statcan.ca/solr/.system/blob\?omitHeader\=true > (docker-for-desktop/default) > { > "response":{"numFound":20,"start":0,"docs":[ > { > "id":"jtds131/7", > "md5":"dd0041fbc51b53613db675520bb1521e", > "blobName":"jtds131", > "version":7, > "timestamp":"2018-11-29T15:16:57.079Z", > "size":35}, > { > "id":"jtds131/6", > "md5":"e45bd303ff39f7daa7a57941b1ff0b85", > "blobName":"jtds131", > "version":6, > "timestamp":"2018-11-29T15:13:20.375Z", > "size":34}, > { > "id":"jtds131/5", > "md5":"4f2e52b6058950bf92a61ff64f747091", > "blobName":"jtds131", > "version":5, > "timestamp":"2018-11-29T15:12:17.120Z", > "size":36}, > { > "id":"jtds131/4", > 
"md5":"36cf4127af583553605f69c728bff016", > "blobName":"jtds131", > "version":4, > "timestamp":"2018-11-29T15:11:22.136Z", > "size":30}, > { > "id":"jtds131/3", > "md5":"3db86deb7b7f6b1fa1b9fa81b690e68c", > "blobName":"jtds131", > "version":3, > "timestamp":"2018-11-29T15:09:38.075Z", > "size":28}, > { > "id":"ojdbc/3", > "md5":"1ce62f6f367c356671791e4b29506196", > "blobName":"ojdbc", > "version":3, > "timestamp":"2018-11-29T14:45:47.213Z", > "size":24}, > { > "id":"jtds131/2", > "md5":"cac877bfcbd4cc5d4adc04497df81ab", > "blobName":"jtds131", > "version":2, > "timestamp":"2018-11-29T15:08:01.273Z", > "size":25}, > { > "id":"ojdbc/2", > "md5":"a65fc757841c108339744156717b403f", > "blobName":"ojdbc", > "version":2, > "timestamp":"2018-11-29T14:33:22.048Z", > "size":24}, > { > "id":"ndm_test/2", > "md5":"83c579317606eca7d06418cf83a2c0d0", > "blobName":"ndm_test", > "version":2, > "timestamp":"2018-11-29T15:45:24.267Z", > "size":26}, > { > "id":"jtds131/1", > "md5":"499eb508a21e312d3bb9aec337eb38b7", > "blobName":"jtds131", > "version":1, > "timestamp":"2018-11-28T20:11:44.533Z", > "size":317816}] > }}{code} -- This message was sent by Atlassian JIRA (v7.6.3#76005) - To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org For additional commands, e-mail: dev-h...@lucene.apache.org
[jira] [Commented] (SOLR-13023) No clear guidance on how to delete jar from .system collection
[ https://issues.apache.org/jira/browse/SOLR-13023?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16703664#comment-16703664 ] William commented on SOLR-13023: This did it: {code} curl 'https://solr.cloud.statcan.ca/solr/.system/update?commit=true' -H "Content-Type: application/json" -d '{"delete": { "query":"id:jtds*" }}' {code} > No clear guidance on how to delete jar from .system collection > -- > > Key: SOLR-13023 > URL: https://issues.apache.org/jira/browse/SOLR-13023
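The working command above can be generalized: every version of a blob shares the "name/version" id prefix, so a single delete-by-query with an immediate commit removes them all. A minimal sketch, assuming a Solr instance at localhost:8983 (the host and the `SOLR_URL`/`delete_blob_versions` names are illustrative, not from the original report, which used its own cluster URL):

```shell
# Base URL of the Solr instance; override via the environment if needed.
SOLR_URL="${SOLR_URL:-http://localhost:8983/solr}"
BLOB_NAME="jtds131"

# All versions of a blob are stored as documents with ids like
# "jtds131/1", "jtds131/2", ..., so a wildcard on the id matches
# every version at once.
DELETE_PAYLOAD="{\"delete\":{\"query\":\"id:${BLOB_NAME}*\"}}"

# Send the delete to the .system collection's update handler and
# commit in the same request so the deletion is visible immediately.
delete_blob_versions() {
  curl -s "${SOLR_URL}/.system/update?commit=true" \
       -H 'Content-Type: application/json' \
       -d "${DELETE_PAYLOAD}"
}
```

After calling `delete_blob_versions`, re-running the blob listing (`curl "${SOLR_URL}/.system/blob?omitHeader=true"`) should show the versions gone.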
[JENKINS] Lucene-Solr-http2-MacOSX (64bit/jdk1.8.0) - Build # 3 - Still Failing!
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-http2-MacOSX/3/ Java: 64bit/jdk1.8.0 -XX:+UseCompressedOops -XX:+UseG1GC 1 tests failed. FAILED: org.apache.solr.cloud.autoscaling.sim.TestSimPolicyCloud.testCreateCollectionAddReplica Error Message: Timeout waiting for collection to become active Live Nodes: [127.0.0.1:10001_solr, 127.0.0.1:10004_solr, 127.0.0.1:1_solr, 127.0.0.1:10002_solr, 127.0.0.1:10003_solr] Last available state: DocCollection(testCreateCollectionAddReplica//clusterstate.json/11)={ "replicationFactor":"1", "pullReplicas":"0", "router":{"name":"compositeId"}, "maxShardsPerNode":"1", "autoAddReplicas":"false", "nrtReplicas":"1", "tlogReplicas":"0", "autoCreated":"true", "policy":"c1", "shards":{"shard1":{ "replicas":{"core_node1":{ "core":"testCreateCollectionAddReplica_shard1_replica_n1", "SEARCHER.searcher.maxDoc":0, "SEARCHER.searcher.deletedDocs":0, "INDEX.sizeInBytes":10240, "node_name":"127.0.0.1:10002_solr", "state":"active", "type":"NRT", "INDEX.sizeInGB":9.5367431640625E-6, "SEARCHER.searcher.numDocs":0}}, "range":"8000-7fff", "state":"active"}}} Stack Trace: java.lang.AssertionError: Timeout waiting for collection to become active Live Nodes: [127.0.0.1:10001_solr, 127.0.0.1:10004_solr, 127.0.0.1:1_solr, 127.0.0.1:10002_solr, 127.0.0.1:10003_solr] Last available state: DocCollection(testCreateCollectionAddReplica//clusterstate.json/11)={ "replicationFactor":"1", "pullReplicas":"0", "router":{"name":"compositeId"}, "maxShardsPerNode":"1", "autoAddReplicas":"false", "nrtReplicas":"1", "tlogReplicas":"0", "autoCreated":"true", "policy":"c1", "shards":{"shard1":{ "replicas":{"core_node1":{ "core":"testCreateCollectionAddReplica_shard1_replica_n1", "SEARCHER.searcher.maxDoc":0, "SEARCHER.searcher.deletedDocs":0, "INDEX.sizeInBytes":10240, "node_name":"127.0.0.1:10002_solr", "state":"active", "type":"NRT", "INDEX.sizeInGB":9.5367431640625E-6, "SEARCHER.searcher.numDocs":0}}, "range":"8000-7fff", "state":"active"}}} at 
__randomizedtesting.SeedInfo.seed([98A20E2F09D55CC6:18826B011896B460]:0) at org.apache.solr.cloud.CloudTestUtils.waitForState(CloudTestUtils.java:70) at org.apache.solr.cloud.autoscaling.sim.TestSimPolicyCloud.testCreateCollectionAddReplica(TestSimPolicyCloud.java:123) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1742) at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:935) at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:971) at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:985) at org.junit.rules.ExpectedException$ExpectedExceptionStatement.evaluate(ExpectedException.java:110) at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57) at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45) at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48) at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64) at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368) at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817) at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468) at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:944) at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:830) at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:880) at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:891) at
[jira] [Commented] (SOLR-12801) Fix the tests, remove BadApples and AwaitsFix annotations, improve env for test development.
[ https://issues.apache.org/jira/browse/SOLR-12801?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16703607#comment-16703607 ] ASF subversion and git services commented on SOLR-12801: Commit 75b183196798232aa6f2dcb117f309119053 in lucene-solr's branch refs/heads/master from markrmiller [ https://git-wip-us.apache.org/repos/asf?p=lucene-solr.git;h=75b1831 ] SOLR-12801: Make massive improvements to the tests. SOLR-12804: Remove static modifier from Overseer queue access. SOLR-12896: Introduce more checks for shutdown and closed to improve clean close and shutdown. (Partial) SOLR-12897: Introduce AlreadyClosedException to clean up silly close / shutdown logging. (Partial) SOLR-12898: Replace cluster state polling with ZkStateReader#waitFor. (Partial) SOLR-12923: The new AutoScaling tests are way too flaky and need special attention. (Partial) SOLR-12932: ant test (without badapples=false) should pass easily for developers. (Partial) SOLR-12933: Fix SolrCloud distributed commit. > Fix the tests, remove BadApples and AwaitsFix annotations, improve env for > test development. > > > Key: SOLR-12801 > URL: https://issues.apache.org/jira/browse/SOLR-12801 > Project: Solr > Issue Type: Task > Security Level: Public(Default Security Level. Issues are Public) >Reporter: Mark Miller >Assignee: Mark Miller >Priority: Critical > Time Spent: 1h > Remaining Estimate: 0h > > A single issue to counteract the single issue adding tons of annotations, the > continued addition of new flaky tests, and the continued addition of > flakiness to existing tests. > Lots more to come.
[jira] [Commented] (SOLR-12898) Replace cluster state polling with ZkStateReader#waitFor.
[ https://issues.apache.org/jira/browse/SOLR-12898?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16703611#comment-16703611 ] ASF subversion and git services commented on SOLR-12898: Commit 75b183196798232aa6f2dcb117f309119053 in lucene-solr's branch refs/heads/master from markrmiller [ https://git-wip-us.apache.org/repos/asf?p=lucene-solr.git;h=75b1831 ] SOLR-12898: Replace cluster state polling with ZkStateReader#waitFor. (Partial) > Replace cluster state polling with ZkStateReader#waitFor. > - > > Key: SOLR-12898 > URL: https://issues.apache.org/jira/browse/SOLR-12898 > Project: Solr > Issue Type: Sub-task > Security Level: Public(Default Security Level. Issues are Public) >Reporter: Mark Miller >Assignee: Mark Miller >Priority: Major > > I've done a lot of this on the starburst branch and then again did some in a > related test issue. This issue is to finish comprehensively and pull > everything from starburst around this in.
[jira] [Commented] (SOLR-12933) Fix SolrCloud distributed commit.
[ https://issues.apache.org/jira/browse/SOLR-12933?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16703614#comment-16703614 ] ASF subversion and git services commented on SOLR-12933: Commit 75b183196798232aa6f2dcb117f309119053 in lucene-solr's branch refs/heads/master from markrmiller [ https://git-wip-us.apache.org/repos/asf?p=lucene-solr.git;h=75b1831 ] SOLR-12933: Fix SolrCloud distributed commit. > Fix SolrCloud distributed commit. > - > > Key: SOLR-12933 > URL: https://issues.apache.org/jira/browse/SOLR-12933 > Project: Solr > Issue Type: Bug > Security Level: Public(Default Security Level. Issues are Public) >Reporter: Mark Miller >Assignee: Mark Miller >Priority: Major > > While working on the starburst branch, I found that our commit stuff is still > wacky - it's just hard to tell depending on binary vs xml and what you are > doing. > We should pull in my fix for this.
[jira] [Commented] (SOLR-12804) Remove static modifier from Overseer queue access.
[ https://issues.apache.org/jira/browse/SOLR-12804?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16703608#comment-16703608 ] ASF subversion and git services commented on SOLR-12804: Commit 75b183196798232aa6f2dcb117f309119053 in lucene-solr's branch refs/heads/master from markrmiller [ https://git-wip-us.apache.org/repos/asf?p=lucene-solr.git;h=75b1831 ] SOLR-12804: Remove static modifier from Overseer queue access. > Remove static modifier from Overseer queue access. > -- > > Key: SOLR-12804 > URL: https://issues.apache.org/jira/browse/SOLR-12804 > Project: Solr > Issue Type: Sub-task > Security Level: Public(Default Security Level. Issues are Public) >Reporter: Mark Miller >Assignee: Mark Miller >Priority: Major > > This is kind of a lazy anti-pattern in general, but also does not go well > with mocking.
[jira] [Commented] (SOLR-12896) Introduce more checks for shutdown and closed to improve clean close and shutdown.
[ https://issues.apache.org/jira/browse/SOLR-12896?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16703609#comment-16703609 ] ASF subversion and git services commented on SOLR-12896: Commit 75b183196798232aa6f2dcb117f309119053 in lucene-solr's branch refs/heads/master from markrmiller [ https://git-wip-us.apache.org/repos/asf?p=lucene-solr.git;h=75b1831 ] SOLR-12896: Introduce more checks for shutdown and closed to improve clean close and shutdown. (Partial) > Introduce more checks for shutdown and closed to improve clean close and > shutdown. > -- > > Key: SOLR-12896 > URL: https://issues.apache.org/jira/browse/SOLR-12896 > Project: Solr > Issue Type: Sub-task > Security Level: Public(Default Security Level. Issues are Public) >Reporter: Mark Miller >Assignee: Mark Miller >Priority: Major >
[jira] [Commented] (SOLR-12897) Introduce AlreadyClosedException to clean up silly close / shutdown logging.
[ https://issues.apache.org/jira/browse/SOLR-12897?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16703610#comment-16703610 ] ASF subversion and git services commented on SOLR-12897: Commit 75b183196798232aa6f2dcb117f309119053 in lucene-solr's branch refs/heads/master from markrmiller [ https://git-wip-us.apache.org/repos/asf?p=lucene-solr.git;h=75b1831 ] SOLR-12897: Introduce AlreadyClosedException to clean up silly close / shutdown logging. (Partial) > Introduce AlreadyClosedException to clean up silly close / shutdown logging. > > > Key: SOLR-12897 > URL: https://issues.apache.org/jira/browse/SOLR-12897 > Project: Solr > Issue Type: Sub-task > Security Level: Public(Default Security Level. Issues are Public) >Reporter: Mark Miller >Assignee: Mark Miller >Priority: Major >
[jira] [Commented] (SOLR-12932) ant test (without badapples=false) should pass easily for developers.
[ https://issues.apache.org/jira/browse/SOLR-12932?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16703613#comment-16703613 ] ASF subversion and git services commented on SOLR-12932: Commit 75b183196798232aa6f2dcb117f309119053 in lucene-solr's branch refs/heads/master from markrmiller [ https://git-wip-us.apache.org/repos/asf?p=lucene-solr.git;h=75b1831 ] SOLR-12932: ant test (without badapples=false) should pass easily for developers. (Partial) > ant test (without badapples=false) should pass easily for developers. > - > > Key: SOLR-12932 > URL: https://issues.apache.org/jira/browse/SOLR-12932 > Project: Solr > Issue Type: Sub-task > Security Level: Public(Default Security Level. Issues are Public) > Components: Tests >Reporter: Mark Miller >Assignee: Mark Miller >Priority: Major > > If we fix the tests we will end up here anyway, but we can shortcut this. > Once I get my first patch in, anyone who mentions a test that fails locally > for them at any time (not Jenkins), I will fix it.
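The badapples toggle referenced in SOLR-12932 is an ant system property. A quick sketch of the two invocations being compared, assuming a lucene-solr checkout with the ant build (the commands are assembled here rather than run, since they need that checkout):

```shell
# Default run: includes tests annotated @BadApple (known-flaky).
WITH_BADAPPLES="ant test"

# Escape hatch: skip @BadApple tests. SOLR-12932 aims to make the
# plain "ant test" run pass reliably so this flag is unnecessary.
WITHOUT_BADAPPLES="ant test -Dtests.badapples=false"

printf '%s\n' "$WITH_BADAPPLES" "$WITHOUT_BADAPPLES"
```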
[jira] [Commented] (SOLR-12923) The new AutoScaling tests are way too flaky and need special attention.
[ https://issues.apache.org/jira/browse/SOLR-12923?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16703612#comment-16703612 ] ASF subversion and git services commented on SOLR-12923: Commit 75b183196798232aa6f2dcb117f309119053 in lucene-solr's branch refs/heads/master from markrmiller [ https://git-wip-us.apache.org/repos/asf?p=lucene-solr.git;h=75b1831 ] SOLR-12923: The new AutoScaling tests are way too flaky and need special attention. (Partial) > The new AutoScaling tests are way too flaky and need special attention. > -- > > Key: SOLR-12923 > URL: https://issues.apache.org/jira/browse/SOLR-12923 > Project: Solr > Issue Type: Sub-task > Security Level: Public(Default Security Level. Issues are Public) > Components: Tests >Reporter: Mark Miller >Priority: Major > > I've already done some work here (not posted yet). We need to address this; > these tests are too new to fail so often and so easily. > I want to add beasting to precommit (LUCENE-8545) to help prevent tests that > fail so easily from being committed.
[JENKINS] Lucene-Solr-Tests-master - Build # 2965 - Still Unstable
Build: https://builds.apache.org/job/Lucene-Solr-Tests-master/2965/ 2 tests failed. FAILED: org.apache.solr.cloud.AliasIntegrationTest.test Error Message: Collection not found: testalias2 Stack Trace: org.apache.solr.common.SolrException: Collection not found: testalias2 at org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:851) at org.apache.solr.client.solrj.impl.CloudSolrClient.request(CloudSolrClient.java:817) at org.apache.solr.client.solrj.SolrRequest.process(SolrRequest.java:194) at org.apache.solr.client.solrj.SolrClient.query(SolrClient.java:983) at org.apache.solr.cloud.AliasIntegrationTest.searchSeveralWays(AliasIntegrationTest.java:579) at org.apache.solr.cloud.AliasIntegrationTest.searchSeveralWays(AliasIntegrationTest.java:573) at org.apache.solr.cloud.AliasIntegrationTest.test(AliasIntegrationTest.java:498) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1742) at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:935) at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:971) at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:985) at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57) at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45) at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48) at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64) at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368) at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817) at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468) at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:944) at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:830) at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:880) at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:891) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53) at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47) at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64) at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368) at
[JENKINS] Lucene-Solr-Tests-7.6 - Build # 22 - Unstable
Build: https://builds.apache.org/job/Lucene-Solr-Tests-7.6/22/ 1 tests failed. FAILED: org.apache.solr.client.solrj.impl.CloudSolrClientTest.testRouting Error Message: Error from server at https://127.0.0.1:33559/solr/collection1_shard2_replica_n3: Expected mime type application/octet-stream but got text/html.Error 404 Can not find: /solr/collection1_shard2_replica_n3/update HTTP ERROR 404 Problem accessing /solr/collection1_shard2_replica_n3/update. Reason: Can not find: /solr/collection1_shard2_replica_n3/updatehttp://eclipse.org/jetty;>Powered by Jetty:// 9.4.11.v20180605 Stack Trace: org.apache.solr.client.solrj.impl.CloudSolrClient$RouteException: Error from server at https://127.0.0.1:33559/solr/collection1_shard2_replica_n3: Expected mime type application/octet-stream but got text/html. Error 404 Can not find: /solr/collection1_shard2_replica_n3/update HTTP ERROR 404 Problem accessing /solr/collection1_shard2_replica_n3/update. Reason: Can not find: /solr/collection1_shard2_replica_n3/updatehttp://eclipse.org/jetty;>Powered by Jetty:// 9.4.11.v20180605 at __randomizedtesting.SeedInfo.seed([8A2EE54B0993FF0B:4899D9230AD30F73]:0) at org.apache.solr.client.solrj.impl.CloudSolrClient.directUpdate(CloudSolrClient.java:551) at org.apache.solr.client.solrj.impl.CloudSolrClient.sendRequest(CloudSolrClient.java:1016) at org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:884) at org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:949) at org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:949) at org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:949) at org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:949) at org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:949) at 
org.apache.solr.client.solrj.impl.CloudSolrClient.request(CloudSolrClient.java:817) at org.apache.solr.client.solrj.SolrRequest.process(SolrRequest.java:194) at org.apache.solr.client.solrj.request.UpdateRequest.commit(UpdateRequest.java:237) at org.apache.solr.client.solrj.impl.CloudSolrClientTest.testRouting(CloudSolrClientTest.java:269) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1742) at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:935) at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:971) at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:985) at org.junit.rules.ExpectedException$ExpectedExceptionStatement.evaluate(ExpectedException.java:110) at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57) at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45) at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48) at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64) at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368) at 
com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817) at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468) at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:944) at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:830) at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:880) at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:891) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at
[jira] [Updated] (SOLR-13023) No clear guidance on how to delete jar from .system collection
[ https://issues.apache.org/jira/browse/SOLR-13023?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] William updated SOLR-13023: --- Description: There is no clear guidance on how to actually delete a JAR that was placed in the .system collection. I have literally tried every command under the sun even the ones provided in support. Can someone please tell me how to actually do a proper delete. All I can seem to do is just add new ones. {code:java} curl -X POST -H 'Content-Type: application/json' 'https://solr.cloud.statcan.ca/solr/.system/' --data-binary '{"delete": {"id":"jtds131"}}' curl -X POST -H 'Content-Type: application/json' 'https://solr.cloud.statcan.ca/solr/ndm_test/update' --data-binary '{ "delete": { "id":"jtds131/1" } }' curl -X DELETE -H 'Content-Type: application/json' 'https://solr.cloud.statcan.ca/solr/.system/blob/jtds131' {code} I just want to delete any of the following: {code:java} curl https://solr.cloud.statcan.ca/solr/.system/blob\?omitHeader\=true (docker-for-desktop/default) { "response":{"numFound":20,"start":0,"docs":[ { "id":"jtds131/7", "md5":"dd0041fbc51b53613db675520bb1521e", "blobName":"jtds131", "version":7, "timestamp":"2018-11-29T15:16:57.079Z", "size":35}, { "id":"jtds131/6", "md5":"e45bd303ff39f7daa7a57941b1ff0b85", "blobName":"jtds131", "version":6, "timestamp":"2018-11-29T15:13:20.375Z", "size":34}, { "id":"jtds131/5", "md5":"4f2e52b6058950bf92a61ff64f747091", "blobName":"jtds131", "version":5, "timestamp":"2018-11-29T15:12:17.120Z", "size":36}, { "id":"jtds131/4", "md5":"36cf4127af583553605f69c728bff016", "blobName":"jtds131", "version":4, "timestamp":"2018-11-29T15:11:22.136Z", "size":30}, { "id":"jtds131/3", "md5":"3db86deb7b7f6b1fa1b9fa81b690e68c", "blobName":"jtds131", "version":3, "timestamp":"2018-11-29T15:09:38.075Z", "size":28}, { "id":"ojdbc/3", "md5":"1ce62f6f367c356671791e4b29506196", "blobName":"ojdbc", "version":3, "timestamp":"2018-11-29T14:45:47.213Z", "size":24}, { "id":"jtds131/2", 
"md5":"cac877bfcbd4cc5d4adc04497df81ab", "blobName":"jtds131", "version":2, "timestamp":"2018-11-29T15:08:01.273Z", "size":25}, { "id":"ojdbc/2", "md5":"a65fc757841c108339744156717b403f", "blobName":"ojdbc", "version":2, "timestamp":"2018-11-29T14:33:22.048Z", "size":24}, { "id":"ndm_test/2", "md5":"83c579317606eca7d06418cf83a2c0d0", "blobName":"ndm_test", "version":2, "timestamp":"2018-11-29T15:45:24.267Z", "size":26}, { "id":"jtds131/1", "md5":"499eb508a21e312d3bb9aec337eb38b7", "blobName":"jtds131", "version":1, "timestamp":"2018-11-28T20:11:44.533Z", "size":317816}] }}{code} was: There is no clear guidance on how to actually delete a JAR that was placed in the .system collection. I have literally tried every command under the sun even the ones provided in support. Can someone please tell me how to actually do a proper delete. All I can seem to do is just add new ones. {code:java} curl -X POST -H 'Content-Type: application/json' 'https://solr.cloud.statcan.ca/solr/.system/' --data-binary '{"delete": {"id":"jtds131"}}' curl -X POST -H 'Content-Type: application/json' 'https://solr.cloud.statcan.ca/solr/ndm_test/update' --data-binary '{ "delete": { "id":"jtds131/1" } }' curl -X DELETE -H 'Content-Type: application/json' 'https://solr.cloud.statcan.ca/solr/.system/blob/jtds131' {code} > No clear guidance on how to delete jar from .system collection > -- > > Key: SOLR-13023 > URL: https://issues.apache.org/jira/browse/SOLR-13023 > Project: Solr > Issue Type: Bug > Security Level: Public(Default Security Level. Issues are Public) > Components: SolrCloud >Affects Versions: 7.5 >Reporter: William >Priority: Major > > There is no clear guidance on how to actually delete a JAR that was placed in > the .system collection. I have literally tried every command under the sun > even the ones provided in support. Can someone please tell me how to actually > do a proper delete. All I can seem to do is just add new ones. 
> > {code:java} > curl -X POST -H 'Content-Type: application/json' > 'https://solr.cloud.statcan.ca/solr/.system/' --data-binary '{"delete": > {"id":"jtds131"}}' > curl -X POST -H 'Content-Type: application/json' > 'https://solr.cloud.statcan.ca/solr/ndm_test/update' --data-binary '{ > "delete": { "id":"jtds131/1" } > }' > curl -X DELETE -H 'Content-Type: application/json' > 'https://solr.cloud.statcan.ca/solr/.system/blob/jtds131' {code} > > I just want to delete any of the following: > > {code:java} > curl https://solr.cloud.statcan.ca/solr/.system/blob\?omitHeader\=true > (docker-for-desktop/default) > { > "response":{"numFound":20,"start":0,"docs":[ > { > "id":"jtds131/7", > "md5":"dd0041fbc51b53613db675520bb1521e", > "blobName":"jtds131", > "version":7, > "timestamp":"2018-11-29T15:16:57.079Z", >
[JENKINS-EA] Lucene-Solr-7.6-Linux (64bit/jdk-12-ea+12) - Build # 41 - Still Unstable!
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-7.6-Linux/41/ Java: 64bit/jdk-12-ea+12 -XX:-UseCompressedOops -XX:+UseG1GC 24 tests failed. FAILED: org.apache.solr.client.solrj.impl.CloudSolrClientTest.preferLocalShardsTest Error Message: Response was not received from shards on a single node Stack Trace: java.lang.AssertionError: Response was not received from shards on a single node at __randomizedtesting.SeedInfo.seed([26BE1DB4198E0431:DA73850DE75993CB]:0) at org.junit.Assert.fail(Assert.java:93) at org.junit.Assert.assertTrue(Assert.java:43) at org.apache.solr.client.solrj.impl.CloudSolrClientTest.queryWithShardsPreferenceRules(CloudSolrClientTest.java:475) at org.apache.solr.client.solrj.impl.CloudSolrClientTest.preferLocalShardsTest(CloudSolrClientTest.java:424) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1742) at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:935) at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:971) at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:985) at org.junit.rules.ExpectedException$ExpectedExceptionStatement.evaluate(ExpectedException.java:110) at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57) at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45) at 
org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48) at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64) at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368) at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817) at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468) at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:944) at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:830) at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:880) at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:891) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53) at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47) at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64) at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368) at java.base/java.lang.Thread.run(Thread.java:835) FAILED:
[jira] [Updated] (SOLR-13023) No clear guidance on how to delete jar from .system collection
[ https://issues.apache.org/jira/browse/SOLR-13023?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] William updated SOLR-13023: --- Description: There is no clear guidance on how to actually delete a JAR that was placed in the .system collection. I have literally tried every command under the sun even the ones provided in support. Can someone please tell me how to actually do a proper delete. All I can seem to do is just add new ones. {code:java} curl -X POST -H 'Content-Type: application/json' 'https://solr.cloud.statcan.ca/solr/.system/' --data-binary '{"delete": {"id":"jtds131"}}' curl -X POST -H 'Content-Type: application/json' 'https://solr.cloud.statcan.ca/solr/ndm_test/update' --data-binary '{ "delete": { "id":"jtds131/1" } }' curl -X DELETE -H 'Content-Type: application/json' 'https://solr.cloud.statcan.ca/solr/.system/blob/jtds131' {code} was: There is no clear guidance on how to actually delete a JAR that was placed in the .system collection. I have literally tried every command under the sun even the ones provided in support. Can someone please tell me how to actually do a proper delete. All I can seem to do is just add new ones. {code:java} curl -X POST -H 'Content-Type: application/json' 'https://solr.cloud.statcan.ca/solr/.system/' --data-binary '{"delete": {"id":"jtds131"}}' curl -X POST -H 'Content-Type: application/json' 'https://solr.cloud.statcan.ca/solr/ndm_test/update' --data-binary '{ "delete": { "id":"jtds131/1" } }'{code} > No clear guidance on how to delete jar from .system collection > -- > > Key: SOLR-13023 > URL: https://issues.apache.org/jira/browse/SOLR-13023 > Project: Solr > Issue Type: Bug > Security Level: Public(Default Security Level. Issues are Public) > Components: SolrCloud >Affects Versions: 7.5 >Reporter: William >Priority: Major > > There is no clear guidance on how to actually delete a JAR that was placed in > the .system collection. 
I have literally tried every command under the sun > even the ones provided in support. Can someone please tell me how to actually > do a proper delete. All I can seem to do is just add new ones. > > {code:java} > curl -X POST -H 'Content-Type: application/json' > 'https://solr.cloud.statcan.ca/solr/.system/' --data-binary '{"delete": > {"id":"jtds131"}}' > curl -X POST -H 'Content-Type: application/json' > 'https://solr.cloud.statcan.ca/solr/ndm_test/update' --data-binary '{ > "delete": { "id":"jtds131/1" } > }' > curl -X DELETE -H 'Content-Type: application/json' > 'https://solr.cloud.statcan.ca/solr/.system/blob/jtds131' {code} > -- This message was sent by Atlassian JIRA (v7.6.3#76005) - To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org For additional commands, e-mail: dev-h...@lucene.apache.org
[jira] [Commented] (SOLR-13023) No clear guidance on how to delete jar from .system collection
[ https://issues.apache.org/jira/browse/SOLR-13023?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16703384#comment-16703384 ] William commented on SOLR-13023: Also: "When using the blob store, note that the API does not delete or overwrite a previous object if a new one is uploaded with the same name. It always adds a new version of the blob to the index. Deletes can be performed with standard REST delete commands." This is completely false; curl -X DELETE does not work. > No clear guidance on how to delete jar from .system collection > -- > > Key: SOLR-13023 > URL: https://issues.apache.org/jira/browse/SOLR-13023 > Project: Solr > Issue Type: Bug > Security Level: Public (Default Security Level. Issues are Public) > Components: SolrCloud >Affects Versions: 7.5 >Reporter: William >Priority: Major > > There is no clear guidance on how to actually delete a JAR that was placed in > the .system collection. I have literally tried every command under the sun > even the ones provided in support. Can someone please tell me how to actually > do a proper delete. All I can seem to do is just add new ones. > > {code:java} > curl -X POST -H 'Content-Type: application/json' > 'https://solr.cloud.statcan.ca/solr/.system/' --data-binary '{"delete": > {"id":"jtds131"}}' > curl -X POST -H 'Content-Type: application/json' > 'https://solr.cloud.statcan.ca/solr/ndm_test/update' --data-binary '{ > "delete": { "id":"jtds131/1" } > }'{code}
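For what it's worth, none of the commands quoted in the issue POST the delete to the .system collection's own /update handler with a commit, which is how deletes work against any Solr collection. A minimal Python sketch of that request follows; the host, port, and blob id are invented for illustration, and the actual urlopen call is left commented out since it needs a live server:

```python
import json
from urllib import request

def build_blob_delete(base_url, blob_id, commit=True):
    """Build a POST that deletes one .system document (blob version) by id."""
    url = f"{base_url}/solr/.system/update" + ("?commit=true" if commit else "")
    body = json.dumps({"delete": {"id": blob_id}}).encode("utf-8")
    return request.Request(url, data=body,
                           headers={"Content-Type": "application/json"},
                           method="POST")

# Delete version 1 of the "jtds131" blob (id format is blobName/version):
req = build_blob_delete("http://localhost:8983", "jtds131/1")
# request.urlopen(req)  # would execute the delete against a live server
```

Whether this resolves the reporter's case is unverified here; the sketch only shows the request shape the update handler expects.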
[jira] [Commented] (LUCENE-8563) Remove k1+1 from the numerator of BM25Similarity
[ https://issues.apache.org/jira/browse/LUCENE-8563?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16703382#comment-16703382 ] Luca Cavanna commented on LUCENE-8563: -- I updated the PR according to the latest comments, and deprecated the newly introduced similarity as Robert suggested. > Remove k1+1 from the numerator of BM25Similarity > - > > Key: LUCENE-8563 > URL: https://issues.apache.org/jira/browse/LUCENE-8563 > Project: Lucene - Core > Issue Type: Improvement >Reporter: Adrien Grand >Priority: Minor > Time Spent: 40m > Remaining Estimate: 0h > > Our current implementation of BM25 does > {code:java} > boost * IDF * (k1+1) * tf / (tf + norm) > {code} > As (k1+1) is a constant, it is the same for every term and doesn't modify > ordering. It is often omitted, and I found out that the "The Probabilistic > Relevance Framework: BM25 and Beyond" paper by Robertson (BM25's author) and > Zaragoza even describes adding (k1+1) to the numerator as a variant whose > benefit is to be more comparable with Robertson/Sparck-Jones weighting, which > we don't care about. > {quote}A common variant is to add a (k1 + 1) component to the > numerator of the saturation function. This is the same for all > terms, and therefore does not affect the ranking produced. > The reason for including it was to make the final formula > more compatible with the RSJ weight used on its own > {quote} > Should we remove it from BM25Similarity as well? > A side-effect that I'm interested in is that integrating other score > contributions (e.g. via oal.document.FeatureField) would be a bit easier to > reason about. For instance a weight of 3 in FeatureField#newSaturationQuery > would have a similar impact as a term whose IDF is 3 (and thus docFreq ~= 5%) > rather than a term whose IDF is 3/(k1 + 1).
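The quoted point, that (k1 + 1) scales every score by the same constant factor and therefore cannot change ranking, is easy to sanity-check numerically. A tiny sketch with invented tf/idf/norm values (not Lucene's code, just the formula from the issue):

```python
def bm25(tf, idf, norm, k1=1.2, boost=1.0, with_k1_plus_1=True):
    """BM25 per the formula in the issue: boost * IDF * (k1+1) * tf / (tf + norm)."""
    num = (k1 + 1) * tf if with_k1_plus_1 else tf
    return boost * idf * num / (tf + norm)

# Three hypothetical matching docs, as (tf, idf, norm) triples:
docs = [(3.0, 2.1, 1.4), (1.0, 2.1, 0.9), (7.0, 2.1, 2.5)]

rank_with = sorted(range(len(docs)), key=lambda i: -bm25(*docs[i]))
rank_without = sorted(range(len(docs)),
                      key=lambda i: -bm25(*docs[i], with_k1_plus_1=False))
assert rank_with == rank_without  # dropping (k1+1) leaves the ordering intact
```

Every score without the factor is exactly the with-factor score divided by (k1 + 1) = 2.2, a positive constant, so any sort by score is unchanged.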
[jira] [Created] (SOLR-13023) No clear guidance on how to delete jar from .system collection
William created SOLR-13023: -- Summary: No clear guidance on how to delete jar from .system collection Key: SOLR-13023 URL: https://issues.apache.org/jira/browse/SOLR-13023 Project: Solr Issue Type: Bug Security Level: Public (Default Security Level. Issues are Public) Components: SolrCloud Affects Versions: 7.5 Reporter: William There is no clear guidance on how to actually delete a JAR that was placed in the .system collection. I have literally tried every command under the sun even the ones provided in support. Can someone please tell me how to actually do a proper delete. All I can seem to do is just add new ones. {code:java} curl -X POST -H 'Content-Type: application/json' 'https://solr.cloud.statcan.ca/solr/.system/' --data-binary '{"delete": {"id":"jtds131"}}' curl -X POST -H 'Content-Type: application/json' 'https://solr.cloud.statcan.ca/solr/ndm_test/update' --data-binary '{ "delete": { "id":"jtds131/1" } }'{code}
[jira] [Created] (SOLR-13022) JSON Faceting NPE & 500 Error when attempting to sort on non-existent agg (ie: typo)
Hoss Man created SOLR-13022: --- Summary: JSON Faceting NPE & 500 Error when attempting to sort on non-existent agg (ie: typo) Key: SOLR-13022 URL: https://issues.apache.org/jira/browse/SOLR-13022 Project: Solr Issue Type: Bug Security Level: Public (Default Security Level. Issues are Public) Reporter: Hoss Man Assignee: Hoss Man JSON Faceting does not provide good error handling in the event of a typo / non-existent agg stat name when sorting: the parsing logic happily accepts the bogus sortVariable w/o validation, and the subsequent execution logic throws an NPE when it can't be found. Request... {noformat} curl http://localhost:8983/solr/techproducts/query -d 'rows=0&q=*:*& json.facet={ categories:{ type : terms, field : cat, sort : "x desc", } }' {noformat} Response... {noformat} { "responseHeader":{ "status":500, "QTime":1, "params":{ "q":"*:*", "json.facet":"{\n categories:{\ntype : terms,\nfield : cat,\n sort : \"x desc\",\n }\n}", "rows":"0"}}, "response":{"numFound":32,"start":0,"docs":[] }, "error":{ "trace":"java.lang.NullPointerException\n\tat org.apache.solr.search.facet.FacetFieldProcessor.lambda$findTopSlots$1(FacetFieldProcessor.java:287) ... {noformat}
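The fix the report implies is to validate the sort variable at parse time, against the built-in sorts plus any nested agg names actually defined on the facet, and fail with a clear message instead of an NPE at execution time. A sketch of that check (in Python purely for illustration; Solr's real parser is Java, and the facet dicts below are hypothetical):

```python
BUILT_IN_SORTS = {"count", "index"}  # sorts every terms facet supports

def validate_facet_sort(facet):
    """Return the sort variable, or raise a clear error if it names no agg."""
    sort_key = facet.get("sort", "count desc").split()[0]
    defined = BUILT_IN_SORTS | set(facet.get("facet", {}).keys())
    if sort_key not in defined:
        raise ValueError(f"Unknown sort variable '{sort_key}'; "
                         f"expected one of {sorted(defined)}")
    return sort_key

# The request from the report: sorts on "x", but no agg named "x" is defined.
bad = {"type": "terms", "field": "cat", "sort": "x desc"}
# The same facet with a (made-up) nested agg "x" defined, which should pass.
good = {"type": "terms", "field": "cat", "sort": "x desc",
        "facet": {"x": "sum(price)"}}
```

With this shape of check, the bad request would come back as a 400 with the unknown name spelled out, rather than a 500 wrapping a NullPointerException.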
[JENKINS] Lucene-Solr-master-Solaris (64bit/jdk1.8.0) - Build # 2185 - Unstable!
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-master-Solaris/2185/ Java: 64bit/jdk1.8.0 -XX:+UseCompressedOops -XX:+UseConcMarkSweepGC 2 tests failed. FAILED: org.apache.solr.client.solrj.impl.CloudSolrClientTest.testVersionsAreReturned Error Message: Error from server at http://127.0.0.1:44676/solr/collection1_shard2_replica_n2: Expected mime type application/octet-stream but got text/html. Error 404 Can not find: /solr/collection1_shard2_replica_n2/update HTTP ERROR 404 Problem accessing /solr/collection1_shard2_replica_n2/update. Reason: Can not find: /solr/collection1_shard2_replica_n2/update. Powered by Jetty:// 9.4.11.v20180605 Stack Trace: org.apache.solr.client.solrj.impl.CloudSolrClient$RouteException: Error from server at http://127.0.0.1:44676/solr/collection1_shard2_replica_n2: Expected mime type application/octet-stream but got text/html. Error 404 Can not find: /solr/collection1_shard2_replica_n2/update HTTP ERROR 404 Problem accessing /solr/collection1_shard2_replica_n2/update.
Reason: Can not find: /solr/collection1_shard2_replica_n2/update. Powered by Jetty:// 9.4.11.v20180605 at __randomizedtesting.SeedInfo.seed([39B340CC051FDD05:C175B9F9960645CD]:0) at org.apache.solr.client.solrj.impl.CloudSolrClient.directUpdate(CloudSolrClient.java:551) at org.apache.solr.client.solrj.impl.CloudSolrClient.sendRequest(CloudSolrClient.java:1016) at org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:884) at org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:949) at org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:949) at org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:949) at org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:949) at org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:949) at org.apache.solr.client.solrj.impl.CloudSolrClient.request(CloudSolrClient.java:817) at org.apache.solr.client.solrj.SolrRequest.process(SolrRequest.java:194) at org.apache.solr.client.solrj.request.UpdateRequest.commit(UpdateRequest.java:237) at org.apache.solr.client.solrj.impl.CloudSolrClientTest.testVersionsAreReturned(CloudSolrClientTest.java:725) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1742) at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:935) at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:971) at
com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:985) at org.junit.rules.ExpectedException$ExpectedExceptionStatement.evaluate(ExpectedException.java:110) at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57) at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45) at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48) at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64) at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368) at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817) at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468) at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:944) at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:830) at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:880) at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:891) at
[jira] [Commented] (LUCENE-8542) Provide the LeafSlice to CollectorManager.newCollector to save memory on small index slices
[ https://issues.apache.org/jira/browse/LUCENE-8542?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16703283#comment-16703283 ] Christoph Kaser commented on LUCENE-8542: - I attached a patch - hopefully this shows more clearly what I meant. Since CollectorManager is marked as experimental, I think it might be possible to port this patch against Lucene 7 as well without providing a default implementation of the new method and keeping the old method. > Provide the LeafSlice to CollectorManager.newCollector to save memory on > small index slices > --- > > Key: LUCENE-8542 > URL: https://issues.apache.org/jira/browse/LUCENE-8542 > Project: Lucene - Core > Issue Type: Improvement > Components: core/search >Reporter: Christoph Kaser >Priority: Minor > Attachments: LUCENE-8542.patch > > > I have an index consisting of 44 million documents spread across 60 segments. > When I run a query against this index with a huge number of results requested > (e.g. 5 million), this query uses more than 5 GB of heap if the IndexSearch > was configured to use an ExecutorService. > (I know this kind of query is fairly unusual and it would be better to use > paging and searchAfter, but our architecture does not allow this at the > moment.) > The reason for the huge memory requirement is that the search [will create a > TopScoreDocCollector for each > segment|https://github.com/apache/lucene-solr/blob/master/lucene/core/src/java/org/apache/lucene/search/IndexSearcher.java#L404], > each one with numHits = 5 million. This is fine for the large segments, but > many of those segments are fairly small and only contain several thousand > documents. This wastes a huge amount of memory for queries with large values > of numHits on indices with many segments. 
> Therefore, I propose to change the CollectorManager - interface in the > following way: > * change the method newCollector to accept a parameter LeafSlice that can be > used to determine the total count of documents in the LeafSlice > * Maybe, in order to remain backwards compatible, it would be possible to > introduce this as a new method with a default implementation that calls the > old method - otherwise, it probably has to wait for Lucene 8? > * This can then be used to cap numHits for each TopScoreDocCollector to the > leafslice-size. > If this is something that would make sense for you, I can try to provide a > patch.
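The memory argument in the proposal reduces to a one-line cap: a slice's collector can never return more hits than the slice holds documents, so its priority queue only needs min(numHits, sliceDocCount) slots. A toy sketch of the saving (segment sizes are invented, loosely mirroring the 44M-doc, 60-segment index described above; Lucene's actual change would live in Java):

```python
def capped_num_hits(requested_hits, slice_doc_counts):
    """Per-slice priority-queue sizes without and with the proposed cap."""
    uncapped = [requested_hits] * len(slice_doc_counts)           # today's behavior
    capped = [min(requested_hits, n) for n in slice_doc_counts]   # proposed
    return uncapped, capped

# 5M hits requested over one large segment and 59 small ones (made-up sizes):
sizes = [40_000_000] + [5_000] * 59
uncapped, capped = capped_num_hits(5_000_000, sizes)
print(sum(uncapped), sum(capped))  # → 300000000 5295000
```

The small slices go from 5M queue slots each to 5,000 each, which is where the multi-gigabyte heap savings would come from.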
[jira] [Updated] (LUCENE-8542) Provide the LeafSlice to CollectorManager.newCollector to save memory on small index slices
[ https://issues.apache.org/jira/browse/LUCENE-8542?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Christoph Kaser updated LUCENE-8542: Attachment: LUCENE-8542.patch > Provide the LeafSlice to CollectorManager.newCollector to save memory on > small index slices > --- > > Key: LUCENE-8542 > URL: https://issues.apache.org/jira/browse/LUCENE-8542 > Project: Lucene - Core > Issue Type: Improvement > Components: core/search >Reporter: Christoph Kaser >Priority: Minor > Attachments: LUCENE-8542.patch > > > I have an index consisting of 44 million documents spread across 60 segments. > When I run a query against this index with a huge number of results requested > (e.g. 5 million), this query uses more than 5 GB of heap if the IndexSearch > was configured to use an ExecutorService. > (I know this kind of query is fairly unusual and it would be better to use > paging and searchAfter, but our architecture does not allow this at the > moment.) > The reason for the huge memory requirement is that the search [will create a > TopScoreDocCollector for each > segment|https://github.com/apache/lucene-solr/blob/master/lucene/core/src/java/org/apache/lucene/search/IndexSearcher.java#L404], > each one with numHits = 5 million. This is fine for the large segments, but > many of those segments are fairly small and only contain several thousand > documents. This wastes a huge amount of memory for queries with large values > of numHits on indices with many segments. > Therefore, I propose to change the CollectorManager - interface in the > following way: > * change the method newCollector to accept a parameter LeafSlice that can be > used to determine the total count of documents in the LeafSlice > * Maybe, in order to remain backwards compatible, it would be possible to > introduce this as a new method with a default implementation that calls the > old method - otherwise, it probably has to wait for Lucene 8? 
> * This can then be used to cap numHits for each TopScoreDocCollector to the > leafslice-size. > If this is something that would make sense for you, I can try to provide a > patch.
[JENKINS] Lucene-Solr-http2-Linux (64bit/jdk-11) - Build # 11 - Still Failing!
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-http2-Linux/11/ Java: 64bit/jdk-11 -XX:-UseCompressedOops -XX:+UseConcMarkSweepGC All tests passed Build Log: [...truncated 2023 lines...] [junit4] JVM J0: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-http2-Linux/lucene/build/core/test/temp/junit4-J0-20181129_130431_91315831343039996275408.syserr [junit4] >>> JVM J0 emitted unexpected output (verbatim) [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release. [junit4] <<< JVM J0: EOF [...truncated 5 lines...] [junit4] JVM J2: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-http2-Linux/lucene/build/core/test/temp/junit4-J2-20181129_130431_8952987921844327668586.syserr [junit4] >>> JVM J2 emitted unexpected output (verbatim) [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release. [junit4] <<< JVM J2: EOF [...truncated 3 lines...] [junit4] JVM J1: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-http2-Linux/lucene/build/core/test/temp/junit4-J1-20181129_130431_8958895093225765628333.syserr [junit4] >>> JVM J1 emitted unexpected output (verbatim) [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release. [junit4] <<< JVM J1: EOF [...truncated 301 lines...] [junit4] JVM J1: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-http2-Linux/lucene/build/test-framework/test/temp/junit4-J1-20181129_131359_41613776261556076641410.syserr [junit4] >>> JVM J1 emitted unexpected output (verbatim) [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release. [junit4] <<< JVM J1: EOF [...truncated 3 lines...] 
[junit4] JVM J0: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-http2-Linux/lucene/build/test-framework/test/temp/junit4-J0-20181129_131359_41614002896429947228186.syserr [junit4] >>> JVM J0 emitted unexpected output (verbatim) [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release. [junit4] <<< JVM J0: EOF [...truncated 3 lines...] [junit4] JVM J2: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-http2-Linux/lucene/build/test-framework/test/temp/junit4-J2-20181129_131359_41617433307471155403955.syserr [junit4] >>> JVM J2 emitted unexpected output (verbatim) [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release. [junit4] <<< JVM J2: EOF [...truncated 1080 lines...] [junit4] JVM J2: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-http2-Linux/lucene/build/analysis/common/test/temp/junit4-J2-20181129_131532_3224598302639341697122.syserr [junit4] >>> JVM J2 emitted unexpected output (verbatim) [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release. [junit4] <<< JVM J2: EOF [...truncated 3 lines...] [junit4] JVM J0: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-http2-Linux/lucene/build/analysis/common/test/temp/junit4-J0-20181129_131532_3082604892988842654917.syserr [junit4] >>> JVM J0 emitted unexpected output (verbatim) [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release. [junit4] <<< JVM J0: EOF [...truncated 3 lines...] 
[junit4] JVM J1: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-http2-Linux/lucene/build/analysis/common/test/temp/junit4-J1-20181129_131532_3213387831638698268232.syserr [junit4] >>> JVM J1 emitted unexpected output (verbatim) [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release. [junit4] <<< JVM J1: EOF [...truncated 258 lines...] [junit4] JVM J0: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-http2-Linux/lucene/build/analysis/icu/test/temp/junit4-J0-20181129_131712_0775053610704546729904.syserr [junit4] >>> JVM J0 emitted unexpected output (verbatim) [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release. [junit4] <<< JVM J0: EOF [junit4] JVM J2: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-http2-Linux/lucene/build/analysis/icu/test/temp/junit4-J2-20181129_131712_0788575736499620368012.syserr
[JENKINS] Lucene-Solr-repro - Build # 2043 - Unstable
Build: https://builds.apache.org/job/Lucene-Solr-repro/2043/ [...truncated 28 lines...] [repro] Jenkins log URL: https://builds.apache.org/job/Lucene-Solr-BadApples-NightlyTests-7.x/37/consoleText [repro] Revision: cf92418711dfe16862b66f2c14e176ac1697d3fc [repro] Ant options: -Dtests.multiplier=2 -Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-BadApples-NightlyTests-7.x/test-data/enwiki.random.lines.txt [repro] Repro line: ant test -Dtestcase=TestSimTriggerIntegration -Dtests.method=testEventQueue -Dtests.seed=BB5D3569FE6D602A -Dtests.multiplier=2 -Dtests.nightly=true -Dtests.slow=true -Dtests.badapples=true -Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-BadApples-NightlyTests-7.x/test-data/enwiki.random.lines.txt -Dtests.locale=fr-CA -Dtests.timezone=Antarctica/Palmer -Dtests.asserts=true -Dtests.file.encoding=ISO-8859-1 [repro] Repro line: ant test -Dtestcase=CloudSolrClientTest -Dtests.method=testRouting -Dtests.seed=76ECABC04EB9D225 -Dtests.multiplier=2 -Dtests.nightly=true -Dtests.slow=true -Dtests.badapples=true -Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-BadApples-NightlyTests-7.x/test-data/enwiki.random.lines.txt -Dtests.locale=sr-Latn-BA -Dtests.timezone=Pacific/Yap -Dtests.asserts=true -Dtests.file.encoding=ISO-8859-1 [repro] git rev-parse --abbrev-ref HEAD [repro] git rev-parse HEAD [repro] Initial local git branch/revision: 81c092d8262a68dfda3994e790f2e1f3fdf275e2 [repro] git fetch [repro] git checkout cf92418711dfe16862b66f2c14e176ac1697d3fc [...truncated 2 lines...] [repro] git merge --ff-only [...truncated 1 lines...] [repro] ant clean [...truncated 6 lines...] [repro] Test suites by module: [repro]solr/core [repro] TestSimTriggerIntegration [repro]solr/solrj [repro] CloudSolrClientTest [repro] ant compile-test [...truncated 3587 lines...] 
[repro] ant test-nocompile -Dtests.dups=5 -Dtests.maxfailures=5 -Dtests.class="*.TestSimTriggerIntegration" -Dtests.showOutput=onerror -Dtests.multiplier=2 -Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-BadApples-NightlyTests-7.x/test-data/enwiki.random.lines.txt -Dtests.seed=BB5D3569FE6D602A -Dtests.multiplier=2 -Dtests.nightly=true -Dtests.slow=true -Dtests.badapples=true -Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-BadApples-NightlyTests-7.x/test-data/enwiki.random.lines.txt -Dtests.locale=fr-CA -Dtests.timezone=Antarctica/Palmer -Dtests.asserts=true -Dtests.file.encoding=ISO-8859-1 [...truncated 3681 lines...] [repro] Setting last failure code to 256 [repro] ant compile-test [...truncated 454 lines...] [repro] ant test-nocompile -Dtests.dups=5 -Dtests.maxfailures=5 -Dtests.class="*.CloudSolrClientTest" -Dtests.showOutput=onerror -Dtests.multiplier=2 -Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-BadApples-NightlyTests-7.x/test-data/enwiki.random.lines.txt -Dtests.seed=76ECABC04EB9D225 -Dtests.multiplier=2 -Dtests.nightly=true -Dtests.slow=true -Dtests.badapples=true -Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-BadApples-NightlyTests-7.x/test-data/enwiki.random.lines.txt -Dtests.locale=sr-Latn-BA -Dtests.timezone=Pacific/Yap -Dtests.asserts=true -Dtests.file.encoding=ISO-8859-1 [...truncated 143 lines...] [repro] Failures: [repro] 0/5 failed: org.apache.solr.client.solrj.impl.CloudSolrClientTest [repro] 1/5 failed: org.apache.solr.cloud.autoscaling.sim.TestSimTriggerIntegration [repro] git checkout 81c092d8262a68dfda3994e790f2e1f3fdf275e2 [...truncated 2 lines...] [repro] Exiting with code 256 [...truncated 6 lines...] - To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org For additional commands, e-mail: dev-h...@lucene.apache.org
[JENKINS] Lucene-Solr-NightlyTests-master - Build # 1709 - Failure
Build: https://builds.apache.org/job/Lucene-Solr-NightlyTests-master/1709/ 2 tests failed. FAILED: org.apache.lucene.document.TestLatLonLineShapeQueries.testRandomBig Error Message: Java heap space Stack Trace: java.lang.OutOfMemoryError: Java heap space at __randomizedtesting.SeedInfo.seed([EA127355C5DCC27D:6D450EDA5485BEFD]:0) at org.apache.lucene.store.RAMFile.newBuffer(RAMFile.java:84) at org.apache.lucene.store.RAMFile.addBuffer(RAMFile.java:57) at org.apache.lucene.store.RAMOutputStream.switchCurrentBuffer(RAMOutputStream.java:168) at org.apache.lucene.store.RAMOutputStream.writeBytes(RAMOutputStream.java:154) at org.apache.lucene.store.MockIndexOutputWrapper.writeBytes(MockIndexOutputWrapper.java:141) at org.apache.lucene.util.bkd.OfflinePointWriter.append(OfflinePointWriter.java:75) at org.apache.lucene.util.bkd.BKDWriter.add(BKDWriter.java:284) at org.apache.lucene.codecs.lucene60.Lucene60PointsWriter$1.visit(Lucene60PointsWriter.java:120) at org.apache.lucene.codecs.PointsWriter$1$1$1.visit(PointsWriter.java:117) at org.apache.lucene.index.AssertingLeafReader$AssertingIntersectVisitor.visit(AssertingLeafReader.java:1162) at org.apache.lucene.util.bkd.BKDReader.visitCompressedDocValues(BKDReader.java:522) at org.apache.lucene.util.bkd.BKDReader.visitDocValues(BKDReader.java:485) at org.apache.lucene.util.bkd.BKDReader.intersect(BKDReader.java:577) at org.apache.lucene.util.bkd.BKDReader.intersect(BKDReader.java:610) at org.apache.lucene.util.bkd.BKDReader.intersect(BKDReader.java:600) at org.apache.lucene.util.bkd.BKDReader.intersect(BKDReader.java:600) at org.apache.lucene.util.bkd.BKDReader.intersect(BKDReader.java:610) at org.apache.lucene.util.bkd.BKDReader.intersect(BKDReader.java:600) at org.apache.lucene.util.bkd.BKDReader.intersect(BKDReader.java:610) at org.apache.lucene.util.bkd.BKDReader.intersect(BKDReader.java:361) at org.apache.lucene.index.AssertingLeafReader$AssertingPointValues.intersect(AssertingLeafReader.java:1051) at 
org.apache.lucene.codecs.PointsWriter$1$1.intersect(PointsWriter.java:105) at org.apache.lucene.codecs.lucene60.Lucene60PointsWriter.writeField(Lucene60PointsWriter.java:113) at org.apache.lucene.codecs.PointsWriter.mergeOneField(PointsWriter.java:62) at org.apache.lucene.codecs.PointsWriter.merge(PointsWriter.java:191) at org.apache.lucene.codecs.lucene60.Lucene60PointsWriter.merge(Lucene60PointsWriter.java:145) at org.apache.lucene.codecs.asserting.AssertingPointsFormat$AssertingPointsWriter.merge(AssertingPointsFormat.java:142) at org.apache.lucene.index.SegmentMerger.mergePoints(SegmentMerger.java:201) at org.apache.lucene.index.SegmentMerger.merge(SegmentMerger.java:161) at org.apache.lucene.index.IndexWriter.mergeMiddle(IndexWriter.java:4483) at org.apache.lucene.index.IndexWriter.merge(IndexWriter.java:4078) at org.apache.lucene.index.SerialMergeScheduler.merge(SerialMergeScheduler.java:40) FAILED: org.apache.solr.client.solrj.io.stream.MathExpressionTest.testGammaDistribution Error Message: Stack Trace: java.lang.AssertionError at __randomizedtesting.SeedInfo.seed([4D135FCF35EACE53:7069746116926444]:0) at org.junit.Assert.fail(Assert.java:92) at org.junit.Assert.assertTrue(Assert.java:43) at org.junit.Assert.assertTrue(Assert.java:54) at org.apache.solr.client.solrj.io.stream.MathExpressionTest.testGammaDistribution(MathExpressionTest.java:4372) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1742) at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:935) at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:971) at 
com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:985) at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57) at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45) at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48) at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64) at
[JENKINS] Lucene-Solr-repro - Build # 2040 - Still Unstable
Build: https://builds.apache.org/job/Lucene-Solr-repro/2040/ [...truncated 28 lines...] [repro] Jenkins log URL: https://builds.apache.org/job/Lucene-Solr-NightlyTests-master/1708/consoleText [repro] Revision: 2715beb6df77d7e5795b9c111a37178527cf3831 [repro] Ant options: -Dtests.multiplier=2 -Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-master/test-data/enwiki.random.lines.txt [repro] Repro line: ant test -Dtestcase=SearchHandlerTest -Dtests.method=testRequireZkConnectedDistrib -Dtests.seed=813A3D1CE93389BC -Dtests.multiplier=2 -Dtests.nightly=true -Dtests.slow=true -Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-master/test-data/enwiki.random.lines.txt -Dtests.locale=fr-CA -Dtests.timezone=GB-Eire -Dtests.asserts=true -Dtests.file.encoding=US-ASCII [repro] Repro line: ant test -Dtestcase=TestRebalanceLeaders -Dtests.method=test -Dtests.seed=813A3D1CE93389BC -Dtests.multiplier=2 -Dtests.nightly=true -Dtests.slow=true -Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-master/test-data/enwiki.random.lines.txt -Dtests.locale=en-AU -Dtests.timezone=Etc/GMT+12 -Dtests.asserts=true -Dtests.file.encoding=US-ASCII [repro] Repro line: ant test -Dtestcase=IndexSizeTriggerTest -Dtests.method=testSplitIntegration -Dtests.seed=813A3D1CE93389BC -Dtests.multiplier=2 -Dtests.nightly=true -Dtests.slow=true -Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-master/test-data/enwiki.random.lines.txt -Dtests.locale=pl -Dtests.timezone=Asia/Manila -Dtests.asserts=true -Dtests.file.encoding=US-ASCII [repro] Repro line: ant test -Dtestcase=RestartWhileUpdatingTest -Dtests.method=test -Dtests.seed=813A3D1CE93389BC -Dtests.multiplier=2 -Dtests.nightly=true -Dtests.slow=true -Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-master/test-data/enwiki.random.lines.txt -Dtests.locale=sl-SI -Dtests.timezone=Europe/Lisbon 
-Dtests.asserts=true -Dtests.file.encoding=US-ASCII [repro] Repro line: ant test -Dtestcase=RestartWhileUpdatingTest -Dtests.seed=813A3D1CE93389BC -Dtests.multiplier=2 -Dtests.nightly=true -Dtests.slow=true -Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-master/test-data/enwiki.random.lines.txt -Dtests.locale=sl-SI -Dtests.timezone=Europe/Lisbon -Dtests.asserts=true -Dtests.file.encoding=US-ASCII [repro] git rev-parse --abbrev-ref HEAD [repro] git rev-parse HEAD [repro] Initial local git branch/revision: 81c092d8262a68dfda3994e790f2e1f3fdf275e2 [repro] git fetch [repro] git checkout 2715beb6df77d7e5795b9c111a37178527cf3831 [...truncated 2 lines...] [repro] git merge --ff-only [...truncated 1 lines...] [repro] ant clean [...truncated 6 lines...] [repro] Test suites by module: [repro]solr/core [repro] SearchHandlerTest [repro] TestRebalanceLeaders [repro] IndexSizeTriggerTest [repro] RestartWhileUpdatingTest [repro] ant compile-test [...truncated 3573 lines...] [repro] ant test-nocompile -Dtests.dups=5 -Dtests.maxfailures=20 -Dtests.class="*.SearchHandlerTest|*.TestRebalanceLeaders|*.IndexSizeTriggerTest|*.RestartWhileUpdatingTest" -Dtests.showOutput=onerror -Dtests.multiplier=2 -Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-master/test-data/enwiki.random.lines.txt -Dtests.seed=813A3D1CE93389BC -Dtests.multiplier=2 -Dtests.nightly=true -Dtests.slow=true -Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-master/test-data/enwiki.random.lines.txt -Dtests.locale=fr-CA -Dtests.timezone=GB-Eire -Dtests.asserts=true -Dtests.file.encoding=US-ASCII [...truncated 118918 lines...] 
[repro] Setting last failure code to 256 [repro] Failures: [repro] 0/5 failed: org.apache.solr.cloud.TestRebalanceLeaders [repro] 0/5 failed: org.apache.solr.cloud.autoscaling.IndexSizeTriggerTest [repro] 0/5 failed: org.apache.solr.handler.SearchHandlerTest [repro] 0/5 failed: org.apache.solr.handler.component.SearchHandlerTest [repro] 3/5 failed: org.apache.solr.cloud.RestartWhileUpdatingTest [repro] git checkout 81c092d8262a68dfda3994e790f2e1f3fdf275e2 [...truncated 2 lines...] [repro] Exiting with code 256 [...truncated 5 lines...]
[JENKINS] Lucene-Solr-NightlyTests-7.x - Build # 390 - Still Unstable
Build: https://builds.apache.org/job/Lucene-Solr-NightlyTests-7.x/390/ 8 tests failed. FAILED: org.apache.solr.cloud.LIROnShardRestartTest.testAllReplicasInLIR Error Message: Error from server at https://127.0.0.1:41924/solr: KeeperErrorCode = Session expired for /overseer/collection-queue-work/qnr- Stack Trace: org.apache.solr.client.solrj.impl.HttpSolrClient$RemoteSolrException: Error from server at https://127.0.0.1:41924/solr: KeeperErrorCode = Session expired for /overseer/collection-queue-work/qnr- at __randomizedtesting.SeedInfo.seed([538E2CEB11D6E6D3:916162D6F568134]:0) at org.apache.solr.client.solrj.impl.HttpSolrClient.executeMethod(HttpSolrClient.java:643) at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:255) at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:244) at org.apache.solr.client.solrj.impl.LBHttpSolrClient.doRequest(LBHttpSolrClient.java:483) at org.apache.solr.client.solrj.impl.LBHttpSolrClient.request(LBHttpSolrClient.java:413) at org.apache.solr.client.solrj.impl.CloudSolrClient.sendRequest(CloudSolrClient.java:1107) at org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:884) at org.apache.solr.client.solrj.impl.CloudSolrClient.request(CloudSolrClient.java:817) at org.apache.solr.client.solrj.SolrRequest.process(SolrRequest.java:194) at org.apache.solr.client.solrj.SolrRequest.process(SolrRequest.java:211) at org.apache.solr.cloud.LIROnShardRestartTest.testAllReplicasInLIR(LIROnShardRestartTest.java:175) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1742) at 
com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:935) at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:971) at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:985) at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57) at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45) at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48) at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64) at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368) at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817) at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468) at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:944) at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:830) at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:880) at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:891) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45) at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at
[JENKINS] Lucene-Solr-master-Linux (64bit/jdk-10.0.1) - Build # 23275 - Still Unstable!
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-master-Linux/23275/ Java: 64bit/jdk-10.0.1 -XX:-UseCompressedOops -XX:+UseSerialGC 1 tests failed. FAILED: org.apache.solr.client.solrj.impl.CloudSolrClientTest.testVersionsAreReturned Error Message: Error from server at https://127.0.0.1:34301/solr/collection1_shard2_replica_n2: Expected mime type application/octet-stream but got text/html.Error 404 Can not find: /solr/collection1_shard2_replica_n2/update HTTP ERROR 404 Problem accessing /solr/collection1_shard2_replica_n2/update. Reason: Can not find: /solr/collection1_shard2_replica_n2/updatehttp://eclipse.org/jetty;>Powered by Jetty:// 9.4.11.v20180605 Stack Trace: org.apache.solr.client.solrj.impl.CloudSolrClient$RouteException: Error from server at https://127.0.0.1:34301/solr/collection1_shard2_replica_n2: Expected mime type application/octet-stream but got text/html. Error 404 Can not find: /solr/collection1_shard2_replica_n2/update HTTP ERROR 404 Problem accessing /solr/collection1_shard2_replica_n2/update. 
Reason: Can not find: /solr/collection1_shard2_replica_n2/updatehttp://eclipse.org/jetty;>Powered by Jetty:// 9.4.11.v20180605 at __randomizedtesting.SeedInfo.seed([20110CD7CBB09B49:D8D7F5E258A90381]:0) at org.apache.solr.client.solrj.impl.CloudSolrClient.directUpdate(CloudSolrClient.java:551) at org.apache.solr.client.solrj.impl.CloudSolrClient.sendRequest(CloudSolrClient.java:1016) at org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:884) at org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:949) at org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:949) at org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:949) at org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:949) at org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:949) at org.apache.solr.client.solrj.impl.CloudSolrClient.request(CloudSolrClient.java:817) at org.apache.solr.client.solrj.SolrRequest.process(SolrRequest.java:194) at org.apache.solr.client.solrj.request.UpdateRequest.commit(UpdateRequest.java:237) at org.apache.solr.client.solrj.impl.CloudSolrClientTest.testVersionsAreReturned(CloudSolrClientTest.java:725) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:564) at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1742) at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:935) at 
com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:971) at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:985) at org.junit.rules.ExpectedException$ExpectedExceptionStatement.evaluate(ExpectedException.java:110) at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57) at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45) at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48) at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64) at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368) at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817) at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468) at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:944) at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:830) at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:880) at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:891)
[jira] [Updated] (SOLR-12209) add Paging Streaming Expression
[ https://issues.apache.org/jira/browse/SOLR-12209?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] mosh updated SOLR-12209: Attachment: SOLR-12209.patch > add Paging Streaming Expression > --- > > Key: SOLR-12209 > URL: https://issues.apache.org/jira/browse/SOLR-12209 > Project: Solr > Issue Type: New Feature > Security Level: Public(Default Security Level. Issues are Public) > Components: streaming expressions >Reporter: mosh >Priority: Major > Attachments: SOLR-12209.patch, SOLR-12209.patch, SOLR-12209.patch, > SOLR-12209.patch > > > Currently the closest streaming expression that allows some sort of > pagination is top. > I propose we add a new streaming expression, which is based on the > RankedStream class to add offset to the stream. currently it can only be done > in code by reading the stream until the desired offset is reached. > The new expression will be used as such: > {{paging(rows=3, search(collection1, q="*:*", qt="/export", > fl="id,a_s,a_i,a_f", sort="a_f desc, a_i desc"), sort="a_f asc, a_i asc", > start=100)}} > {{this will offset the returned stream by 100 documents}} > > [~joel.bernstein] what to you think? -- This message was sent by Atlassian JIRA (v7.6.3#76005)
[jira] [Commented] (SOLR-12209) add Paging Streaming Expression
[ https://issues.apache.org/jira/browse/SOLR-12209?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16703115#comment-16703115 ] mosh commented on SOLR-12209: - Uploaded a new patch file for the latest master. > add Paging Streaming Expression > --- > > Key: SOLR-12209 > URL: https://issues.apache.org/jira/browse/SOLR-12209 > Project: Solr > Issue Type: New Feature > Security Level: Public(Default Security Level. Issues are Public) > Components: streaming expressions >Reporter: mosh >Priority: Major > Attachments: SOLR-12209.patch, SOLR-12209.patch, SOLR-12209.patch, > SOLR-12209.patch > > > Currently the closest streaming expression that allows some sort of > pagination is top. > I propose we add a new streaming expression, which is based on the > RankedStream class to add offset to the stream. currently it can only be done > in code by reading the stream until the desired offset is reached. > The new expression will be used as such: > {{paging(rows=3, search(collection1, q="*:*", qt="/export", > fl="id,a_s,a_i,a_f", sort="a_f desc, a_i desc"), sort="a_f asc, a_i asc", > start=100)}} > {{this will offset the returned stream by 100 documents}} > > [~joel.bernstein] what to you think?
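[The offset semantics proposed for the paging expression — read and discard tuples from the underlying sorted stream until the offset is reached, then emit up to rows tuples — can be sketched in a few lines of plain Python. This is an illustration of the proposed behaviour only; the paging/rows/start names mirror the proposed expression and are not an existing SolrJ API.]

```python
def paging(stream, rows, start):
    """Skip `start` tuples from an already-sorted stream, then return
    the next `rows` tuples (fewer if the stream runs out)."""
    it = iter(stream)
    for _ in range(start):
        # Discard tuples up to the requested offset; this mirrors the
        # "read the stream until the desired offset is reached" approach
        # the issue says currently has to be done in client code.
        if next(it, None) is None:
            break
    return [t for _, t in zip(range(rows), it)]

# e.g. paging over 200 sorted tuples with rows=3, start=100
# yields the 101st through 103rd tuples.
print(paging(range(200), rows=3, start=100))
```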
[JENKINS] Lucene-Solr-7.x-MacOSX (64bit/jdk-9) - Build # 960 - Still Unstable!
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-7.x-MacOSX/960/ Java: 64bit/jdk-9 -XX:+UseCompressedOops -XX:+UseConcMarkSweepGC 1 tests failed. FAILED: org.apache.solr.cloud.RecoveryAfterSoftCommitTest.test Error Message: Didn't see all replicas for shard shard1 in collection1 come up within 3 ms! ClusterState: { "control_collection":{ "pullReplicas":"0", "replicationFactor":"1", "shards":{"shard1":{ "range":"8000-7fff", "state":"active", "replicas":{"core_node2":{ "core":"control_collection_shard1_replica_n1", "base_url":"http://127.0.0.1:56962/jv;, "node_name":"127.0.0.1:56962_jv", "state":"active", "type":"NRT", "leader":"true", "router":{"name":"compositeId"}, "maxShardsPerNode":"1", "autoAddReplicas":"false", "nrtReplicas":"1", "tlogReplicas":"0"}, "collection1":{ "pullReplicas":"0", "replicationFactor":"1", "shards":{"shard1":{ "range":"8000-7fff", "state":"active", "replicas":{ "core_node22":{ "core":"collection1_shard1_replica_n21", "base_url":"http://127.0.0.1:56978/jv;, "node_name":"127.0.0.1:56978_jv", "state":"active", "type":"NRT", "leader":"true"}, "core_node24":{ "core":"collection1_shard1_replica_n23", "base_url":"http://127.0.0.1:56988/jv;, "node_name":"127.0.0.1:56988_jv", "state":"recovering", "type":"NRT", "router":{"name":"compositeId"}, "maxShardsPerNode":"1", "autoAddReplicas":"false", "nrtReplicas":"1", "tlogReplicas":"0"}} Stack Trace: java.lang.AssertionError: Didn't see all replicas for shard shard1 in collection1 come up within 3 ms! 
ClusterState: { "control_collection":{ "pullReplicas":"0", "replicationFactor":"1", "shards":{"shard1":{ "range":"8000-7fff", "state":"active", "replicas":{"core_node2":{ "core":"control_collection_shard1_replica_n1", "base_url":"http://127.0.0.1:56962/jv;, "node_name":"127.0.0.1:56962_jv", "state":"active", "type":"NRT", "leader":"true", "router":{"name":"compositeId"}, "maxShardsPerNode":"1", "autoAddReplicas":"false", "nrtReplicas":"1", "tlogReplicas":"0"}, "collection1":{ "pullReplicas":"0", "replicationFactor":"1", "shards":{"shard1":{ "range":"8000-7fff", "state":"active", "replicas":{ "core_node22":{ "core":"collection1_shard1_replica_n21", "base_url":"http://127.0.0.1:56978/jv;, "node_name":"127.0.0.1:56978_jv", "state":"active", "type":"NRT", "leader":"true"}, "core_node24":{ "core":"collection1_shard1_replica_n23", "base_url":"http://127.0.0.1:56988/jv;, "node_name":"127.0.0.1:56988_jv", "state":"recovering", "type":"NRT", "router":{"name":"compositeId"}, "maxShardsPerNode":"1", "autoAddReplicas":"false", "nrtReplicas":"1", "tlogReplicas":"0"}} at __randomizedtesting.SeedInfo.seed([87E746602D8B6F98:FB379BA83770260]:0) at org.junit.Assert.fail(Assert.java:93) at org.apache.solr.cloud.AbstractFullDistribZkTestBase.ensureAllReplicasAreActive(AbstractFullDistribZkTestBase.java:2004) at org.apache.solr.cloud.RecoveryAfterSoftCommitTest.test(RecoveryAfterSoftCommitTest.java:119) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:564) at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1742) at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:935) at 
com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:971) at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:985) at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsFixedStatement.callStatement(BaseDistributedSearchTestCase.java:1010) at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsStatement.evaluate(BaseDistributedSearchTestCase.java:985) at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57) at
[jira] [Commented] (LUCENE-8563) Remove k1+1 from the numerator of BM25Similarity
[ https://issues.apache.org/jira/browse/LUCENE-8563?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16703055#comment-16703055 ] Robert Muir commented on LUCENE-8563: - Please deprecate the crazy legacy one too, so it can be eventually removed. > Remove k1+1 from the numerator of BM25Similarity > - > > Key: LUCENE-8563 > URL: https://issues.apache.org/jira/browse/LUCENE-8563 > Project: Lucene - Core > Issue Type: Improvement >Reporter: Adrien Grand >Priority: Minor > Time Spent: 40m > Remaining Estimate: 0h > > Our current implementation of BM25 does > {code:java} > boost * IDF * (k1+1) * tf / (tf + norm) > {code} > As (k1+1) is a constant, it is the same for every term and doesn't modify > ordering. It is often omitted and I found out that the "The Probabilistic > Relevance Framework: BM25 and Beyond" paper by Robertson (BM25's author) and > Zaragova even describes adding (k1+1) to the numerator as a variant whose > benefit is to be more comparable with Robertson/Sparck-Jones weighting, which > we don't care about. > {quote}A common variant is to add a (k1 + 1) component to the > numerator of the saturation function. This is the same for all > terms, and therefore does not affect the ranking produced. > The reason for including it was to make the final formula > more compatible with the RSJ weight used on its own > {quote} > Should we remove it from BM25Similarity as well? > A side-effect that I'm interested in is that integrating other score > contributions (eg. via oal.document.FeatureField) would be a bit easier to > reason about. For instance a weight of 3 in FeatureField#newSaturationQuery > would have a similar impact as a term whose IDF is 3 (and thus docFreq ~= 5%) > rather than a term whose IDF is 3/(k1 + 1).
[jira] [Commented] (LUCENE-8563) Remove k1+1 from the numerator of BM25Similarity
[ https://issues.apache.org/jira/browse/LUCENE-8563?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16703007#comment-16703007 ] Adrien Grand commented on LUCENE-8563: -- My gut feeling is that this change is going to be unnoticed by the vast majority of users as ordering is preserved. So I'd rather not require changes on their end to use this simpler implementation of BM25 and just document the change in runtime behavior in the release notes. I'm happy with keeping Solr on the current scoring formula and opening a follow-up issue to discuss how to handle the migration. [~lucacavanna] Based on Jan's comments, then let's configure Solr's BM25SimilarityFactory and SchemaSimilarityFactory to use the LegacyBM25Similarity that you added?
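The aside in the issue description — that an IDF of 3 corresponds to a docFreq of roughly 5% — can be checked against BM25's idf formula, log(1 + (N - df + 0.5) / (df + 0.5)). This standalone Python check assumes that formula (the form Lucene's BM25Similarity documents) and is not Lucene code:

```python
import math

def bm25_idf(doc_freq, doc_count):
    # BM25's idf: log(1 + (N - df + 0.5) / (df + 0.5))
    return math.log(1 + (doc_count - doc_freq + 0.5) / (doc_freq + 0.5))

N = 1_000_000
df = N // 20                 # a term matching 5% of the documents
idf = bm25_idf(df, N)
assert abs(idf - 3.0) < 0.01  # idf ~= 3, matching the "docFreq ~= 5%" aside
```

With (k1+1) removed, a FeatureField weight of 3 therefore behaves like such a ~5%-docFreq term, rather than like a term whose idf is 3/(k1+1).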
[JENKINS] Lucene-Solr-7.x-Linux (64bit/jdk-9.0.4) - Build # 3156 - Still Unstable!
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-7.x-Linux/3156/ Java: 64bit/jdk-9.0.4 -XX:-UseCompressedOops -XX:+UseConcMarkSweepGC 4 tests failed. FAILED: org.apache.solr.cloud.TestTlogReplica.testRecovery Error Message: Can not find doc 7 in https://127.0.0.1:46019/solr Stack Trace: java.lang.AssertionError: Can not find doc 7 in https://127.0.0.1:46019/solr at __randomizedtesting.SeedInfo.seed([BB122DF49DB043C6:7AE25458B0E08961]:0) at org.junit.Assert.fail(Assert.java:93) at org.junit.Assert.assertTrue(Assert.java:43) at org.junit.Assert.assertNotNull(Assert.java:526) at org.apache.solr.cloud.TestTlogReplica.checkRTG(TestTlogReplica.java:902) at org.apache.solr.cloud.TestTlogReplica.testRecovery(TestTlogReplica.java:567) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:564) at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1742) at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:935) at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:971) at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:985) at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57) at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45) at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48) at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64) at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368) at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817) at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468) at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:944) at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:830) at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:880) at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:891) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53) at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47) at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64) at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368) at java.base/java.lang.Thread.run(Thread.java:844) FAILED: org.apache.solr.cloud.TestTlogReplica.testRecovery Error Message: Can not find doc 7 in https://127.0.0.1:39111/solr Stack Trace:
[jira] [Commented] (LUCENE-8563) Remove k1+1 from the numerator of BM25Similarity
[ https://issues.apache.org/jira/browse/LUCENE-8563?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16702966#comment-16702966 ] Jan Høydahl commented on LUCENE-8563: - I think it would be a far better approach to create a new Similarity with a distinct name (NewBM25Similarity, CleanBM25Similarity, SimplifiedBM25Similarity or similar) for this, so Lucene users can explicitly make an informed choice, instead of changing the implementation of the existing class. Then this issue would not need to touch any Solr code whatsoever. If for some reason that is not possible, I think this is a classic example of a use case for luceneMatchVersion conditional for Solr. If so, please create a new 8.0 *blocker* SOLR Jira issue about completing the Solr side of things.
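If Solr keeps the old scores for existing schemas as discussed here, the per-field wiring might look like the fragment below. This is a sketch only: the fully-qualified class name and its placement in the lucene-misc jar are assumed from the PR discussion, not taken from a released Solr config.

```xml
<!-- Hypothetical schema.xml fragment: pin a field type to the legacy
     (k1+1)-in-the-numerator BM25 scoring instead of the new default. -->
<fieldType name="text_general" class="solr.TextField" positionIncrementGap="100">
  <analyzer>
    <tokenizer class="solr.StandardTokenizerFactory"/>
  </analyzer>
  <similarity class="org.apache.lucene.misc.search.similarity.LegacyBM25Similarity"/>
</fieldType>
```

Alternatively, per Adrien's and Hoss's SOLR-8261 precedent, the similarity factory could select the legacy class automatically whenever luceneMatchVersion is below 8.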
[JENKINS] Lucene-Solr-7.6-Linux (64bit/jdk-10.0.1) - Build # 40 - Still Unstable!
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-7.6-Linux/40/ Java: 64bit/jdk-10.0.1 -XX:-UseCompressedOops -XX:+UseG1GC 2 tests failed. FAILED: org.apache.solr.client.solrj.impl.CloudSolrClientTest.testRouting Error Message: Error from server at https://127.0.0.1:42357/solr/collection1_shard2_replica_n3: Expected mime type application/octet-stream but got text/html.Error 404 Can not find: /solr/collection1_shard2_replica_n3/update HTTP ERROR 404 Problem accessing /solr/collection1_shard2_replica_n3/update. Reason: Can not find: /solr/collection1_shard2_replica_n3/updatehttp://eclipse.org/jetty;>Powered by Jetty:// 9.4.11.v20180605 Stack Trace: org.apache.solr.client.solrj.impl.CloudSolrClient$RouteException: Error from server at https://127.0.0.1:42357/solr/collection1_shard2_replica_n3: Expected mime type application/octet-stream but got text/html. Error 404 Can not find: /solr/collection1_shard2_replica_n3/update HTTP ERROR 404 Problem accessing /solr/collection1_shard2_replica_n3/update. 
Reason: Can not find: /solr/collection1_shard2_replica_n3/updatehttp://eclipse.org/jetty;>Powered by Jetty:// 9.4.11.v20180605 at __randomizedtesting.SeedInfo.seed([6DAD47DC1050AD43:AF1A7BB413105D3B]:0) at org.apache.solr.client.solrj.impl.CloudSolrClient.directUpdate(CloudSolrClient.java:551) at org.apache.solr.client.solrj.impl.CloudSolrClient.sendRequest(CloudSolrClient.java:1016) at org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:884) at org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:949) at org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:949) at org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:949) at org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:949) at org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:949) at org.apache.solr.client.solrj.impl.CloudSolrClient.request(CloudSolrClient.java:817) at org.apache.solr.client.solrj.SolrRequest.process(SolrRequest.java:194) at org.apache.solr.client.solrj.request.UpdateRequest.commit(UpdateRequest.java:237) at org.apache.solr.client.solrj.impl.CloudSolrClientTest.testRouting(CloudSolrClientTest.java:269) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:564) at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1742) at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:935) at 
com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:971) at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:985) at org.junit.rules.ExpectedException$ExpectedExceptionStatement.evaluate(ExpectedException.java:110) at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57) at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45) at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48) at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64) at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368) at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817) at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468) at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:944) at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:830) at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:880) at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:891) at
[JENKINS] Lucene-Solr-repro - Build # 2039 - Unstable
Build: https://builds.apache.org/job/Lucene-Solr-repro/2039/ [...truncated 29 lines...] [repro] Jenkins log URL: https://builds.apache.org/job/Lucene-Solr-NightlyTests-7.6/13/consoleText [repro] Revision: ad753020aee6b63b155fe439af6f562afe88d5c1 [repro] Ant options: -Dtests.multiplier=2 -Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-7.6/test-data/enwiki.random.lines.txt [repro] Repro line: ant test -Dtestcase=ShardSplitTest -Dtests.method=testSplitShardWithRuleLink -Dtests.seed=8E6503E321DE9C76 -Dtests.multiplier=2 -Dtests.nightly=true -Dtests.slow=true -Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-7.6/test-data/enwiki.random.lines.txt -Dtests.locale=uk -Dtests.timezone=Zulu -Dtests.asserts=true -Dtests.file.encoding=UTF-8 [repro] Repro line: ant test -Dtestcase=TestManagedResourceStorage -Dtests.seed=8E6503E321DE9C76 -Dtests.multiplier=2 -Dtests.nightly=true -Dtests.slow=true -Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-7.6/test-data/enwiki.random.lines.txt -Dtests.locale=zh-TW -Dtests.timezone=Europe/Mariehamn -Dtests.asserts=true -Dtests.file.encoding=UTF-8 [repro] Repro line: ant test -Dtestcase=CloudSolrClientTest -Dtests.method=testVersionsAreReturned -Dtests.seed=5F76BEE535F45FA3 -Dtests.multiplier=2 -Dtests.nightly=true -Dtests.slow=true -Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-7.6/test-data/enwiki.random.lines.txt -Dtests.locale=fi-FI -Dtests.timezone=Antarctica/Mawson -Dtests.asserts=true -Dtests.file.encoding=UTF-8 [repro] git rev-parse --abbrev-ref HEAD [repro] git rev-parse HEAD [repro] Initial local git branch/revision: 81c092d8262a68dfda3994e790f2e1f3fdf275e2 [repro] git fetch [repro] git checkout ad753020aee6b63b155fe439af6f562afe88d5c1 [...truncated 2 lines...] [repro] git merge --ff-only [...truncated 1 lines...] [repro] ant clean [...truncated 6 lines...] 
[repro] Test suites by module: [repro]solr/solrj [repro] CloudSolrClientTest [repro]solr/core [repro] ShardSplitTest [repro] TestManagedResourceStorage [repro] ant compile-test [...truncated 2716 lines...] [repro] ant test-nocompile -Dtests.dups=5 -Dtests.maxfailures=5 -Dtests.class="*.CloudSolrClientTest" -Dtests.showOutput=onerror -Dtests.multiplier=2 -Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-7.6/test-data/enwiki.random.lines.txt -Dtests.seed=5F76BEE535F45FA3 -Dtests.multiplier=2 -Dtests.nightly=true -Dtests.slow=true -Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-7.6/test-data/enwiki.random.lines.txt -Dtests.locale=fi-FI -Dtests.timezone=Antarctica/Mawson -Dtests.asserts=true -Dtests.file.encoding=UTF-8 [...truncated 1045 lines...] [repro] Setting last failure code to 256 [repro] ant compile-test [...truncated 1352 lines...] [repro] ant test-nocompile -Dtests.dups=5 -Dtests.maxfailures=10 -Dtests.class="*.ShardSplitTest|*.TestManagedResourceStorage" -Dtests.showOutput=onerror -Dtests.multiplier=2 -Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-7.6/test-data/enwiki.random.lines.txt -Dtests.seed=8E6503E321DE9C76 -Dtests.multiplier=2 -Dtests.nightly=true -Dtests.slow=true -Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-7.6/test-data/enwiki.random.lines.txt -Dtests.locale=uk -Dtests.timezone=Zulu -Dtests.asserts=true -Dtests.file.encoding=UTF-8 [...truncated 165 lines...] [repro] Failures: [repro] 0/5 failed: org.apache.solr.cloud.api.collections.ShardSplitTest [repro] 0/5 failed: org.apache.solr.rest.TestManagedResourceStorage [repro] 2/5 failed: org.apache.solr.client.solrj.impl.CloudSolrClientTest [repro] git checkout 81c092d8262a68dfda3994e790f2e1f3fdf275e2 [...truncated 2 lines...] [repro] Exiting with code 256 [...truncated 5 lines...] 
[jira] [Commented] (SOLR-13000) log4j-slf4j-impl jar (inadvertently?) missing in solrj/lib
[ https://issues.apache.org/jira/browse/SOLR-13000?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16702871#comment-16702871 ] Daniel Collins commented on SOLR-13000: --- The Prometheus Exporter is a stand-alone tool so it requires a logging implementation, and since it is shipped with Solr, it seems reasonable for it to use the same logging implementation that Solr uses. So I guess the plan is to withdraw this (I'll confirm with [~cpoerschke] when she's back from vacation next week)? > log4j-slf4j-impl jar (inadvertently?) missing in solrj/lib > -- > > Key: SOLR-13000 > URL: https://issues.apache.org/jira/browse/SOLR-13000 > Project: Solr > Issue Type: Bug >Reporter: Christine Poerschke >Assignee: Christine Poerschke >Priority: Minor > Attachments: SOLR-13000.patch > > > My colleague [~ajhalani] noticed that declaring a {{solr-solrj}} dependency > alone when configuring {{-Dlog4j.configurationFile}} for a project is not > sufficient. > I looked into it further and it seems that the {{log4j-slf4j-impl}} jar is > (inadvertently?) missing in {{solrj/lib}} in the release?
[jira] [Resolved] (LUCENE-8578) Can I do a lot of analysis on one field at the time of indexing?
[ https://issues.apache.org/jira/browse/LUCENE-8578?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Adrien Grand resolved LUCENE-8578. -- Resolution: Invalid Please ask questions on the user list (https://lucene.apache.org/core/discussion.html#java-user-list-java-userluceneapacheorg) instead. If your question is about Solr (I see you mentioned copyField), then you would be more likely to get answers on the solr-user list (http://lucene.apache.org/solr/community.html#solr-user-list-solr-userluceneapacheorg). > Can I do a lot of analysis on one field at the time of indexing? > > > Key: LUCENE-8578 > URL: https://issues.apache.org/jira/browse/LUCENE-8578 > Project: Lucene - Core > Issue Type: Improvement >Reporter: YOO JEONGIN >Priority: Major > > Hello > I have a question about index schemas. > 1) Can I run multiple analyses on one field? > For example, can I analyze the 'title' field with multiple tokenizers and > merge the analyses into a single field? > 2) I can collect multiple fields in one field using the 'copyField' function. > However, the fields have different data attributes (eg, category fields, > text fields, etc.), and I would like to analyze each field differently. > Do you have these features in version 7.5? Is there any kind of shortcut to > do these similar functions? > Thank you for your advice.
[GitHub] lucene-solr pull request #511: LUCENE-8563: Remove k1+1 from the numerator o...
Github user javanna commented on a diff in the pull request: https://github.com/apache/lucene-solr/pull/511#discussion_r237392155 --- Diff: lucene/MIGRATE.txt --- @@ -150,3 +150,11 @@ in order to support ToParent/ToChildBlockJoinQuery. Normalization is now type-safe, with CharFilterFactory#normalize() returning a Reader and TokenFilterFactory#normalize() returning a TokenFilter. + +## k1+1 constant factor removed from BM25 similarity numerator --- End diff -- Sure! ---
[GitHub] lucene-solr pull request #511: LUCENE-8563: Remove k1+1 from the numerator o...
Github user javanna commented on a diff in the pull request: https://github.com/apache/lucene-solr/pull/511#discussion_r237392120 --- Diff: lucene/MIGRATE.txt --- @@ -150,3 +150,11 @@ in order to support ToParent/ToChildBlockJoinQuery. Normalization is now type-safe, with CharFilterFactory#normalize() returning a Reader and TokenFilterFactory#normalize() returning a TokenFilter. + +## k1+1 constant factor removed from BM25 similarity numerator + +Scores computed by the BM25 similarity are lower than previously as the k1+1 +constant factor was removed from the numerator of the scoring formula. +Ordering of results is preserved unless scores are computed from multiple +fields using different similarities. The previous behaviour is now exposed +through the LegacyBM25Similarity class. --- End diff -- yes I wasn't sure how to phrase that. Will add. ---
[jira] [Commented] (LUCENE-8563) Remove k1+1 from the numerator of BM25Similarity
[ https://issues.apache.org/jira/browse/LUCENE-8563?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16702856#comment-16702856 ] Adrien Grand commented on LUCENE-8563: -- Thanks [~lucacavanna] this looks good to me. [~softwaredoug] [~janhoy] Regarding Solr, would you rather always use this new BM25Similarity, or only if the luceneMatchVersion is greater than or equal to 8? Given that Luca is adding a way to get the old scores as well, it should be easy to pick the right one depending on the luceneMatchVersion like Hoss did in SOLR-8261.
[GitHub] lucene-solr pull request #511: LUCENE-8563: Remove k1+1 from the numerator o...
Github user jpountz commented on a diff in the pull request: https://github.com/apache/lucene-solr/pull/511#discussion_r237388164 --- Diff: lucene/MIGRATE.txt --- @@ -150,3 +150,11 @@ in order to support ToParent/ToChildBlockJoinQuery. Normalization is now type-safe, with CharFilterFactory#normalize() returning a Reader and TokenFilterFactory#normalize() returning a TokenFilter. + +## k1+1 constant factor removed from BM25 similarity numerator --- End diff -- can you append ` (LUCENE-8563) ##` at the end of the line? ---
[GitHub] lucene-solr pull request #511: LUCENE-8563: Remove k1+1 from the numerator o...
Github user jpountz commented on a diff in the pull request: https://github.com/apache/lucene-solr/pull/511#discussion_r237388295 --- Diff: lucene/MIGRATE.txt --- @@ -150,3 +150,11 @@ in order to support ToParent/ToChildBlockJoinQuery. Normalization is now type-safe, with CharFilterFactory#normalize() returning a Reader and TokenFilterFactory#normalize() returning a TokenFilter. + +## k1+1 constant factor removed from BM25 similarity numerator + +Scores computed by the BM25 similarity are lower than previously as the k1+1 +constant factor was removed from the numerator of the scoring formula. +Ordering of results is preserved unless scores are computed from multiple +fields using different similarities. The previous behaviour is now exposed +through the LegacyBM25Similarity class. --- End diff -- maybe add that it can be found in the lucene-misc jar? ---
Re: Jenkins test report improvements: Suspicious (Recent) Failure Rates
> from the runner, something would still need to be able to aggregate those > reports across multiple runs (ie: "ant test" being run in multiple dirs, That's correct. > Thinking about it purely from the point of view of the failure rate > reports i've been generating, it would be nice if *every* TestClass > invocation consistently included a single pseudo-method entry [...] The thing is this would probably break things elsewhere. Those XML reports, although not specified anywhere, are accepted by a range of tools, and my XML listener just tries to mimic closely what Apache Ant and Apache Maven emit in similar circumstances. I wonder what happens if you had a project with regular Maven/Ant test output and executions of multiple instances of the same class... I probably cross-checked it when I was trying to mimic their output, but I bet it wasn't anticipated as a possible scenario... Another possibility would be to try to correct those aggregations of individual XML reports in Jenkins so that the output is easier to post-process... Don't know, really. It is a mess. D.