[JENKINS] Lucene-Solr-repro-Java11 - Build # 90 - Unstable

2019-05-19 Thread Apache Jenkins Server
Build: https://builds.apache.org/job/Lucene-Solr-repro-Java11/90/

[...truncated 29 lines...]
[repro] Jenkins log URL: 
https://builds.apache.org/job/Lucene-Solr-BadApples-NightlyTests-master/62/consoleText

[repro] Revision: 18cb42ee80854e2159201fe550b13d894425a4f8

[repro] Ant options: -Dtests.multiplier=2 
-Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-BadApples-NightlyTests-master/test-data/enwiki.random.lines.txt
[repro] Repro line:  ant test  
-Dtestcase=TestDistributedStatsComponentCardinality -Dtests.method=test 
-Dtests.seed=299EC7FE081592CB -Dtests.multiplier=2 -Dtests.nightly=true 
-Dtests.slow=true -Dtests.badapples=true 
-Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-BadApples-NightlyTests-master/test-data/enwiki.random.lines.txt
 -Dtests.locale=pt-AO -Dtests.timezone=Africa/Porto-Novo -Dtests.asserts=true 
-Dtests.file.encoding=UTF-8

[repro] Repro line:  ant test  -Dtestcase=RollingRestartTest 
-Dtests.method=test -Dtests.seed=299EC7FE081592CB -Dtests.multiplier=2 
-Dtests.nightly=true -Dtests.slow=true -Dtests.badapples=true 
-Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-BadApples-NightlyTests-master/test-data/enwiki.random.lines.txt
 -Dtests.locale=smn-FI -Dtests.timezone=Pacific/Kwajalein -Dtests.asserts=true 
-Dtests.file.encoding=UTF-8

[repro] Repro line:  ant test  -Dtestcase=TestLockTree -Dtests.method=testLocks 
-Dtests.seed=299EC7FE081592CB -Dtests.multiplier=2 -Dtests.nightly=true 
-Dtests.slow=true -Dtests.badapples=true 
-Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-BadApples-NightlyTests-master/test-data/enwiki.random.lines.txt
 -Dtests.locale=sq -Dtests.timezone=Singapore -Dtests.asserts=true 
-Dtests.file.encoding=UTF-8

[repro] git rev-parse --abbrev-ref HEAD
[repro] git rev-parse HEAD
[repro] Initial local git branch/revision: 
18cb42ee80854e2159201fe550b13d894425a4f8
[repro] git fetch
[repro] git checkout 18cb42ee80854e2159201fe550b13d894425a4f8

[...truncated 1 lines...]
[repro] git merge --ff-only

[...truncated 1 lines...]
[repro] ant clean

[...truncated 6 lines...]
[repro] Test suites by module:
[repro]   solr/core
[repro]   RollingRestartTest
[repro]   TestDistributedStatsComponentCardinality
[repro]   TestLockTree
[repro] ant compile-test

[...truncated 3309 lines...]
[repro] ant test-nocompile -Dtests.dups=5 -Dtests.maxfailures=15 
-Dtests.class="*.RollingRestartTest|*.TestDistributedStatsComponentCardinality|*.TestLockTree"
 -Dtests.showOutput=onerror -Dtests.multiplier=2 
-Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-BadApples-NightlyTests-master/test-data/enwiki.random.lines.txt
 -Dtests.seed=299EC7FE081592CB -Dtests.multiplier=2 -Dtests.nightly=true 
-Dtests.slow=true -Dtests.badapples=true 
-Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-BadApples-NightlyTests-master/test-data/enwiki.random.lines.txt
 -Dtests.locale=smn-FI -Dtests.timezone=Pacific/Kwajalein -Dtests.asserts=true 
-Dtests.file.encoding=UTF-8

[...truncated 587326 lines...]
[repro] Setting last failure code to 256

[repro] Failures:
[repro]   0/5 failed: org.apache.solr.cloud.RollingRestartTest
[repro]   0/5 failed: org.apache.solr.cloud.TestLockTree
[repro]   4/5 failed: 
org.apache.solr.handler.component.TestDistributedStatsComponentCardinality
[repro] git checkout 18cb42ee80854e2159201fe550b13d894425a4f8

[...truncated 1 lines...]
[repro] Exiting with code 256

[...truncated 6 lines...]

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org

[JENKINS] Lucene-Solr-8.x-Windows (64bit/jdk1.8.0_201) - Build # 264 - Failure!

2019-05-19 Thread Policeman Jenkins Server
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-8.x-Windows/264/
Java: 64bit/jdk1.8.0_201 -XX:+UseCompressedOops -XX:+UseParallelGC

All tests passed

Build Log:
[...truncated 65813 lines...]
[asciidoctor:convert] asciidoctor: ERROR: about-this-guide.adoc: line 1: 
invalid part, must have at least one section (e.g., chapter, appendix, etc.)
[asciidoctor:convert] asciidoctor: ERROR: solr-glossary.adoc: line 1: invalid 
part, must have at least one section (e.g., chapter, appendix, etc.)
 [java] Processed 2531 links (2070 relative) to 3359 anchors in 253 files
 [echo] Validated Links & Anchors via: 
C:\Users\jenkins\workspace\Lucene-Solr-8.x-Windows\solr\build\solr-ref-guide/bare-bones-html/

-documentation-lint:
[jtidy] Checking for broken html (such as invalid tags)...
   [delete] Deleting directory 
C:\Users\jenkins\workspace\Lucene-Solr-8.x-Windows\lucene\build\jtidy_tmp
 [echo] Checking for broken links...
 [exec] 
 [exec] Crawl/parse...
 [exec] 
 [exec] Verify...
 [echo] Checking for malformed docs...
 [exec] 
 [exec] 
C:\Users\jenkins\workspace\Lucene-Solr-8.x-Windows\solr\build\docs\solr-solrj/overview-summary.html
 [exec]   missing description: org.noggit
 [exec] 
 [exec] Missing javadocs were found!

BUILD FAILED
C:\Users\jenkins\workspace\Lucene-Solr-8.x-Windows\build.xml:634: The following 
error occurred while executing this line:
C:\Users\jenkins\workspace\Lucene-Solr-8.x-Windows\build.xml:101: The following 
error occurred while executing this line:
C:\Users\jenkins\workspace\Lucene-Solr-8.x-Windows\solr\build.xml:660: The 
following error occurred while executing this line:
C:\Users\jenkins\workspace\Lucene-Solr-8.x-Windows\solr\build.xml:676: The 
following error occurred while executing this line:
C:\Users\jenkins\workspace\Lucene-Solr-8.x-Windows\lucene\common-build.xml:2530:
 exec returned: 1

Total time: 134 minutes 6 seconds
Build step 'Invoke Ant' marked build as failure
Archiving artifacts
Setting 
ANT_1_8_2_HOME=C:\Users\jenkins\tools\hudson.tasks.Ant_AntInstallation\ANT_1.8.2
[WARNINGS] Skipping publisher since build result is FAILURE
Recording test results
Setting 
ANT_1_8_2_HOME=C:\Users\jenkins\tools\hudson.tasks.Ant_AntInstallation\ANT_1.8.2
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any
Setting 
ANT_1_8_2_HOME=C:\Users\jenkins\tools\hudson.tasks.Ant_AntInstallation\ANT_1.8.2
Setting 
ANT_1_8_2_HOME=C:\Users\jenkins\tools\hudson.tasks.Ant_AntInstallation\ANT_1.8.2
Setting 
ANT_1_8_2_HOME=C:\Users\jenkins\tools\hudson.tasks.Ant_AntInstallation\ANT_1.8.2
Setting 
ANT_1_8_2_HOME=C:\Users\jenkins\tools\hudson.tasks.Ant_AntInstallation\ANT_1.8.2
Setting 
ANT_1_8_2_HOME=C:\Users\jenkins\tools\hudson.tasks.Ant_AntInstallation\ANT_1.8.2
Setting 
ANT_1_8_2_HOME=C:\Users\jenkins\tools\hudson.tasks.Ant_AntInstallation\ANT_1.8.2


[JENKINS] Lucene-Solr-master-Linux (64bit/jdk-11) - Build # 24112 - Failure!

2019-05-19 Thread Policeman Jenkins Server
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-master-Linux/24112/
Java: 64bit/jdk-11 -XX:-UseCompressedOops -XX:+UseParallelGC

All tests passed

Build Log:
[...truncated 62706 lines...]
-ecj-javadoc-lint-src:
[mkdir] Created dir: /tmp/ecj889564206
 [ecj-lint] Compiling 69 source files to /tmp/ecj889564206
 [ecj-lint] invalid Class-Path header in manifest of jar file: 
/home/jenkins/.ivy2/cache/org.restlet.jee/org.restlet/jars/org.restlet-2.3.0.jar
 [ecj-lint] invalid Class-Path header in manifest of jar file: 
/home/jenkins/.ivy2/cache/org.restlet.jee/org.restlet.ext.servlet/jars/org.restlet.ext.servlet-2.3.0.jar
 [ecj-lint] --
 [ecj-lint] 1. ERROR in 
/home/jenkins/workspace/Lucene-Solr-master-Linux/solr/contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/JdbcDataSource.java
 (at line 28)
 [ecj-lint] import javax.naming.InitialContext;
 [ecj-lint]^^^
 [ecj-lint] The type javax.naming.InitialContext is not accessible
 [ecj-lint] --
 [ecj-lint] 2. ERROR in 
/home/jenkins/workspace/Lucene-Solr-master-Linux/solr/contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/JdbcDataSource.java
 (at line 29)
 [ecj-lint] import javax.naming.NamingException;
 [ecj-lint]
 [ecj-lint] The type javax.naming.NamingException is not accessible
 [ecj-lint] --
 [ecj-lint] 3. ERROR in 
/home/jenkins/workspace/Lucene-Solr-master-Linux/solr/contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/JdbcDataSource.java
 (at line 182)
 [ecj-lint] c = getFromJndi(initProps, jndiName);
 [ecj-lint] ^^^
 [ecj-lint] The method getFromJndi(Properties, String) from the type new 
Callable(){} refers to the missing type NamingException
 [ecj-lint] --
 [ecj-lint] 4. ERROR in 
/home/jenkins/workspace/Lucene-Solr-master-Linux/solr/contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/JdbcDataSource.java
 (at line 245)
 [ecj-lint] private Connection getFromJndi(final Properties initProps, 
final String jndiName) throws NamingException,
 [ecj-lint] 
 ^^^
 [ecj-lint] NamingException cannot be resolved to a type
 [ecj-lint] --
 [ecj-lint] 5. ERROR in 
/home/jenkins/workspace/Lucene-Solr-master-Linux/solr/contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/JdbcDataSource.java
 (at line 249)
 [ecj-lint] InitialContext ctx =  new InitialContext();
 [ecj-lint] ^^
 [ecj-lint] InitialContext cannot be resolved to a type
 [ecj-lint] --
 [ecj-lint] 6. ERROR in 
/home/jenkins/workspace/Lucene-Solr-master-Linux/solr/contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/JdbcDataSource.java
 (at line 249)
 [ecj-lint] InitialContext ctx =  new InitialContext();
 [ecj-lint]   ^^
 [ecj-lint] InitialContext cannot be resolved to a type
 [ecj-lint] --
 [ecj-lint] 6 problems (6 errors)

BUILD FAILED
/home/jenkins/workspace/Lucene-Solr-master-Linux/build.xml:634: The following 
error occurred while executing this line:
/home/jenkins/workspace/Lucene-Solr-master-Linux/build.xml:101: The following 
error occurred while executing this line:
/home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build.xml:687: The 
following error occurred while executing this line:
/home/jenkins/workspace/Lucene-Solr-master-Linux/solr/common-build.xml:479: The 
following error occurred while executing this line:
/home/jenkins/workspace/Lucene-Solr-master-Linux/lucene/common-build.xml:2010: 
The following error occurred while executing this line:
/home/jenkins/workspace/Lucene-Solr-master-Linux/lucene/common-build.xml:2049: 
Compile failed; see the compiler error output for details.

Total time: 81 minutes 40 seconds
Build step 'Invoke Ant' marked build as failure
Archiving artifacts
Setting 
ANT_1_8_2_HOME=/home/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
[WARNINGS] Skipping publisher since build result is FAILURE
Recording test results
Setting 
ANT_1_8_2_HOME=/home/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any
Setting 
ANT_1_8_2_HOME=/home/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
Setting 
ANT_1_8_2_HOME=/home/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
Setting 
ANT_1_8_2_HOME=/home/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
Setting 
ANT_1_8_2_HOME=/home/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
Setting 
ANT_1_8_2_HOME=/home/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
Setting 
ANT_1_8_2_HOME=/home/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2


[JENKINS] Lucene-Solr-SmokeRelease-8.x - Build # 102 - Still Failing

2019-05-19 Thread Apache Jenkins Server
Build: https://builds.apache.org/job/Lucene-Solr-SmokeRelease-8.x/102/

No tests ran.

Build Log:
[...truncated 23881 lines...]
[asciidoctor:convert] asciidoctor: ERROR: about-this-guide.adoc: line 1: 
invalid part, must have at least one section (e.g., chapter, appendix, etc.)
[asciidoctor:convert] asciidoctor: ERROR: solr-glossary.adoc: line 1: invalid 
part, must have at least one section (e.g., chapter, appendix, etc.)
 [java] Processed 2531 links (2070 relative) to 3359 anchors in 253 files
 [echo] Validated Links & Anchors via: 
/home/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-8.x/solr/build/solr-ref-guide/bare-bones-html/

-dist-changes:
 [copy] Copying 4 files to 
/home/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-8.x/solr/package/changes

package:

-unpack-solr-tgz:

-ensure-solr-tgz-exists:
[mkdir] Created dir: 
/home/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-8.x/solr/build/solr.tgz.unpacked
[untar] Expanding: 
/home/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-8.x/solr/package/solr-8.2.0.tgz
 into 
/home/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-8.x/solr/build/solr.tgz.unpacked

generate-maven-artifacts:

resolve:

resolve:

ivy-availability-check:
[loadresource] Do not set property disallowed.ivy.jars.list as its length is 0.

-ivy-fail-disallowed-ivy-version:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = 
/home/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-8.x/lucene/top-level-ivy-settings.xml

resolve:

ivy-availability-check:
[loadresource] Do not set property disallowed.ivy.jars.list as its length is 0.

-ivy-fail-disallowed-ivy-version:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = 
/home/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-8.x/lucene/top-level-ivy-settings.xml

resolve:

resolve:

ivy-availability-check:
[loadresource] Do not set property disallowed.ivy.jars.list as its length is 0.

-ivy-fail-disallowed-ivy-version:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = 
/home/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-8.x/lucene/top-level-ivy-settings.xml

resolve:

ivy-availability-check:
[loadresource] Do not set property disallowed.ivy.jars.list as its length is 0.

-ivy-fail-disallowed-ivy-version:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = 
/home/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-8.x/lucene/top-level-ivy-settings.xml

resolve:

ivy-availability-check:
[loadresource] Do not set property disallowed.ivy.jars.list as its length is 0.

-ivy-fail-disallowed-ivy-version:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = 
/home/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-8.x/lucene/top-level-ivy-settings.xml

resolve:

ivy-availability-check:
[loadresource] Do not set property disallowed.ivy.jars.list as its length is 0.

-ivy-fail-disallowed-ivy-version:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = 
/home/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-8.x/lucene/top-level-ivy-settings.xml

resolve:

ivy-availability-check:
[loadresource] Do not set property disallowed.ivy.jars.list as its length is 0.

-ivy-fail-disallowed-ivy-version:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = 
/home/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-8.x/lucene/top-level-ivy-settings.xml

resolve:

ivy-availability-check:
[loadresource] Do not set property disallowed.ivy.jars.list as its length is 0.

-ivy-fail-disallowed-ivy-version:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = 
/home/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-8.x/lucene/top-level-ivy-settings.xml

resolve:

ivy-availability-check:
[loadresource] Do not set property disallowed.ivy.jars.list as its length is 0.

-ivy-fail-disallowed-ivy-version:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = 
/home/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-8.x/lucene/top-level-ivy-settings.xml

resolve:

ivy-availability-check:
[loadresource] Do not set property disallowed.ivy.jars.list as its length is 0.

-ivy-fail-disallowed-ivy-version:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = 
/home/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-8.x/lucene/top-level-ivy-settings.xml

resolve:

ivy-availability-check:
[loadresource] Do not set property disallowed.ivy.jars.list as its length is 0.

-ivy-fail-disallowed-ivy-version:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = 
/home/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-8.x/lucene/top-level-ivy-settings.xml

resolve:

ivy-availability-check:
[loadresource] Do not set property disallowed.ivy.jars.list as its length is 0.

-ivy-fail-disallowed-ivy-version:

ivy-fail:

ivy-configure:
[ivy:c

[jira] [Commented] (SOLR-13452) Update the lucene-solr build from Ivy+Ant+Maven (shadow build) to Gradle.

2019-05-19 Thread Uwe Schindler (JIRA)


[ 
https://issues.apache.org/jira/browse/SOLR-13452?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16843567#comment-16843567
 ] 

Uwe Schindler commented on SOLR-13452:
--

FYI: The very late addition of the servlet-api.txt signatures file to the task 
config does not hurt input/output change detection. Because the signatures file 
is part of the build plugin's classpath (it is loaded with getResource), the 
forbiddenApis task gets re-executed anyway once the signatures file changes, 
since its classpath changed. So adding it shortly before task execution is 
fine; you can treat it as an internal implementation detail of the plugin.

> Update the lucene-solr build from Ivy+Ant+Maven (shadow build) to Gradle.
> -
>
> Key: SOLR-13452
> URL: https://issues.apache.org/jira/browse/SOLR-13452
> Project: Solr
>  Issue Type: Improvement
>  Security Level: Public(Default Security Level. Issues are Public) 
>  Components: Build
>Reporter: Mark Miller
>Priority: Major
>
> I took some things from the great work that Dat did in 
> [https://github.com/apache/lucene-solr/tree/jira/gradle] and took the ball a 
> little further.
>  
> When working with gradle in sub modules directly, I recommend 
> [https://github.com/dougborg/gdub]
> This gradle branch uses the following plugin for version locking, version 
> configuration and version consistency across modules: 
> [https://github.com/palantir/gradle-consistent-versions]
> By default, dependencies are not transitive, but there is a special 
> Configuration for adding dependencies on other project internal modules that 
> are transitive to their direct external dependencies (their jar libs).
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)




[JENKINS] Lucene-Solr-8.x-Solaris (64bit/jdk1.8.0) - Build # 135 - Still Failing!

2019-05-19 Thread Policeman Jenkins Server
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-8.x-Solaris/135/
Java: 64bit/jdk1.8.0 -XX:+UseCompressedOops -XX:+UseConcMarkSweepGC

1 tests failed.
FAILED:  
org.apache.solr.client.solrj.io.stream.StreamDecoratorTest.testParallelCommitStream

Error Message:
expected:<5> but was:<0>

Stack Trace:
java.lang.AssertionError: expected:<5> but was:<0>
at 
__randomizedtesting.SeedInfo.seed([3B3F31BCC540D036:1BD553BC59013D7A]:0)
at org.junit.Assert.fail(Assert.java:88)
at org.junit.Assert.failNotEquals(Assert.java:834)
at org.junit.Assert.assertEquals(Assert.java:645)
at org.junit.Assert.assertEquals(Assert.java:631)
at 
org.apache.solr.client.solrj.io.stream.StreamDecoratorTest.testParallelCommitStream(StreamDecoratorTest.java:3309)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1750)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:938)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:974)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:988)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:947)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:832)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:883)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:894)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at java.lang.Thread.run(Thread.java:748)




Build Log:
[...truncated 16343 lines...]
   [junit4] Suite: org.apache.solr.client.solrj.io.stream.StreamDecoratorTest
   [junit4] 

[JENKINS] Lucene-Solr-8.x-Linux (32bit/jdk1.8.0_201) - Build # 590 - Unstable!

2019-05-19 Thread Policeman Jenkins Server
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-8.x-Linux/590/
Java: 32bit/jdk1.8.0_201 -server -XX:+UseG1GC

1 tests failed.
FAILED:  
org.apache.solr.cloud.LegacyCloudClusterPropTest.testCreateCollectionSwitchLegacyCloud

Error Message:
Error from server at http://127.0.0.1:44669/solr: Underlying core creation 
failed while creating collection: legacyFalse

Stack Trace:
org.apache.solr.client.solrj.impl.HttpSolrClient$RemoteSolrException: Error 
from server at http://127.0.0.1:44669/solr: Underlying core creation failed 
while creating collection: legacyFalse
at 
__randomizedtesting.SeedInfo.seed([BF06C7A0AF19F1D9:6E0135250B167AEB]:0)
at 
org.apache.solr.client.solrj.impl.HttpSolrClient.executeMethod(HttpSolrClient.java:649)
at 
org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:255)
at 
org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:244)
at 
org.apache.solr.client.solrj.impl.LBSolrClient.doRequest(LBSolrClient.java:368)
at 
org.apache.solr.client.solrj.impl.LBSolrClient.request(LBSolrClient.java:296)
at 
org.apache.solr.client.solrj.impl.BaseCloudSolrClient.sendRequest(BaseCloudSolrClient.java:1068)
at 
org.apache.solr.client.solrj.impl.BaseCloudSolrClient.requestWithRetryOnStaleState(BaseCloudSolrClient.java:837)
at 
org.apache.solr.client.solrj.impl.BaseCloudSolrClient.request(BaseCloudSolrClient.java:769)
at 
org.apache.solr.client.solrj.SolrRequest.process(SolrRequest.java:207)
at 
org.apache.solr.client.solrj.SolrRequest.process(SolrRequest.java:224)
at 
org.apache.solr.cloud.LegacyCloudClusterPropTest.createAndTest(LegacyCloudClusterPropTest.java:95)
at 
org.apache.solr.cloud.LegacyCloudClusterPropTest.testCreateCollectionSwitchLegacyCloud(LegacyCloudClusterPropTest.java:79)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1750)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:938)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:974)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:988)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:947)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:832)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:883)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:894)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.Sta

[jira] [Commented] (SOLR-13452) Update the lucene-solr build from Ivy+Ant+Maven (shadow build) to Gradle.

2019-05-19 Thread Uwe Schindler (JIRA)


[ 
https://issues.apache.org/jira/browse/SOLR-13452?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16843565#comment-16843565
 ] 

Uwe Schindler commented on SOLR-13452:
--

I fixed the forbiddenapis and servlet-api checks: we do not add the 
servlet-api.txt signatures to the config until shortly before the task runs 
(using forbiddenTask.doFirst). At that point the full dependencies are 
resolved, so we check whether the forbiddenapis classpath contains the 
servlet-api.jar file and, if so, add the signatures.




[jira] [Commented] (SOLR-13452) Update the lucene-solr build from Ivy+Ant+Maven (shadow build) to Gradle.

2019-05-19 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/SOLR-13452?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16843564#comment-16843564
 ] 

ASF subversion and git services commented on SOLR-13452:


Commit 075d3a14226852a9ae9ee5ef83ce0a348f054da9 in lucene-solr's branch 
refs/heads/jira/SOLR-13452_gradle from Uwe Schindler
[ https://gitbox.apache.org/repos/asf?p=lucene-solr.git;h=075d3a1 ]

SOLR-13452: Fobiddenapis: Detect if "servlet-api.jar" is on classpath before 
task execution and add signatures at this time (after dependencies are resolved)





[jira] [Updated] (SOLR-13481) Re-try of solr request will not happen with different live servers, if one request throws Exception

2019-05-19 Thread Rajeswri Natarajan (JIRA)


 [ 
https://issues.apache.org/jira/browse/SOLR-13481?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Rajeswri Natarajan updated SOLR-13481:
--
Description: 
LBHttpSolrClient.java needs to be fixed: if the doRequest method (called by the
request method below) throws an exception, the for loop terminates and the
request fails.

 public Rsp request(Req req) throws SolrServerException, IOException {
   Rsp rsp = new Rsp();
   Exception ex = null;
   boolean isNonRetryable = req.request instanceof IsUpdateRequest ||
       ADMIN_PATHS.contains(req.request.getPath());
   List<ServerWrapper> skipped = null;

   final Integer numServersToTry = req.getNumServersToTry();
   int numServersTried = 0;

   boolean timeAllowedExceeded = false;
   long timeAllowedNano = getTimeAllowedInNanos(req.getRequest());
   long timeOutTime = System.nanoTime() + timeAllowedNano;
   for (String serverStr : req.getServers()) {
     if (timeAllowedExceeded = isTimeExceeded(timeAllowedNano, timeOutTime)) {
       break;
     }

     serverStr = normalize(serverStr);
     // if the server is currently a zombie, just skip to the next one
     ServerWrapper wrapper = zombieServers.get(serverStr);
     if (wrapper != null) {
       // System.out.println("ZOMBIE SERVER QUERIED: " + serverStr);
       final int numDeadServersToTry = req.getNumDeadServersToTry();
       if (numDeadServersToTry > 0) {
         if (skipped == null) {
           skipped = new ArrayList<>(numDeadServersToTry);
           skipped.add(wrapper);
         } else if (skipped.size() < numDeadServersToTry) {
           skipped.add(wrapper);
         }
       }
       continue;
     }
     try {
       MDC.put("LBHttpSolrClient.url", serverStr);

       if (numServersToTry != null && numServersTried > numServersToTry.intValue()) {
         break;
       }

       HttpSolrClient client = makeSolrClient(serverStr);

       ++numServersTried;
       ex = doRequest(client, req, rsp, isNonRetryable, false, null);
       if (ex == null) {
         return rsp; // SUCCESS
       }
     } finally {
       MDC.remove("LBHttpSolrClient.url");
     }
   }

   // try the servers we previously skipped
   if (skipped != null) {
     for (ServerWrapper wrapper : skipped) {
       if (timeAllowedExceeded = isTimeExceeded(timeAllowedNano, timeOutTime)) {
         break;
       }

       if (numServersToTry != null && numServersTried > numServersToTry.intValue()) {
         break;
       }

       try {
         MDC.put("LBHttpSolrClient.url", wrapper.client.getBaseURL());
         ++numServersTried;
         ex = doRequest(wrapper.client, req, rsp, isNonRetryable, true, wrapper.getKey());
         if (ex == null) {
           return rsp; // SUCCESS
         }
       } finally {
         MDC.remove("LBHttpSolrClient.url");
       }
     }
   }

   final String solrServerExceptionMessage;
   if (timeAllowedExceeded) {
     solrServerExceptionMessage = "Time allowed to handle this request exceeded";
   } else {
     if (numServersToTry != null && numServersTried > numServersToTry.intValue()) {
       solrServerExceptionMessage = "No live SolrServers available to handle this request:"
           + " numServersTried=" + numServersTried
           + " numServersToTry=" + numServersToTry.intValue();
     } else {
       solrServerExceptionMessage = "No live SolrServers available to handle this request";
     }
   }
   if (ex == null) {
     throw new SolrServerException(solrServerExceptionMessage);
   } else {
     throw new SolrServerException(solrServerExceptionMessage + ":" + zombieServers.keySet(), ex);
   }
 }

  was:
LBHttpSolrClient.java needs to be fixed: if the doRequest method throws an
exception, the for loop terminates and the request fails.

 
 [previous revision of the code snippet, truncated in the archive]

[jira] [Updated] (SOLR-13481) Re-try the solr request will not happen, if one request throws Exception

2019-05-19 Thread Rajeswri Natarajan (JIRA)


 [ 
https://issues.apache.org/jira/browse/SOLR-13481?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Rajeswri Natarajan updated SOLR-13481:
--
Description: 
LBHttpSolrClient.java needs to be fixed: if the doRequest method throws an
exception, the for loop terminates and the request fails.

 public Rsp request(Req req) throws SolrServerException, IOException {
   Rsp rsp = new Rsp();
   Exception ex = null;
   boolean isNonRetryable = req.request instanceof IsUpdateRequest ||
       ADMIN_PATHS.contains(req.request.getPath());
   List<ServerWrapper> skipped = null;

   final Integer numServersToTry = req.getNumServersToTry();
   int numServersTried = 0;

   boolean timeAllowedExceeded = false;
   long timeAllowedNano = getTimeAllowedInNanos(req.getRequest());
   long timeOutTime = System.nanoTime() + timeAllowedNano;
   for (String serverStr : req.getServers()) {
     if (timeAllowedExceeded = isTimeExceeded(timeAllowedNano, timeOutTime)) {
       break;
     }

     serverStr = normalize(serverStr);
     // if the server is currently a zombie, just skip to the next one
     ServerWrapper wrapper = zombieServers.get(serverStr);
     if (wrapper != null) {
       // System.out.println("ZOMBIE SERVER QUERIED: " + serverStr);
       final int numDeadServersToTry = req.getNumDeadServersToTry();
       if (numDeadServersToTry > 0) {
         if (skipped == null) {
           skipped = new ArrayList<>(numDeadServersToTry);
           skipped.add(wrapper);
         } else if (skipped.size() < numDeadServersToTry) {
           skipped.add(wrapper);
         }
       }
       continue;
     }
     try {
       MDC.put("LBHttpSolrClient.url", serverStr);

       if (numServersToTry != null && numServersTried > numServersToTry.intValue()) {
         break;
       }

       HttpSolrClient client = makeSolrClient(serverStr);

       ++numServersTried;
       ex = doRequest(client, req, rsp, isNonRetryable, false, null);
       if (ex == null) {
         return rsp; // SUCCESS
       }
     } finally {
       MDC.remove("LBHttpSolrClient.url");
     }
   }

   // try the servers we previously skipped
   if (skipped != null) {
     for (ServerWrapper wrapper : skipped) {
       if (timeAllowedExceeded = isTimeExceeded(timeAllowedNano, timeOutTime)) {
         break;
       }

       if (numServersToTry != null && numServersTried > numServersToTry.intValue()) {
         break;
       }

       try {
         MDC.put("LBHttpSolrClient.url", wrapper.client.getBaseURL());
         ++numServersTried;
         ex = doRequest(wrapper.client, req, rsp, isNonRetryable, true, wrapper.getKey());
         if (ex == null) {
           return rsp; // SUCCESS
         }
       } finally {
         MDC.remove("LBHttpSolrClient.url");
       }
     }
   }

   final String solrServerExceptionMessage;
   if (timeAllowedExceeded) {
     solrServerExceptionMessage = "Time allowed to handle this request exceeded";
   } else {
     if (numServersToTry != null && numServersTried > numServersToTry.intValue()) {
       solrServerExceptionMessage = "No live SolrServers available to handle this request:"
           + " numServersTried=" + numServersTried
           + " numServersToTry=" + numServersToTry.intValue();
     } else {
       solrServerExceptionMessage = "No live SolrServers available to handle this request";
     }
   }
   if (ex == null) {
     throw new SolrServerException(solrServerExceptionMessage);
   } else {
     throw new SolrServerException(solrServerExceptionMessage + ":" + zombieServers.keySet(), ex);
   }
 }

  was:
LBHttpSolrClient.java needs to be fixed: if the doRequest method throws an
exception, the for loop terminates.

 
 [previous revision of the code snippet, truncated in the archive]

[jira] [Created] (SOLR-13481) Re-try the solr request will not happen, if one request throws Exception

2019-05-19 Thread Rajeswri Natarajan (JIRA)
Rajeswri Natarajan created SOLR-13481:
-

 Summary: Re-try the solr request will not happen, if one request 
throws Exception
 Key: SOLR-13481
 URL: https://issues.apache.org/jira/browse/SOLR-13481
 Project: Solr
  Issue Type: Bug
  Security Level: Public (Default Security Level. Issues are Public)
  Components: clients - java
Affects Versions: 7.6
Reporter: Rajeswri Natarajan


LBHttpSolrClient.java needs to be fixed: if the doRequest method throws an
exception, the for loop terminates.

 
public Rsp request(Req req) throws SolrServerException, IOException {
  Rsp rsp = new Rsp();
  Exception ex = null;
  boolean isNonRetryable = req.request instanceof IsUpdateRequest ||
      ADMIN_PATHS.contains(req.request.getPath());
  List<ServerWrapper> skipped = null;

  final Integer numServersToTry = req.getNumServersToTry();
  int numServersTried = 0;

  boolean timeAllowedExceeded = false;
  long timeAllowedNano = getTimeAllowedInNanos(req.getRequest());
  long timeOutTime = System.nanoTime() + timeAllowedNano;
  for (String serverStr : req.getServers()) {
    if (timeAllowedExceeded = isTimeExceeded(timeAllowedNano, timeOutTime)) {
      break;
    }

    serverStr = normalize(serverStr);
    // if the server is currently a zombie, just skip to the next one
    ServerWrapper wrapper = zombieServers.get(serverStr);
    if (wrapper != null) {
      // System.out.println("ZOMBIE SERVER QUERIED: " + serverStr);
      final int numDeadServersToTry = req.getNumDeadServersToTry();
      if (numDeadServersToTry > 0) {
        if (skipped == null) {
          skipped = new ArrayList<>(numDeadServersToTry);
          skipped.add(wrapper);
        } else if (skipped.size() < numDeadServersToTry) {
          skipped.add(wrapper);
        }
      }
      continue;
    }
    try {
      MDC.put("LBHttpSolrClient.url", serverStr);

      if (numServersToTry != null && numServersTried > numServersToTry.intValue()) {
        break;
      }

      HttpSolrClient client = makeSolrClient(serverStr);

      ++numServersTried;
      ex = doRequest(client, req, rsp, isNonRetryable, false, null);
      if (ex == null) {
        return rsp; // SUCCESS
      }
    } finally {
      MDC.remove("LBHttpSolrClient.url");
    }
  }

  // try the servers we previously skipped
  if (skipped != null) {
    for (ServerWrapper wrapper : skipped) {
      if (timeAllowedExceeded = isTimeExceeded(timeAllowedNano, timeOutTime)) {
        break;
      }

      if (numServersToTry != null && numServersTried > numServersToTry.intValue()) {
        break;
      }

      try {
        MDC.put("LBHttpSolrClient.url", wrapper.client.getBaseURL());
        ++numServersTried;
        ex = doRequest(wrapper.client, req, rsp, isNonRetryable, true, wrapper.getKey());
        if (ex == null) {
          return rsp; // SUCCESS
        }
      } finally {
        MDC.remove("LBHttpSolrClient.url");
      }
    }
  }

  final String solrServerExceptionMessage;
  if (timeAllowedExceeded) {
    solrServerExceptionMessage = "Time allowed to handle this request exceeded";
  } else {
    if (numServersToTry != null && numServersTried > numServersToTry.intValue()) {
      solrServerExceptionMessage = "No live SolrServers available to handle this request:"
          + " numServersTried=" + numServersTried
          + " numServersToTry=" + numServersToTry.intValue();
    } else {
      solrServerExceptionMessage = "No live SolrServers available to handle this request";
    }
  }
  if (ex == null) {
    throw new SolrServerException(solrServerExceptionMessage);
  } else {
    throw new SolrServerException(solrServerExceptionMessage + ":" + zombieServers.keySet(), ex);
  }
}
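The fix the report asks for can be sketched in a minimal, self-contained way (hypothetical names, not the actual SolrJ patch): catch the per-server exception inside the loop so the client falls through to the next live server instead of aborting the whole request.

```java
import java.util.Arrays;
import java.util.List;
import java.util.function.Function;

// Sketch of the suggested behavior: a throwing server is skipped and the
// next server in the list is tried; only when all servers fail does the
// whole request fail, carrying the last per-server exception as the cause.
public class RetrySketch {

    /** Tries each server in order; a server that throws is skipped. */
    static String requestWithRetry(List<String> servers,
                                   Function<String, String> doRequest) {
        RuntimeException last = null;
        for (String server : servers) {
            try {
                return doRequest.apply(server); // SUCCESS
            } catch (RuntimeException e) {
                last = e; // remember the failure, try the next server
            }
        }
        throw new IllegalStateException("No live servers available", last);
    }

    public static void main(String[] args) {
        // The first server "dies"; the second one answers.
        String rsp = requestWithRetry(Arrays.asList("dead", "live"), s -> {
            if (s.equals("dead")) throw new RuntimeException("down");
            return "ok from " + s;
        });
        System.out.println(rsp); // prints "ok from live"
    }
}
```

In the real client the catch would live around doRequest inside the for loop, so the time-allowed and numServersToTry checks above it still apply.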






[jira] [Commented] (SOLR-13452) Update the lucene-solr build from Ivy+Ant+Maven (shadow build) to Gradle.

2019-05-19 Thread Uwe Schindler (JIRA)


[ 
https://issues.apache.org/jira/browse/SOLR-13452?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16843560#comment-16843560
 ] 

Uwe Schindler commented on SOLR-13452:
--

Hi Mark,
I reworked the forbiddenapis stuff to use more readable regexps, and I added
the missing system-out and commons-io checks.
As a workaround I changed the forbiddenapis config for Solr to allow missing
signatures, but I will rework Solr in a moment so it works correctly with
compile-only (otherwise Dawid Weiss will complain...)







[jira] [Commented] (SOLR-13452) Update the lucene-solr build from Ivy+Ant+Maven (shadow build) to Gradle.

2019-05-19 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/SOLR-13452?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16843559#comment-16843559
 ] 

ASF subversion and git services commented on SOLR-13452:


Commit f2d89290bca597dc551bcd73b1cfa2c71dd91c00 in lucene-solr's branch 
refs/heads/jira/SOLR-13452_gradle from Uwe Schindler
[ https://gitbox.apache.org/repos/asf?p=lucene-solr.git;h=f2d8929 ]

SOLR-13452: Fobiddenapis: add servlet-apis hack (temporrary); add commons-io; 
add system out checks; cleanup regexs








[JENKINS] Lucene-Solr-master-Linux (64bit/jdk-11.0.2) - Build # 24111 - Unstable!

2019-05-19 Thread Policeman Jenkins Server
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-master-Linux/24111/
Java: 64bit/jdk-11.0.2 -XX:+UseCompressedOops -XX:+UseSerialGC

1 tests failed.
FAILED:  
org.apache.solr.cloud.DeleteReplicaTest.deleteReplicaFromClusterStateLegacy

Error Message:
Waiting for watcher get removed

Stack Trace:
java.util.concurrent.TimeoutException: Waiting for watcher get removed
at 
__randomizedtesting.SeedInfo.seed([5EEA9584776BF966:41D230ECA6CD4EF5]:0)
at org.apache.solr.util.TimeOut.waitFor(TimeOut.java:66)
at 
org.apache.solr.cloud.DeleteReplicaTest.deleteReplicaFromClusterState(DeleteReplicaTest.java:245)
at 
org.apache.solr.cloud.DeleteReplicaTest.deleteReplicaFromClusterStateLegacy(DeleteReplicaTest.java:204)
at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1750)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:938)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:974)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:988)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:947)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:832)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:883)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:894)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at java.base/java.lang.Thread.run(Thread.java:834)




Build Log:
[...truncated 1 lines...]
   [junit4] Suite: org.apache.solr.cloud.DeleteReplicaTest
   [junit4]   2> 701450 INFO  
(SUITE-DeleteReplicaTest-seed#[5EEA9584776BF966]-worker) [] 
o.a.s

[jira] [Commented] (SOLR-11724) Cdcr Bootstrapping does not cause "index copying" to follower nodes on Target

2019-05-19 Thread Amrit Sarkar (JIRA)


[ 
https://issues.apache.org/jira/browse/SOLR-11724?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16843554#comment-16843554
 ] 

Amrit Sarkar commented on SOLR-11724:
-

Thanks [~TimSolr] and numerous others on the mailing list for reporting.

The correct way to solve this issue is to identify the correct base URL of the
Solr node hosting the core we need to trigger REQUESTRECOVERY on, and to create
a local HttpSolrClient instead of using the CloudSolrClient from
CdcrReplicatorState (which will forward the request to the leader of the shard
instead of to the rightful Solr node).

I baked a small patch a few weeks back; I still need to work on the tests to
see why they are failing now.
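As a hedged illustration of that idea (the map, method names, and URL layout here are hypothetical; the real change is in the attached SOLR-11724.patch), the point is to resolve the base URL of the node that actually hosts the core and address the core-admin call there directly, rather than letting a cluster-aware client route it to the shard leader:

```java
import java.util.HashMap;
import java.util.Map;

// Sketch: build the REQUESTRECOVERY core-admin URL against the base URL of
// the node hosting the core, so the request is not re-routed to the leader.
public class RecoveryTarget {

    /** Builds the core-admin REQUESTRECOVERY URL for a given core. */
    static String requestRecoveryUrl(Map<String, String> coreToBaseUrl,
                                     String core) {
        String base = coreToBaseUrl.get(core);
        if (base == null) {
            throw new IllegalArgumentException("unknown core: " + core);
        }
        return base + "/admin/cores?action=REQUESTRECOVERY&core=" + core;
    }

    public static void main(String[] args) {
        Map<String, String> hosting = new HashMap<>();
        hosting.put("coll_shard1_replica_n2", "http://node2:8983/solr");
        System.out.println(
            requestRecoveryUrl(hosting, "coll_shard1_replica_n2"));
    }
}
```

In SolrJ one would then open a plain HttpSolrClient against that base URL for the single call, instead of reusing the shared CloudSolrClient.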

> Cdcr Bootstrapping does not cause "index copying" to follower nodes on Target
> -
>
> Key: SOLR-11724
> URL: https://issues.apache.org/jira/browse/SOLR-11724
> Project: Solr
>  Issue Type: Bug
>  Security Level: Public (Default Security Level. Issues are Public)
>  Components: CDCR
>Reporter: Amrit Sarkar
>Assignee: Varun Thacker
>Priority: Major
> Fix For: 7.3.1, 7.4, 8.0
>
> Attachments: SOLR-11724.patch, SOLR-11724.patch, SOLR-11724.patch
>
>
> Please find the discussion on:
> http://lucene.472066.n3.nabble.com/Issue-with-CDCR-bootstrapping-in-Solr-7-1-td4365258.html
> If we index significant documents into Source, stop indexing, and then start 
> CDCR, bootstrapping only copies the index to the leader node of each shard of 
> the collection, and followers never receive the documents / index unless 
> at least one document is inserted again on Source; that update propagates to 
> Target and the target collection triggers index replication to followers.
> This behavior needs to be addressed in a proper manner, either at the target 
> collection or while bootstrapping.






[jira] [Updated] (SOLR-11724) Cdcr Bootstrapping does not cause "index copying" to follower nodes on Target

2019-05-19 Thread Amrit Sarkar (JIRA)


 [ 
https://issues.apache.org/jira/browse/SOLR-11724?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Amrit Sarkar updated SOLR-11724:

Attachment: SOLR-11724.patch







[jira] [Commented] (SOLR-13452) Update the lucene-solr build from Ivy+Ant+Maven (shadow build) to Gradle.

2019-05-19 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/SOLR-13452?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16843543#comment-16843543
 ] 

ASF subversion and git services commented on SOLR-13452:


Commit 4a9a2afe6d61fce078c877d54a1c7e8fd19bad5f in lucene-solr's branch 
refs/heads/jira/SOLR-13452_gradle from Uwe Schindler
[ https://gitbox.apache.org/repos/asf?p=lucene-solr.git;h=4a9a2af ]

SOLR-13452: Cleanup of forbiddenapis to not hack around Gradle's problems, add 
ideas for servlet-apis.








[jira] [Commented] (SOLR-13452) Update the lucene-solr build from Ivy+Ant+Maven (shadow build) to Gradle.

2019-05-19 Thread Uwe Schindler (JIRA)


[ 
https://issues.apache.org/jira/browse/SOLR-13452?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16843541#comment-16843541
 ] 

Uwe Schindler commented on SOLR-13452:
--

Hi Mark,

I will soon commit some refactoring of forbiddenapis. It looks like you copied
the plugin code mostly from Elasticsearch - but you missed copying the comment
about the "+=" stuff. Actually the configuration is much easier if you put all
signatures into each task and not into the top-level "forbiddenApis" block
(which is an extension function, so it can't easily inherit - one of the
horrible details of Gradle). As the signatures differ anyway and we iterate
over all tasks, it's better to define them all in the task. Then you can
easily add signatures.

I also added the exclusion for the hadoop test classes (in
solr/core/build.gradle).

About the servlet-api, the problem is the following: as it's a compileOnly
dependency, it's not exported by the project. So subprojects need to redefine
the compileOnly dependency if they actually use the classes. As the servlet
API is provided by the servlet container at runtime, it really should be
compileOnly, so you are right: every module that uses the classes has to use
compileOnly.

The problem in forbiddenApis is that the signatures need to be parsed, and
this does not work if the classes are not on the compile classpath... There
are two ways to solve this:

- tell forbiddenapis to ignore unknown signatures files (I think Lucene's
Maven build does this)
- only add servlet-api.txt if the servlet-api is on the classpath. I added
some "idea" - which does not yet work - to the code (commented out). It worked
partially last Thursday, but you changed the code so I was not able to follow
up.
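The second option can be illustrated with a plain-Java sketch (the jar-name pattern and the servlet-api.txt file name are assumptions for illustration; the real check would run inside the Gradle build after dependency resolution):

```java
import java.io.File;
import java.util.Arrays;
import java.util.List;

// Sketch: decide whether to add the servlet-api signatures file by checking
// whether any resolved classpath entry looks like a servlet-api jar.
public class SignatureSelector {

    /** Returns true if any classpath entry looks like a servlet-api jar. */
    static boolean hasServletApi(List<String> classpathEntries) {
        return classpathEntries.stream()
                .map(p -> new File(p).getName())
                .anyMatch(name -> name.matches("servlet-api.*\\.jar"));
    }

    public static void main(String[] args) {
        List<String> cp = Arrays.asList(
                "/libs/commons-io-2.5.jar",
                "/libs/servlet-api-3.1.jar");
        if (hasServletApi(cp)) {
            // only now would the build add servlet-api.txt to the task
            System.out.println("adding servlet-api.txt signatures");
        }
    }
}
```

Running this check at task-execution time (rather than configuration time) matters, because only then are the dependencies resolved.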

The forbidden code has other problems:
- system-out checking is still missing
- commons-io checks are also missing. I will add them with the correct version
number, like in Ant's build, in the same way I plan to do the servlet-api
detection (look into dependencies / classpath and use the correct version)

Please don't touch forbidden yet; I'd like to fix this, but it has been a
moving target...

Uwe







[JENKINS] Lucene-Solr-BadApples-Tests-master - Build # 365 - Failure

2019-05-19 Thread Apache Jenkins Server
Build: https://builds.apache.org/job/Lucene-Solr-BadApples-Tests-master/365/

1 tests failed.
FAILED:  org.apache.solr.cloud.MultiThreadedOCPTest.test

Error Message:
acoll: 1558287182387 bcoll: 1558287182504

Stack Trace:
java.lang.AssertionError: acoll: 1558287182387 bcoll: 1558287182504
at 
__randomizedtesting.SeedInfo.seed([BDBCDB7A59ABC05F:35E8E4A0F757ADA7]:0)
at org.junit.Assert.fail(Assert.java:88)
at org.junit.Assert.assertTrue(Assert.java:41)
at 
org.apache.solr.cloud.MultiThreadedOCPTest.testFillWorkQueue(MultiThreadedOCPTest.java:116)
at 
org.apache.solr.cloud.MultiThreadedOCPTest.test(MultiThreadedOCPTest.java:71)
at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1750)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:938)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:974)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:988)
at 
org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsFixedStatement.callStatement(BaseDistributedSearchTestCase.java:1082)
at 
org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsStatement.evaluate(BaseDistributedSearchTestCase.java:1054)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:947)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:832)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:883)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:894)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at java.base/java.lang.Thread.run(Th

Re: Welcome Michael Sokolov as Lucene/ Solr committer

2019-05-19 Thread Michael McCandless
Welcome Mike!

Mike

On Mon, May 13, 2019 at 12:12 PM Dawid Weiss  wrote:

> Hello everyone,
>
> Please join me in welcoming Michael Sokolov as Lucene/ Solr committer!
>
> Many of you probably know Mike as he's been around for quite a while
> -- answering questions, reviewing patches, providing insight and
> actively working on new code.
>
> Congratulations and welcome! It is a tradition to introduce yourself
> with a brief bio, Mike.
>
> Dawid
>
> -
> To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
> For additional commands, e-mail: dev-h...@lucene.apache.org
>
> --
Mike McCandless

http://blog.mikemccandless.com


Re: 8.1.1 bug fix release

2019-05-19 Thread Jan Høydahl
7.7.2 smoke builds are passing and I have a RC1 build of 7.7.2 ready, but will 
hold it until after 8.1.1 vote.
If for some reason you won't be ready with 8.1.1 for several days, let me know 
and we can do the 7.7.2 vote first.
If that happens, we'd need to sync the addition of the 7.7.2 changes sections in 
branch_8_1.

--
Jan Høydahl, search solution architect
Cominvent AS - www.cominvent.com

> On 17 May 2019, at 18:27, Jan Høydahl wrote:
> 
> Thanks. Fixed.
> 
> --
> Jan Høydahl, search solution architect
> Cominvent AS - www.cominvent.com 
> 
>> On 17 May 2019, at 17:28, Andrzej Białecki <andrzej.biale...@lucidworks.com> wrote:
>> 
>> Jan, re. 7.7.2 solr/CHANGES.txt, the entry for SOLR-12833 in the 7.7.2 "Bug 
>> Fixes" section is somewhat misleading, because the main issue fixed between 
>> 7.7.1 and 7.7.2 was to reduce the memory consumption. I propose to add the 
>> following (or replace the existing entry, since it only affects the test):
>> 
>> * SOLR-12833: Avoid unnecessary memory cost when DistributedUpdateProcessor 
>> timed-out lock is not used. (ab)
>> 
>>> On 17 May 2019, at 13:49, Jan Høydahl wrote:
>>> 
>>> We still don't have a successful Jenkins Lucene-Solr-SmokeRelease-7.7 build, 
>>> and I'm trying to get a successful local build.
>>> So I'm OK to delay 7.7.2 RC until after 8.1.1 vote if you are ready to spin 
>>> first :)
>>> 
>>> --
>>> Jan Høydahl, search solution architect
>>> Cominvent AS - www.cominvent.com 
>>> 
 On 17 May 2019, at 09:09, Ishan Chattopadhyaya <ichattopadhy...@gmail.com> wrote:
 
 I'm okay with the timelines, if Jan is okay too. If we do this release 
 first, the sooner we can unlock him, the better. (Can we cut an RC today?).
 
 On Thu, 16 May 2019, 10:33 PM Ishan Chattopadhyaya <ichattopadhy...@gmail.com> wrote:
 Thanks Andrzej. I recommend you get all access (Jira, wiki, etc.) and complete 
 all GPG key prerequisites beforehand. I remember wasting a day on those 
 when I did my first release.
 
 On Thu, 16 May 2019, 10:24 PM Andrzej Białecki wrote:
 Hi,
 
 Right after the 8.1.0 release was published, we discovered a 
 serious bug in the way aliases are handled in the Admin UI (in some 
 cases leading to an NPE when using the API). Details of the bug can be found in 
 SOLR-13475.
 
 I’m volunteering to do this release, if there are no objections, and I’d 
 like to create an RC early next week (I need to get up to speed on the 
 release process ;) ).
 
 —
 
 Andrzej Białecki
 
 
 -
 To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org 
 
 For additional commands, e-mail: dev-h...@lucene.apache.org 
 
 
>>> 
>> 
> 



Moving the Lucene / Solr build system from Ant+Ivy+MavenShadowBuild to Gradle.

2019-05-19 Thread Mark Miller
Hey all,

I know a few people expressed interest in helping out with moving the build
to gradle.

Things are starting to shape up pretty nicely, so probably a good time to
help wrap up the long tail of little bits. If you are interested and want a
somewhat isolated and tractable task, hit me up.

If you are less inclined or able to pitch in in that way, I could certainly
use some feedback from casual devs downloading and trying things out.

Let me know, for example, if everything immediately brings your system to a
crawl. I've enabled some parallelism by default, tuned for a pretty beefy CPU,
and we will need to start working towards better defaults across more developers.

JIRA
SOLR-13452: Update the lucene-solr build from Ivy+Ant+Maven (shadow build)
to Gradle.
https://issues.apache.org/jira/browse/SOLR-13452

Source
https://github.com/apache/lucene-solr/tree/jira/SOLR-13452_gradle

-- 
- Mark

http://about.me/markrmiller


[jira] [Commented] (SOLR-13437) fork noggit code to Solr

2019-05-19 Thread JIRA


[ 
https://issues.apache.org/jira/browse/SOLR-13437?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16843520#comment-16843520
 ] 

Jan Høydahl commented on SOLR-13437:


documentation-lint fails due to this 
([https://jenkins.thetaphi.de/job/Lucene-Solr-8.x-Solaris/133/])
{noformat}
-documentation-lint:
   [jtidy] Checking for broken html (such as invalid tags)...
  [delete] Deleting directory 
/export/home/jenkins/workspace/Lucene-Solr-8.x-Solaris/lucene/build/jtidy_tmp
[echo] Checking for broken links...
[exec] 
[exec] Crawl/parse...
[exec] 
[exec] Verify...
[echo] Checking for malformed docs...
[exec] 
[exec] 
/export/home/jenkins/workspace/Lucene-Solr-8.x-Solaris/solr/build/docs/solr-solrj/overview-summary.html
[exec]   missing description: org.noggit
[exec] 
[exec] Missing javadocs were found!

{noformat}
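Javadoc lint failures like "missing description: org.noggit" are normally fixed by giving the package a package-level javadoc description. A minimal sketch of such a fix, assuming the fork keeps the org.noggit package name (the description text below is hypothetical, not the actual commit):

```java
// package-info.java for the forked sources.
// Hypothetical description text; the real fix may word this differently.
/**
 * Forked copy of the noggit JSON parser/serializer used by SolrJ.
 */
package org.noggit;
```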

> fork noggit code to Solr
> 
>
> Key: SOLR-13437
> URL: https://issues.apache.org/jira/browse/SOLR-13437
> Project: Solr
>  Issue Type: Task
>  Security Level: Public(Default Security Level. Issues are Public) 
>  Components: SolrJ
>Reporter: Noble Paul
>Assignee: Noble Paul
>Priority: Major
>  Time Spent: 1h 50m
>  Remaining Estimate: 0h
>
> We rely on noggit for all our JSON encoding/decoding needs. The main project 
> is not actively maintained. We cannot easily switch to another parser 
> because that could break backward compatibility: we have advertised the 
> ability to use flexible JSON, and we also use noggit internally in many classes.






[jira] [Commented] (SOLR-13452) Update the lucene-solr build from Ivy+Ant+Maven (shadow build) to Gradle.

2019-05-19 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/SOLR-13452?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16843513#comment-16843513
 ] 

ASF subversion and git services commented on SOLR-13452:


Commit ce644f4d0dcea307dd682b99307d71a5618faf7d in lucene-solr's branch 
refs/heads/jira/SOLR-13452_gradle from Mark Robert Miller
[ https://gitbox.apache.org/repos/asf?p=lucene-solr.git;h=ce644f4 ]

SOLR-13452: Fix forbiddenapis use of endsWith to matches.


> Update the lucene-solr build from Ivy+Ant+Maven (shadow build) to Gradle.
> -
>
> Key: SOLR-13452
> URL: https://issues.apache.org/jira/browse/SOLR-13452
> Project: Solr
>  Issue Type: Improvement
>  Security Level: Public(Default Security Level. Issues are Public) 
>  Components: Build
>Reporter: Mark Miller
>Priority: Major
>
> I took some things from the great work that Dat did in 
> [https://github.com/apache/lucene-solr/tree/jira/gradle] and took the ball a 
> little further.
>  
> When working with gradle in sub modules directly, I recommend 
> [https://github.com/dougborg/gdub]
> This gradle branch uses the following plugin for version locking, version 
> configuration and version consistency across modules: 
> [https://github.com/palantir/gradle-consistent-versions]
> By default, dependencies are not transitive, but there is a special 
> Configuration for adding dependencies on other project internal modules that 
> are transitive to their direct external dependencies (their jar libs).
>  






[jira] [Commented] (SOLR-13452) Update the lucene-solr build from Ivy+Ant+Maven (shadow build) to Gradle.

2019-05-19 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/SOLR-13452?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16843512#comment-16843512
 ] 

ASF subversion and git services commented on SOLR-13452:


Commit fc2abe85b6f3956c731feac761d3bc42152cdf6d in lucene-solr's branch 
refs/heads/jira/SOLR-13452_gradle from Mark Robert Miller
[ https://gitbox.apache.org/repos/asf?p=lucene-solr.git;h=fc2abe8 ]

SOLR-13452: A little cleanup around forbiddenapis and get apache rat running on 
src directories (more to cover with rat).


> Update the lucene-solr build from Ivy+Ant+Maven (shadow build) to Gradle.
> -
>
> Key: SOLR-13452
> URL: https://issues.apache.org/jira/browse/SOLR-13452
> Project: Solr
>  Issue Type: Improvement
>  Security Level: Public(Default Security Level. Issues are Public) 
>  Components: Build
>Reporter: Mark Miller
>Priority: Major
>
> I took some things from the great work that Dat did in 
> [https://github.com/apache/lucene-solr/tree/jira/gradle] and took the ball a 
> little further.
>  
> When working with gradle in sub modules directly, I recommend 
> [https://github.com/dougborg/gdub]
> This gradle branch uses the following plugin for version locking, version 
> configuration and version consistency across modules: 
> [https://github.com/palantir/gradle-consistent-versions]
> By default, dependencies are not transitive, but there is a special 
> Configuration for adding dependencies on other project internal modules that 
> are transitive to their direct external dependencies (their jar libs).
>  






[jira] [Commented] (SOLR-13452) Update the lucene-solr build from Ivy+Ant+Maven (shadow build) to Gradle.

2019-05-19 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/SOLR-13452?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16843511#comment-16843511
 ] 

ASF subversion and git services commented on SOLR-13452:


Commit c0f3533dabcea9014f9f96a1b33007f27863d706 in lucene-solr's branch 
refs/heads/jira/SOLR-13452_gradle from Mark Robert Miller
[ https://gitbox.apache.org/repos/asf?p=lucene-solr.git;h=c0f3533 ]

SOLR-13452: Fix runjflex description.


> Update the lucene-solr build from Ivy+Ant+Maven (shadow build) to Gradle.
> -
>
> Key: SOLR-13452
> URL: https://issues.apache.org/jira/browse/SOLR-13452
> Project: Solr
>  Issue Type: Improvement
>  Security Level: Public(Default Security Level. Issues are Public) 
>  Components: Build
>Reporter: Mark Miller
>Priority: Major
>
> I took some things from the great work that Dat did in 
> [https://github.com/apache/lucene-solr/tree/jira/gradle] and took the ball a 
> little further.
>  
> When working with gradle in sub modules directly, I recommend 
> [https://github.com/dougborg/gdub]
> This gradle branch uses the following plugin for version locking, version 
> configuration and version consistency across modules: 
> [https://github.com/palantir/gradle-consistent-versions]
> By default, dependencies are not transitive, but there is a special 
> Configuration for adding dependencies on other project internal modules that 
> are transitive to their direct external dependencies (their jar libs).
>  






[jira] [Commented] (LUCENE-8805) Parameter changes for binaryField() and stringField() in StoredFieldVisitor

2019-05-19 Thread Namgyu Kim (JIRA)


[ 
https://issues.apache.org/jira/browse/LUCENE-8805?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16843485#comment-16843485
 ] 

Namgyu Kim commented on LUCENE-8805:


Thank you for your reply, and I'm sorry for the late reply. [~rcmuir]
I will upload a new patch within a few days, based on your feedback
(parameter checking, creating a TC, etc.).

> Parameter changes for binaryField() and stringField() in StoredFieldVisitor
> ---
>
> Key: LUCENE-8805
> URL: https://issues.apache.org/jira/browse/LUCENE-8805
> Project: Lucene - Core
>  Issue Type: Improvement
>Reporter: Namgyu Kim
>Priority: Major
> Attachments: LUCENE-8805.patch
>
>
> I wrote this patch after seeing the comments left by [~mikemccand] when 
> SortingStoredFieldsConsumer class was first created.
> {code:java}
> @Override
> public void binaryField(FieldInfo fieldInfo, byte[] value) throws IOException 
> {
>   ...
>   // TODO: can we avoid new BR here?
>   ...
> }
> @Override
> public void stringField(FieldInfo fieldInfo, byte[] value) throws IOException 
> {
>   ...
>   // TODO: can we avoid new String here?
>   ...
> }
> {code}
> I changed two things.
> 1) change binaryField() parameters from byte[] to BytesRef.
> 2) change stringField() parameters from byte[] to String.
> I also changed the related contents while doing the work.
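To make the proposed signature change concrete, here is a minimal, self-contained sketch. The FieldInfo and visitor types below are simplified stand-ins, not the real Lucene classes: with the old byte[] signature every visitor must decode the bytes itself, while with the new String signature the codec can decode once and pass the result through.

```java
import java.nio.charset.StandardCharsets;

// Simplified stand-in for Lucene's FieldInfo.
class FieldInfo {
  final String name;
  FieldInfo(String name) { this.name = name; }
}

interface OldVisitor {
  // Old shape: every visitor receives raw bytes and decodes them itself.
  void stringField(FieldInfo info, byte[] value);
}

interface NewVisitor {
  // Proposed shape: the codec decodes once and hands visitors a String.
  void stringField(FieldInfo info, String value);
}

public class VisitorSketch {
  static String lastSeen;

  public static void main(String[] args) {
    byte[] raw = "hello".getBytes(StandardCharsets.UTF_8);

    OldVisitor before = (info, value) ->
        lastSeen = new String(value, StandardCharsets.UTF_8); // per-visitor decode
    before.stringField(new FieldInfo("title"), raw);
    assert "hello".equals(lastSeen);

    NewVisitor after = (info, value) -> lastSeen = value; // no decode needed
    after.stringField(new FieldInfo("title"), new String(raw, StandardCharsets.UTF_8));
    assert "hello".equals(lastSeen);
  }
}
```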






[jira] [Commented] (LUCENE-8689) Boolean DocValues Codec Implementation

2019-05-19 Thread Erick Erickson (JIRA)


[ 
https://issues.apache.org/jira/browse/LUCENE-8689?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16843482#comment-16843482
 ] 

Erick Erickson commented on LUCENE-8689:


I haven't looked at the patch, and in particular at the docValues handling for 
boolean fields, in a _long_ time, so this may be totally off base. I wanted to 
ask how back-compat is handled, particularly if there are old-style and 
new-style segments in the index. There are still lines like:

 
{code}
@Override
public String toInternal(String val) {
  char ch = (val != null && val.length() > 0) ? val.charAt(0) : 0;
  return (ch == '1' || ch == 't' || ch == 'T') ? "T" : "F";
}
{code}
 

in BoolField for instance. 

> Boolean DocValues Codec Implementation
> --
>
> Key: LUCENE-8689
> URL: https://issues.apache.org/jira/browse/LUCENE-8689
> Project: Lucene - Core
>  Issue Type: Improvement
>  Components: core/codecs
>Reporter: Ivan Mamontov
>Priority: Minor
>  Labels: patch, performance
> Attachments: LUCENE-8689.patch, LUCENE-8689.patch, 
> SynteticDocValuesBench70.java, SynteticDocValuesBench80.java, 
> benchmark_dense.png, boolean_vs_dense_vs_sparse_indexing.png, 
> boolean_vs_dense_vs_sparse_updates.png, dense_vs_sparse_querying.png, 
> results2.png
>
>
> To avoid issues where some products become available/unavailable at some 
> point in time after being out-of-stock, e-commerce search system designers 
> need to embed up-to-date information about inventory availability right into 
> the search engines. Key requirement is to be able to accurately filter out 
> unavailable products and use availability as one of ranking signals. However, 
> keeping availability data up-to-date is a non-trivial task. A straightforward 
> implementation based on partial updates of Lucene documents causes Solr 
> cache thrashing, which hurts query performance and resource 
> utilization.
>  As an alternative we can use DocValues and their built-in in-place 
> updates, where field values can be updated independently without touching the 
> inverted index; while filtering by DocValues is a bit slower, the overall 
> performance gain is better. However, the existing long-based docValues are not 
> sufficiently optimized for carrying boolean inventory availability data:
>  * All DocValues queries are internally rewritten into 
> org.apache.lucene.search.DocValuesNumbersQuery, which directly iterates 
> over all column values and is typically much slower than using 
> TermsQuery.
>  * On every commit/merge the codec has to iterate over DocValues several times 
> to choose the best compression algorithm for the given data. As 
> a result, with 4K fields and 3M max docs a merge takes more than 10 minutes.
> This issue is intended to solve these limitations via a special bitwise doc 
> values format that uses the internal representation of 
> org.apache.lucene.util.FixedBitSet to store indexed values and load 
> them at search time as a simple long array without additional decoding. There 
> are several reasons for this:
>  * At index time encoding is very fast, with no superfluous iterations over 
> all values to choose the best compression algorithm for the given data.
>  * At query time decoding is also simple and fast, with no GC pressure or extra 
> steps.
>  * The internal representation allows random access in constant time.
> Limitations are:
>  * Does not support non-boolean fields.
>  * Boolean fields must be represented as long values: 1 for true and 0 for 
> false.
>  * The current implementation does not support advanced bit set formats like 
> org.apache.lucene.util.SparseFixedBitSet or 
> org.apache.lucene.util.RoaringDocIdSet.
> In order to evaluate the performance gain I've written a simple JMH based 
> benchmark [^SynteticDocValuesBench70.java] which allows estimating the relative 
> cost of DV filters. This benchmark creates 2 000 000 documents with 5 boolean 
> columns of different density, where 10, 35, 50, 60 and 90 percent of the 
> documents have the value 1. Each method enumerates all values of a synthetic 
> store field in all available ways:
>  * baseline – in almost all cases Solr uses a FixedBitSet in the filter cache 
> to keep store availability. This test just iterates over all bits.
>  * docValuesRaw – iterates over all values of the DV column; the same code is 
> used in "post filtering", sorting and faceting.
>  * docValuesNumbersQuery – iterates over all values produced by the query/filter 
> store:1; currently the only query implementation for DV-based fields is 
> DocValuesNumbersQuery, so Lucene rewrites all term, range and 
> filter queries on a non-indexed field into this fallback implementation.
>  * docValuesBooleanQuery – optimized variant of DocValuesNumbersQuery, which 
> support only two
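The storage idea described above can be sketched in a few lines: a toy, self-contained version of a FixedBitSet-style boolean column. This is an illustration of the technique (one bit per document packed into a long[], so a lookup is a shift and a mask with no decoding step), not the patch's actual codec code.

```java
// Toy sketch of FixedBitSet-style boolean storage: one bit per document
// packed into a long[], giving constant-time random access.
public class BooleanColumn {
  private final long[] bits;
  private final int numDocs;

  public BooleanColumn(int numDocs) {
    this.numDocs = numDocs;
    this.bits = new long[(numDocs + 63) >>> 6]; // 64 documents per word
  }

  public void set(int docId, boolean value) {
    if (docId < 0 || docId >= numDocs) throw new IndexOutOfBoundsException();
    long mask = 1L << (docId & 63);
    if (value) {
      bits[docId >>> 6] |= mask;
    } else {
      bits[docId >>> 6] &= ~mask;
    }
  }

  // Constant-time random access, the key property called out above:
  // one shift, one mask, no per-value decoding.
  public boolean get(int docId) {
    if (docId < 0 || docId >= numDocs) throw new IndexOutOfBoundsException();
    return (bits[docId >>> 6] & (1L << (docId & 63))) != 0;
  }
}
```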

[JENKINS] Lucene-Solr-SmokeRelease-master - Build # 1339 - Still Failing

2019-05-19 Thread Apache Jenkins Server
Build: https://builds.apache.org/job/Lucene-Solr-SmokeRelease-master/1339/

No tests ran.

Build Log:
[...truncated 23471 lines...]
[asciidoctor:convert] asciidoctor: ERROR: about-this-guide.adoc: line 1: 
invalid part, must have at least one section (e.g., chapter, appendix, etc.)
[asciidoctor:convert] asciidoctor: ERROR: solr-glossary.adoc: line 1: invalid 
part, must have at least one section (e.g., chapter, appendix, etc.)
 [java] Processed 2531 links (2070 relative) to 3360 anchors in 253 files
 [echo] Validated Links & Anchors via: 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/solr/build/solr-ref-guide/bare-bones-html/

-dist-changes:
 [copy] Copying 4 files to 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/solr/package/changes

package:

-unpack-solr-tgz:

-ensure-solr-tgz-exists:
[mkdir] Created dir: 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/solr/build/solr.tgz.unpacked
[untar] Expanding: 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/solr/package/solr-9.0.0.tgz
 into 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/solr/build/solr.tgz.unpacked

generate-maven-artifacts:

resolve:

resolve:

ivy-availability-check:
[loadresource] Do not set property disallowed.ivy.jars.list as its length is 0.

-ivy-fail-disallowed-ivy-version:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/lucene/top-level-ivy-settings.xml


[JENKINS] Lucene-Solr-Tests-7.7 - Build # 21 - Unstable

2019-05-19 Thread Apache Jenkins Server
Build: https://builds.apache.org/job/Lucene-Solr-Tests-7.7/21/

2 tests failed.
FAILED:  org.apache.solr.cloud.OverseerTest.testOverseerFailure

Error Message:
Test abandoned because suite timeout was reached.

Stack Trace:
java.lang.Exception: Test abandoned because suite timeout was reached.
at __randomizedtesting.SeedInfo.seed([A772547C3688179A]:0)


FAILED:  junit.framework.TestSuite.org.apache.solr.cloud.OverseerTest

Error Message:
Suite timeout exceeded (>= 7200000 msec).

Stack Trace:
java.lang.Exception: Suite timeout exceeded (>= 7200000 msec).
at __randomizedtesting.SeedInfo.seed([A772547C3688179A]:0)




Build Log:
[...truncated 15753 lines...]
   [junit4] Suite: org.apache.solr.cloud.OverseerTest
   [junit4]   2> Creating dataDir: 
/home/jenkins/jenkins-slave/workspace/Lucene-Solr-Tests-7.7/solr/build/solr-core/test/J1/temp/solr.cloud.OverseerTest_A772547C3688179A-001/init-core-data-001
   [junit4]   2> 3004274 WARN  
(SUITE-OverseerTest-seed#[A772547C3688179A]-worker) [] o.a.s.SolrTestCaseJ4 
startTrackingSearchers: numOpens=3 numCloses=3
   [junit4]   2> 3004274 INFO  
(SUITE-OverseerTest-seed#[A772547C3688179A]-worker) [] o.a.s.SolrTestCaseJ4 
Using PointFields (NUMERIC_POINTS_SYSPROP=true) 
w/NUMERIC_DOCVALUES_SYSPROP=false
   [junit4]   2> 3004276 INFO  
(SUITE-OverseerTest-seed#[A772547C3688179A]-worker) [] o.a.s.SolrTestCaseJ4 
Randomized ssl (true) and clientAuth (true) via: 
@org.apache.solr.util.RandomizeSSL(reason=, ssl=NaN, value=NaN, clientAuth=NaN)
   [junit4]   2> 3004276 INFO  
(SUITE-OverseerTest-seed#[A772547C3688179A]-worker) [] o.a.s.SolrTestCaseJ4 
SecureRandom sanity checks: test.solr.allowed.securerandom=null & 
java.security.egd=file:/dev/./urandom
   [junit4]   2> 3004277 INFO  
(SUITE-OverseerTest-seed#[A772547C3688179A]-worker) [] o.a.s.c.ZkTestServer 
STARTING ZK TEST SERVER
   [junit4]   2> 3004311 INFO  (ZkTestServer Run Thread) [] 
o.a.s.c.ZkTestServer client port:0.0.0.0/0.0.0.0:0
   [junit4]   2> 3004311 INFO  (ZkTestServer Run Thread) [] 
o.a.s.c.ZkTestServer Starting server
   [junit4]   2> 3004411 INFO  
(SUITE-OverseerTest-seed#[A772547C3688179A]-worker) [] o.a.s.c.ZkTestServer 
start zk server on port:33408
   [junit4]   2> 3004411 INFO  
(SUITE-OverseerTest-seed#[A772547C3688179A]-worker) [] o.a.s.c.ZkTestServer 
parse host and port list: 127.0.0.1:33408
   [junit4]   2> 3004411 INFO  
(SUITE-OverseerTest-seed#[A772547C3688179A]-worker) [] o.a.s.c.ZkTestServer 
connecting to 127.0.0.1 33408
   [junit4]   2> 3004414 INFO  (zkConnectionManagerCallback-10095-thread-1) [   
 ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 3004416 INFO  (zkConnectionManagerCallback-10097-thread-1) [   
 ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 3004416 INFO  
(SUITE-OverseerTest-seed#[A772547C3688179A]-worker) [] o.a.s.SolrTestCaseJ4 
initCore
   [junit4]   2> 3004416 INFO  
(SUITE-OverseerTest-seed#[A772547C3688179A]-worker) [] o.a.s.SolrTestCaseJ4 
initCore end
   [junit4]   2> 3004421 INFO  
(TEST-OverseerTest.testLatchWatcher-seed#[A772547C3688179A]) [] 
o.a.s.SolrTestCaseJ4 ###Starting testLatchWatcher
   [junit4]   2> 3004524 INFO  
(TEST-OverseerTest.testLatchWatcher-seed#[A772547C3688179A]) [] 
o.a.s.SolrTestCaseJ4 ###Ending testLatchWatcher
   [junit4]   2> 3004528 INFO  
(TEST-OverseerTest.testStateChange-seed#[A772547C3688179A]) [] 
o.a.s.SolrTestCaseJ4 ###Starting testStateChange
   [junit4]   2> 3004684 INFO  (zkConnectionManagerCallback-10103-thread-1) [   
 ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 3004702 INFO  (zkConnectionManagerCallback-10109-thread-1) [   
 ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 3004734 INFO  (zkConnectionManagerCallback-10114-thread-1) [   
 ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 3004748 INFO  
(TEST-OverseerTest.testStateChange-seed#[A772547C3688179A]) [] 
o.a.s.c.s.i.ZkClientClusterStateProvider Cluster at 127.0.0.1:33408/solr ready
   [junit4]   2> 3004785 INFO  
(TEST-OverseerTest.testStateChange-seed#[A772547C3688179A]) [] 
o.a.s.c.OverseerElectionContext I am going to be the leader 127.0.0.1:33408_solr
   [junit4]   2> 3004785 INFO  
(TEST-OverseerTest.testStateChange-seed#[A772547C3688179A]) [] 
o.a.s.c.Overseer Overseer 
(id=74718723519610882-127.0.0.1:33408_solr-n_00) starting
   [junit4]   2> 3004819 INFO  
(OverseerStateUpdate-74718723519610882-127.0.0.1:33408_solr-n_00) [
] o.a.s.c.Overseer Starting to work on the main queue : 127.0.0.1:33408_solr
   [junit4]   2> 3004835 DEBUG 
(OverseerStateUpdate-74718723519610882-127.0.0.1:33408_solr-n_00) [
] o.a.s.c.Overseer processMessage: queueSize: 2, message = {
   [junit4]   2>   "operation":"create",
   [junit4]   2>   "name":"collection1",
   [junit4]   2>   "replicationFactor":"1",
   [junit4]   2>   "numShards":"1",

[jira] [Commented] (LUCENE-4012) Make all query classes serializable, and provide a query parser to consume them

2019-05-19 Thread Mike Sokolov (JIRA)


[ 
https://issues.apache.org/jira/browse/LUCENE-4012?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16843445#comment-16843445
 ] 

Mike Sokolov commented on LUCENE-4012:
--

I want to hijack this issue to be about making Query serializable by any means 
necessary. The idea of using Jackson seemed like it could be problematic, since 
it tends to expose implementation details (constructor signatures, for 
example), but the idea of query serialization is powerful, and we should have 
it in our bag of tricks. A whole class of optimizations stems from analysis of 
query logs, and in order to treat queries as data we need a persistent form for 
them (not just in-memory Java Query objects).

It seems like we have a good angle of attack now that LUCENE-3041 has landed, 
adding a QueryVisitor. My thought is that each query parser could potentially 
come with a serializer that serializes queries into its language, since not 
every parser can represent every query type. Or maybe the XML query parser is 
truly general and handles everything, so there is no need for any other flavor? 
I'm not sure, though; I seem to recall it has some gaps as well.

I worked up a POC that serializes combinations of BooleanQuery and TermQuery 
into a form that is parseable by the classic query parser, and I think it can 
be extended fairly easily to cover most query types. I have a question here: to 
get it to work, it seemed as if I needed to make BooleanQuery.visit call 
getSubVisitor for every clause (rather than once for each occur value). This 
broke a single test in TestQueryVisitor, though, which asserts something about 
the sequence of these calls, and I'm not sure whether that assertion is an 
invariant of the QueryVisitor contract or simply a byproduct of the 
implementation. [~romseygeek] can you shed some light? I can post a WIP PR if 
that would help clarify.
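To illustrate the shape of such a POC, here is a self-contained sketch of visitor-style serialization into classic-query-parser syntax. The `Query`, `TermQuery` and `BoolQuery` classes below are simplified stand-ins invented for this example, not Lucene's actual API; a real implementation would build on org.apache.lucene.search.QueryVisitor from LUCENE-3041.

```java
import java.util.ArrayList;
import java.util.List;

// Stand-in query tree; real code would walk org.apache.lucene.search.Query
// instances via the QueryVisitor added in LUCENE-3041.
interface Query { void accept(StringBuilder out); }

class TermQuery implements Query {
    final String field, term;
    TermQuery(String field, String term) { this.field = field; this.term = term; }
    public void accept(StringBuilder out) { out.append(field).append(':').append(term); }
}

class BoolQuery implements Query {
    // occur markers in classic syntax: '+' = MUST, '-' = MUST_NOT, ' ' = SHOULD
    final List<Character> occurs = new ArrayList<>();
    final List<Query> clauses = new ArrayList<>();
    BoolQuery add(char occur, Query q) { occurs.add(occur); clauses.add(q); return this; }
    public void accept(StringBuilder out) {
        out.append('(');
        for (int i = 0; i < clauses.size(); i++) {
            if (i > 0) out.append(' ');
            if (occurs.get(i) != ' ') out.append(occurs.get(i));
            clauses.get(i).accept(out);  // recurse into each clause
        }
        out.append(')');
    }
}

public class QuerySerializerSketch {
    public static void main(String[] args) {
        Query q = new BoolQuery()
            .add('+', new TermQuery("title", "lucene"))
            .add('-', new TermQuery("status", "deleted"))
            .add(' ', new TermQuery("body", "search"));
        StringBuilder out = new StringBuilder();
        q.accept(out);
        // Prints a classic-query-parser-compatible form of the tree above:
        // (+title:lucene -status:deleted body:search)
        System.out.println(out);
    }
}
```

The round trip works because the classic parser can re-read this string back into an equivalent query tree, which is exactly the "persistent form" argument above.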

> Make all query classes serializable, and provide a query parser to consume 
> them
> ---
>
> Key: LUCENE-4012
> URL: https://issues.apache.org/jira/browse/LUCENE-4012
> Project: Lucene - Core
>  Issue Type: New Feature
>  Components: core/queryparser
>Affects Versions: 4.0-ALPHA
>Reporter: Benson Margulies
>Priority: Major
> Attachments: bq.patch
>
>
> I started off on LUCENE-4004 wanting to use DisjunctionMaxQuery via a parser. 
> However, this wasn't really because I thought that human beans should be 
> improvisationally composing such things. My real goal was to concoct a query 
> tree over *here*, and then serialize it to send to Solr over *there*. 
> It occurs to me that if the XML parser is pretty good for this, JSON would be 
> better. It further occurs to me that the query classes may already all work 
> with Jackson, and, if they don't, the required tweaks will be quite small. By 
> allowing Jackson to write out class names as needed, you get the ability to 
> serialize *any* query, so long as the other side has the classes on its 
> classpath. A trifle verbose, but not as verbose as XML, and furthermore 
> squishable (though not in a URL) via SMILE or BSON.
> So, the goal of this JIRA is to accumulate tweaks to the query classes to 
> make them more 'bean pattern'. An alternative would be Jackson annotations. 
> However, I suspect that folks would be happier to minimize the level of 
> coupling here; in the extreme, the trivial parser could live in contrib if no 
> one wants a dependency, even an optional one, on Jackson itself.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Updated] (LUCENE-4012) Make all query classes serializable, and provide a query parser to consume them

2019-05-19 Thread Mike Sokolov (JIRA)


 [ 
https://issues.apache.org/jira/browse/LUCENE-4012?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Mike Sokolov updated LUCENE-4012:
-
Summary: Make all query classes serializable, and provide a query parser to 
consume them  (was: Make all query classes serializable with Jackson, and 
provide a trivial query parser to consume them)




[JENKINS] Lucene-Solr-8.x-Solaris (64bit/jdk1.8.0) - Build # 134 - Still Failing!

2019-05-19 Thread Policeman Jenkins Server
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-8.x-Solaris/134/
Java: 64bit/jdk1.8.0 -XX:-UseCompressedOops -XX:+UseConcMarkSweepGC

All tests passed

Build Log:
[...truncated 65695 lines...]
[asciidoctor:convert] asciidoctor: ERROR: about-this-guide.adoc: line 1: 
invalid part, must have at least one section (e.g., chapter, appendix, etc.)
[asciidoctor:convert] asciidoctor: ERROR: solr-glossary.adoc: line 1: invalid 
part, must have at least one section (e.g., chapter, appendix, etc.)
 [java] Processed 2531 links (2070 relative) to 3359 anchors in 253 files
 [echo] Validated Links & Anchors via: 
/export/home/jenkins/workspace/Lucene-Solr-8.x-Solaris/solr/build/solr-ref-guide/bare-bones-html/

-documentation-lint:
[jtidy] Checking for broken html (such as invalid tags)...
   [delete] Deleting directory 
/export/home/jenkins/workspace/Lucene-Solr-8.x-Solaris/lucene/build/jtidy_tmp
 [echo] Checking for broken links...
 [exec] 
 [exec] Crawl/parse...
 [exec] 
 [exec] Verify...
 [echo] Checking for malformed docs...
 [exec] 
 [exec] 
/export/home/jenkins/workspace/Lucene-Solr-8.x-Solaris/solr/build/docs/solr-solrj/overview-summary.html
 [exec]   missing description: org.noggit
 [exec] 
 [exec] Missing javadocs were found!

BUILD FAILED
/export/home/jenkins/workspace/Lucene-Solr-8.x-Solaris/build.xml:634: The 
following error occurred while executing this line:
/export/home/jenkins/workspace/Lucene-Solr-8.x-Solaris/build.xml:101: The 
following error occurred while executing this line:
/export/home/jenkins/workspace/Lucene-Solr-8.x-Solaris/solr/build.xml:660: The 
following error occurred while executing this line:
/export/home/jenkins/workspace/Lucene-Solr-8.x-Solaris/solr/build.xml:676: The 
following error occurred while executing this line:
/export/home/jenkins/workspace/Lucene-Solr-8.x-Solaris/lucene/common-build.xml:2530:
 exec returned: 1

Total time: 122 minutes 22 seconds
Build step 'Invoke Ant' marked build as failure
Archiving artifacts
Setting 
ANT_1_8_2_HOME=/export/home/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
[WARNINGS] Skipping publisher since build result is FAILURE
Recording test results
Setting 
ANT_1_8_2_HOME=/export/home/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any
Setting 
ANT_1_8_2_HOME=/export/home/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
Setting 
ANT_1_8_2_HOME=/export/home/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
Setting 
ANT_1_8_2_HOME=/export/home/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
Setting 
ANT_1_8_2_HOME=/export/home/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
Setting 
ANT_1_8_2_HOME=/export/home/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
Setting 
ANT_1_8_2_HOME=/export/home/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2


[jira] [Commented] (LUCENE-8689) Boolean DocValues Codec Implementation

2019-05-19 Thread Dmitry Popov (JIRA)


[ 
https://issues.apache.org/jira/browse/LUCENE-8689?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16843408#comment-16843408
 ] 

Dmitry Popov commented on LUCENE-8689:
--

Also, I have found that the Lucene80 codec and DocValuesNumbersQuery show 
different performance depending on the compression used, dense or sparse.

!dense_vs_sparse_querying.png|width=1120,height=591!

But _Lucene80DocValuesProducer$DenseNumericDocValues_ is activated iff all 
indexed documents have values for the specified field. There is a condition 
check for that in _Lucene80DocValuesConsumer#writeValues()_:
{quote}if (numDocsWithValue == maxDoc) {  // meta[-1, 0]: All documents has 
values
{quote}
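The effect of that check can be sketched as follows. The names below are hypothetical stand-ins for this illustration; the real selection logic lives in Lucene80DocValuesConsumer#writeValues and handles more cases. The point is that a single document missing the field flips the whole segment from the dense to the sparse encoding:

```java
public class DocValuesDensitySketch {
    enum Encoding { DENSE, SPARSE }

    // Mirrors the quoted condition: the dense encoding is used iff
    // numDocsWithValue == maxDoc, i.e. no document is missing the field.
    static Encoding chooseEncoding(int numDocsWithValue, int maxDoc) {
        return numDocsWithValue == maxDoc ? Encoding.DENSE : Encoding.SPARSE;
    }

    public static void main(String[] args) {
        // Every document has a value: dense path.
        System.out.println(chooseEncoding(3_000_000, 3_000_000)); // DENSE
        // One missing value is enough to fall back to sparse.
        System.out.println(chooseEncoding(2_999_999, 3_000_000)); // SPARSE
    }
}
```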

> Boolean DocValues Codec Implementation
> --
>
> Key: LUCENE-8689
> URL: https://issues.apache.org/jira/browse/LUCENE-8689
> Project: Lucene - Core
>  Issue Type: Improvement
>  Components: core/codecs
>Reporter: Ivan Mamontov
>Priority: Minor
>  Labels: patch, performance
> Attachments: LUCENE-8689.patch, LUCENE-8689.patch, 
> SynteticDocValuesBench70.java, SynteticDocValuesBench80.java, 
> benchmark_dense.png, boolean_vs_dense_vs_sparse_indexing.png, 
> boolean_vs_dense_vs_sparse_updates.png, dense_vs_sparse_querying.png, 
> results2.png
>
>
> To avoid issues where some products become available/unavailable at some 
> point in time after being out-of-stock, e-commerce search system designers 
> need to embed up-to-date information about inventory availability right into 
> the search engine. The key requirement is to be able to accurately filter out 
> unavailable products and to use availability as one of the ranking signals. 
> However, keeping availability data up-to-date is a non-trivial task. A 
> straightforward implementation based on partial updates of Lucene documents 
> causes Solr cache thrashing, which hurts query performance and resource 
> utilization.
>  As an alternative, we can use DocValues and the built-in in-place updates, 
> where field values can be updated independently without touching the 
> inverted index; while filtering by DocValues is a bit slower, the overall 
> performance gain is better. However, the existing long-based DocValues are 
> not sufficiently optimized for carrying boolean inventory availability data:
>  * All DocValues queries are internally rewritten into 
> org.apache.lucene.search.DocValuesNumbersQuery, which is based on direct 
> iteration over all column values and is typically much slower than using 
> TermsQuery.
>  * On every commit/merge the codec has to iterate over the DocValues a 
> couple of times in order to choose the best compression algorithm for the 
> given data. As a result, with 4K fields and a 3M maxDoc, a merge takes more 
> than 10 minutes.
> This issue is intended to remove these limitations via a special bitwise doc 
> values format that uses the internal representation of 
> org.apache.lucene.util.FixedBitSet to store indexed values and load them at 
> search time as a simple long array without additional decoding. There are 
> several reasons for this:
>  * At index time, encoding is very fast, with no superfluous iterations over 
> all values to choose the best compression algorithm for the given data.
>  * At query time, decoding is also simple and fast: no GC pressure and no 
> extra steps.
>  * The internal representation allows random access in constant time.
> Limitations are:
>  * Does not support non-boolean fields.
>  * Boolean fields must be represented as long values: 1 for true and 0 for 
> false.
>  * The current implementation does not support advanced bit set formats like 
> org.apache.lucene.util.SparseFixedBitSet or 
> org.apache.lucene.util.RoaringDocIdSet.
> In order to evaluate the performance gain I've written a simple JMH-based 
> benchmark [^SynteticDocValuesBench70.java] which allows estimating the 
> relative cost of DV filters. The benchmark creates 2 000 000 documents with 
> 5 boolean columns of different density, where 10, 35, 50, 60 and 90 are the 
> percentages of documents with value 1. Each method enumerates all values of 
> a synthetic store field in all the available ways:
>  * baseline – in almost all cases Solr uses a FixedBitSet in the filter 
> cache to keep store availability. This test just iterates over all bits.
>  * docValuesRaw – iterates over all values of the DV column; the same code 
> is used in "post filtering", sorting and faceting.
>  * docValuesNumbersQuery – iterates over all values produced by the 
> query/filter store:1. Currently DocValuesNumbersQuery is the only query 
> implementation for DV-based fields, which means Lucene rewrites all term, 
> range and filter queries on non-indexed fields into this fallback 
> implementation.
>  * docValuesBooleanQuery – an optimized variant of DocValuesNumbersQuery 
> which supports only two values – 0/1.
> !results2.png!
> Query latency is similar to FixedBitSet, with a negligible overhead of 1-2 
> ms. DocValuesNumbersQuery is 6-7 times slower than the boolean query. The 
> raw doc values iterator is also not as fast, since it performs on-the-fly 
> decoding.
> The attached patch contains two parts:
>  * the bitwise codec and all required structures and producers/consumers
>  * a boolean query which removes the TwoPhaseIterator, the AllBits 
> approximation and the missing-docs lookup
> The docValues codec tests are green except the non-long-values cases.
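The storage idea in the description (one bit per document, read back from a long[] with constant-time random access and no decoding step) can be sketched like this. The class below is a simplified stand-in for org.apache.lucene.util.FixedBitSet written for illustration, not code from the patch:

```java
public class BitwiseDocValuesSketch {
    private final long[] words; // one bit per docId, 64 docs per long

    BitwiseDocValuesSketch(int maxDoc) {
        words = new long[(maxDoc + 63) >>> 6]; // round up to whole words
    }

    // Index time: record "true" (available) for a document.
    void set(int docId) {
        words[docId >>> 6] |= 1L << (docId & 63);
    }

    // Search time: constant-time random access, no on-the-fly decoding.
    long longValue(int docId) {
        return (words[docId >>> 6] >>> (docId & 63)) & 1L;
    }

    public static void main(String[] args) {
        BitwiseDocValuesSketch dv = new BitwiseDocValuesSketch(200);
        dv.set(0);
        dv.set(130);
        System.out.println(dv.longValue(0));   // 1
        System.out.println(dv.longValue(1));   // 0
        System.out.println(dv.longValue(130)); // 1
    }
}
```

This also makes the limitations concrete: only 0/1 values fit, and the backing array is fixed-size rather than a sparse structure like SparseFixedBitSet.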

[jira] [Updated] (LUCENE-8689) Boolean DocValues Codec Implementation

2019-05-19 Thread Dmitry Popov (JIRA)


 [ 
https://issues.apache.org/jira/browse/LUCENE-8689?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dmitry Popov updated LUCENE-8689:
-
Attachment: dense_vs_sparse_querying.png


[jira] [Comment Edited] (LUCENE-8689) Boolean DocValues Codec Implementation

2019-05-19 Thread Dmitry Popov (JIRA)


[ 
https://issues.apache.org/jira/browse/LUCENE-8689?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16843387#comment-16843387
 ] 

Dmitry Popov edited comment on LUCENE-8689 at 5/19/19 11:27 AM:


I've updated the patch to make it compatible with the current 9.0 version 
(master). We can see the following benchmark results:

!benchmark_dense.png|width=1118,height=647!

The synthetic DocValues benchmark has been updated as well ([link to github 
repo|https://github.com/dmnm/booleanDocValues-jmh-tests]).
 


was (Author: dmitry popov):
I've updated the patch to make it compatible with current 9.0 version (master). 
We can see the following benchmark results:

!benchmark_dense.png|width=1118,height=647!


[jira] [Commented] (LUCENE-8689) Boolean DocValues Codec Implementation

2019-05-19 Thread Dmitry Popov (JIRA)


[ 
https://issues.apache.org/jira/browse/LUCENE-8689?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16843396#comment-16843396
 ] 

Dmitry Popov commented on LUCENE-8689:
--

Updates: Boolean Codec vs Lucene80 (with the different Dense and Sparse 
compressions). Here, unlike querying, there is no significant difference:

!boolean_vs_dense_vs_sparse_updates.png|width=1093,height=582!


[jira] [Updated] (LUCENE-8689) Boolean DocValues Codec Implementation

2019-05-19 Thread Dmitry Popov (JIRA)


 [ 
https://issues.apache.org/jira/browse/LUCENE-8689?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dmitry Popov updated LUCENE-8689:
-
Attachment: boolean_vs_dense_vs_sparse_updates.png


[jira] [Commented] (LUCENE-8689) Boolean DocValues Codec Implementation

2019-05-19 Thread Dmitry Popov (JIRA)


[ 
https://issues.apache.org/jira/browse/LUCENE-8689?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16843395#comment-16843395
 ] 

Dmitry Popov commented on LUCENE-8689:
--

Indexing: Boolean Codec vs Lucene80 (with different Dense and Sparse 
compressions):

!boolean_vs_dense_vs_sparse_indexing.png|width=1137,height=606!

We can see that the Boolean codec is a bit faster (I guess due to its "Direct" 
nature).

 

> Boolean DocValues Codec Implementation
> --
>
> Key: LUCENE-8689
> URL: https://issues.apache.org/jira/browse/LUCENE-8689
> Project: Lucene - Core
>  Issue Type: Improvement
>  Components: core/codecs
>Reporter: Ivan Mamontov
>Priority: Minor
>  Labels: patch, performance
> Attachments: LUCENE-8689.patch, LUCENE-8689.patch, 
> SynteticDocValuesBench70.java, SynteticDocValuesBench80.java, 
> benchmark_dense.png, boolean_vs_dense_vs_sparse_indexing.png, results2.png
>
>
> To avoid issues where some products become available/unavailable at some 
> point in time after being out-of-stock, e-commerce search system designers 
> need to embed up-to-date information about inventory availability right into 
> the search engine. The key requirement is to be able to accurately filter out 
> unavailable products and to use availability as one of the ranking signals. 
> However, keeping availability data up-to-date is a non-trivial task. A 
> straightforward implementation based on partial updates of Lucene documents 
> causes Solr cache thrashing, which degrades query performance and resource 
> utilization.
>  As an alternative we can use DocValues and their built-in in-place updates, 
> where field values can be updated independently without touching the inverted 
> index; while filtering by DocValues is a bit slower, the overall performance 
> gain is better. However, the existing long-based DocValues are not 
> sufficiently optimized for carrying boolean inventory-availability data:
>  * All DocValues queries are internally rewritten into 
> org.apache.lucene.search.DocValuesNumbersQuery, which is based on direct 
> iteration over all column values and is typically much slower than a 
> TermsQuery.
>  * On every commit/merge the codec has to iterate over the DocValues a 
> couple of times in order to choose the best compression algorithm for the 
> given data. As a result, for 4K fields and 3M max doc a merge takes more 
> than 10 minutes.
> This issue is intended to remove these limitations via a special bitwise doc 
> values format that uses the internal representation of 
> org.apache.lucene.util.FixedBitSet to store indexed values and load them at 
> search time as a plain long array without additional decoding. There are 
> several benefits:
>  * At index time encoding is very fast, with no superfluous iterations over 
> all values to choose the best compression algorithm for the given data.
>  * At query time decoding is equally simple and fast, with no GC pressure or 
> extra steps.
>  * The internal representation allows random access in constant time.
> Limitations are:
>  * Non-boolean fields are not supported.
>  * Boolean fields must be represented as long values: 1 for true and 0 for 
> false.
>  * The current implementation does not support advanced bit set formats like 
> org.apache.lucene.util.SparseFixedBitSet or 
> org.apache.lucene.util.RoaringDocIdSet.
> In order to evaluate the performance gain I've written a simple JMH-based 
> benchmark [^SynteticDocValuesBench70.java] which estimates the relative cost 
> of DocValues filters. The benchmark creates 2 000 000 documents with 5 
> boolean columns of different density, where 10, 35, 50, 60 and 90 are the 
> percentages of documents with value 1. Each method enumerates all values of 
> the synthetic store field in one of the available ways:
>  * baseline – in almost all cases Solr uses a FixedBitSet in the filter 
> cache to keep store availability. This test just iterates over all bits.
>  * docValuesRaw – iterates over all values of the DV column; the same code 
> is used in "post filtering", sorting and faceting.
>  * docValuesNumbersQuery – iterates over all values produced by the 
> query/filter store:1. There is currently only one query implementation for 
> DV-based fields, DocValuesNumbersQuery, which means Lucene rewrites all 
> term, range and filter queries on non-indexed fields into this fallback 
> implementation.
>  * docValuesBooleanQuery – an optimized variant of DocValuesNumbersQuery 
> which supports only two values – 0/1.
> !results2.png!
> Query latency is similar to FixedBitSet, with a negligible overhead of 1-2 
> ms. DocValuesNumbersQuery is 6-7 times slower than the boolean query. The 
> raw doc values iterator is also not as fast, since it performs on-the-fly 
> decoding.
> The attached patch contains:
>  * the bitwise codec and all required structures and producers/consumers
>  * a boolean query which removes the TwoPhaseIterator, the AllBits 
> approximation and the missing-docs lookup
>  * DocValues codec tests are green except for the non-long value cases
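For readers skimming the thread, the core encoding idea is small enough to sketch. The class below is an illustrative plain-Java model of the FixedBitSet-style bitwise storage described above, not the patch's actual classes: one bit per document, so encoding is a single bit write (no compression-selection pass) and lookup is a constant-time bit test (no per-value decoding).

```java
// Illustrative sketch (names are hypothetical, not from the patch): boolean
// doc values stored as the raw long[] words of a FixedBitSet-like structure.
class BitwiseDocValuesSketch {
  private final long[] words;  // bit i holds the value of document i

  BitwiseDocValuesSketch(int maxDoc) {
    words = new long[(maxDoc + 63) >>> 6];  // 64 documents per long word
  }

  // index time: a single bit write, no compression pass needed
  void set(int docId, boolean value) {
    if (value) {
      words[docId >>> 6] |= 1L << docId;    // Java masks the shift count to 0..63
    } else {
      words[docId >>> 6] &= ~(1L << docId);
    }
  }

  // query time: random access in constant time, no decoding
  boolean get(int docId) {
    return (words[docId >>> 6] & (1L << docId)) != 0;
  }

  public static void main(String[] args) {
    BitwiseDocValuesSketch dv = new BitwiseDocValuesSketch(3_000_000);
    dv.set(42, true);
    dv.set(1_999_999, true);
    System.out.println(dv.get(42) + " " + dv.get(43));  // true false
  }
}
```

The whole column can then be handed to a filter as the backing long[] array, which is exactly what the filter-cache baseline in the benchmark iterates over.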

[jira] [Updated] (LUCENE-8689) Boolean DocValues Codec Implementation

2019-05-19 Thread Dmitry Popov (JIRA)


 [ 
https://issues.apache.org/jira/browse/LUCENE-8689?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dmitry Popov updated LUCENE-8689:
-
Attachment: boolean_vs_dense_vs_sparse_indexing.png




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

[jira] [Commented] (LUCENE-8793) Enhanced UI for CustomAnalyzer : show analysis steps

2019-05-19 Thread Tomoko Uchida (JIRA)


[ 
https://issues.apache.org/jira/browse/LUCENE-8793?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16843392#comment-16843392
 ] 

Tomoko Uchida commented on LUCENE-8793:
---

Hi Ohtani-san,

thanks for updating the patch; I looked through the code. I'd like to do some 
small layout adjustments and refactoring (e.g. move the table model classes 
from the Operator interfaces to the concrete Provider classes) on my local 
branch before committing it to master, but as a whole it looks fine to me.

[~thetaphi]: Would you please look over the newly added method 
{{o.a.l.luke.models.analysis.AnalysisImpl#analyzeStepByStep(String)}}? There 
are some tricks partially copied from Solr's 
{{o.a.s.handler.AnalysisRequestHandlerBase}} to debug token streams, and I'm 
not confident about pushing the changes as is.
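As background for reviewers, the step-by-step idea itself is easy to illustrate. The sketch below is deliberately plain Java, not Lucene's actual TokenStream API or the patch's code: it runs a token list through each named stage of a toy analysis chain and snapshots the intermediate output after every stage, the way Solr's Analysis screen shows it.

```java
import java.util.Arrays;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.function.Function;
import java.util.stream.Collectors;

// Toy model of "show analysis steps": apply each stage of an analyzer chain
// in order and record the token list after every stage.
class StepByStepAnalysisSketch {
  static Map<String, List<String>> analyzeStepByStep(
      String text, LinkedHashMap<String, Function<List<String>, List<String>>> stages) {
    Map<String, List<String>> steps = new LinkedHashMap<>();
    List<String> tokens = Arrays.asList(text.split("\\s+"));  // toy whitespace tokenizer
    steps.put("tokenizer", tokens);
    for (Map.Entry<String, Function<List<String>, List<String>>> stage : stages.entrySet()) {
      tokens = stage.getValue().apply(tokens);   // run this stage
      steps.put(stage.getKey(), tokens);         // snapshot its output
    }
    return steps;
  }

  public static void main(String[] args) {
    LinkedHashMap<String, Function<List<String>, List<String>>> chain = new LinkedHashMap<>();
    chain.put("lowercase",
        ts -> ts.stream().map(String::toLowerCase).collect(Collectors.toList()));
    chain.put("stopfilter",
        ts -> ts.stream().filter(t -> !t.equals("the")).collect(Collectors.toList()));
    // prints the token list captured after each stage of the chain
    System.out.println(analyzeStepByStep("Debug THE Token Stream", chain));
  }
}
```

The real implementation has to reuse the same attribute-capturing tricks as Solr's AnalysisRequestHandlerBase because Lucene token streams are consumed incrementally rather than materialized as lists.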

> Enhanced UI for CustomAnalyzer : show analysis steps
> 
>
> Key: LUCENE-8793
> URL: https://issues.apache.org/jira/browse/LUCENE-8793
> Project: Lucene - Core
>  Issue Type: Improvement
>  Components: modules/luke
>Reporter: Jun Ohtani
>Assignee: Tomoko Uchida
>Priority: Minor
> Attachments: LUCENE-8793-2.patch, LUCENE-8793.patch, 
> LUCENE-8793.patch, Screen Shot 2019-05-06 at 10.00.57.png, Screen Shot 
> 2019-05-07 at 1.40.47.png, Screenshot from 2019-05-06 13-45-40.png, 
> Screenshot from 2019-05-06 13-46-16.png
>
>
> This is a migrated issue from previous Luke project in GitHub: 
> [https://github.com/DmitryKey/luke/issues/134]
>  
> For on-the-fly inspection / debugging, it is desirable to show more 
> detailed, step-by-step information in the Custom Analyzer UI.
> This will be just like Solr's Analysis screen,
> [https://lucene.apache.org/solr/guide/7_5/analysis-screen.html]
> or Elasticsearch's {{_analyze}} API and Kibana's Analyzer UI.
> [https://www.elastic.co/guide/en/elasticsearch/reference/current/indices-analyze.html]
> [https://github.com/johtani/analyze-api-ui-plugin]




-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (LUCENE-8804) FieldType attribute map should not be modifiable after freeze

2019-05-19 Thread Lucene/Solr QA (JIRA)


[ 
https://issues.apache.org/jira/browse/LUCENE-8804?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16843390#comment-16843390
 ] 

Lucene/Solr QA commented on LUCENE-8804:


| (/) *{color:green}+1 overall{color}* |
\\
\\
|| Vote || Subsystem || Runtime || Comment ||
|| || || || {color:brown} Prechecks {color} ||
| {color:green}+1{color} | {color:green} test4tests {color} | {color:green}  0m 
 0s{color} | {color:green} The patch appears to include 1 new or modified test 
files. {color} |
|| || || || {color:brown} master Compile Tests {color} ||
| {color:green}+1{color} | {color:green} compile {color} | {color:green}  0m 
46s{color} | {color:green} master passed {color} |
|| || || || {color:brown} Patch Compile Tests {color} ||
| {color:green}+1{color} | {color:green} compile {color} | {color:green}  0m 
40s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} javac {color} | {color:green}  0m 
41s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} Release audit (RAT) {color} | 
{color:green}  0m 40s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} Check forbidden APIs {color} | 
{color:green}  0m 40s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} Validate source patterns {color} | 
{color:green}  0m 40s{color} | {color:green} the patch passed {color} |
|| || || || {color:brown} Other Tests {color} ||
| {color:green}+1{color} | {color:green} unit {color} | {color:green} 17m 
24s{color} | {color:green} core in the patch passed. {color} |
| {color:black}{color} | {color:black} {color} | {color:black} 21m 29s{color} | 
{color:black} {color} |
\\
\\
|| Subsystem || Report/Notes ||
| JIRA Issue | LUCENE-8804 |
| JIRA Patch URL | 
https://issues.apache.org/jira/secure/attachment/12969070/LUCENE-8804.patch |
| Optional Tests |  compile  javac  unit  ratsources  checkforbiddenapis  
validatesourcepatterns  |
| uname | Linux lucene1-us-west 4.4.0-137-generic #163~14.04.1-Ubuntu SMP Mon 
Sep 24 17:14:57 UTC 2018 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | ant |
| Personality | 
/home/jenkins/jenkins-slave/workspace/PreCommit-LUCENE-Build/sourcedir/dev-tools/test-patch/lucene-solr-yetus-personality.sh
 |
| git revision | master / 18cb42e |
| ant | version: Apache Ant(TM) version 1.9.3 compiled on July 24 2018 |
| Default Java | LTS |
|  Test Results | 
https://builds.apache.org/job/PreCommit-LUCENE-Build/186/testReport/ |
| modules | C: lucene lucene/core U: lucene |
| Console output | 
https://builds.apache.org/job/PreCommit-LUCENE-Build/186/console |
| Powered by | Apache Yetus 0.7.0   http://yetus.apache.org |


This message was automatically generated.



> FieldType attribute map should not be modifiable after freeze
> -
>
> Key: LUCENE-8804
> URL: https://issues.apache.org/jira/browse/LUCENE-8804
> Project: Lucene - Core
>  Issue Type: Bug
>  Components: core/index
>Affects Versions: 8.0
>Reporter: Vamshi Vijay Nakkirtha
>Priority: Minor
>  Labels: features, patch
> Attachments: LUCENE-8804.patch
>
>
> Today the FieldType attribute map can be modified even after freeze. For all 
> other properties of FieldType we call "checkIfFrozen()" before updating the 
> property, but for the attribute map we do not seem to make such a check. 
>  
> [https://github.com/apache/lucene-solr/blob/releases/lucene-solr/8.0.0/lucene/core/src/java/org/apache/lucene/document/FieldType.java#L363]
> We may need to add the check at the beginning of that method, similar to the 
> other property setters.
>  
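A minimal sketch of the proposed guard (simplified illustrative code, not the attached patch; `checkIfFrozen` and `putAttribute` are FieldType's method names, everything else is a stand-in):

```java
import java.util.HashMap;
import java.util.Map;

// Simplified model of the fix: guard the attribute map behind the same
// checkIfFrozen() call that protects FieldType's other setters.
class FieldTypeSketch {
  private boolean frozen;
  private Map<String, String> attributes;

  void freeze() {
    frozen = true;  // after this point the type must be immutable
  }

  private void checkIfFrozen() {
    if (frozen) {
      throw new IllegalStateException("this FieldType is already frozen and cannot be changed");
    }
  }

  /** Returns the previous value of the attribute, like Map#put. */
  String putAttribute(String key, String value) {
    checkIfFrozen();  // the missing check the issue proposes to add
    if (attributes == null) {
      attributes = new HashMap<>();
    }
    return attributes.put(key, value);
  }

  public static void main(String[] args) {
    FieldTypeSketch ft = new FieldTypeSketch();
    ft.putAttribute("ngram", "true");  // fine before freeze
    ft.freeze();
    try {
      ft.putAttribute("ngram", "false");  // now rejected
    } catch (IllegalStateException e) {
      System.out.println("rejected after freeze: " + e.getMessage());
    }
  }
}
```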






[jira] [Commented] (LUCENE-8689) Boolean DocValues Codec Implementation

2019-05-19 Thread Dmitry Popov (JIRA)


[ 
https://issues.apache.org/jira/browse/LUCENE-8689?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16843387#comment-16843387
 ] 

Dmitry Popov commented on LUCENE-8689:
--

I've updated the patch to make it compatible with the current 9.0 version (master). 
We can see the following benchmark results:

!benchmark_dense.png|width=1118,height=647!


[JENKINS] Lucene-Solr-repro - Build # 3280 - Unstable

2019-05-19 Thread Apache Jenkins Server
Build: https://builds.apache.org/job/Lucene-Solr-repro/3280/

[...truncated 29 lines...]
[repro] Jenkins log URL: 
https://builds.apache.org/job/Lucene-Solr-BadApples-NightlyTests-8.x/17/consoleText

[repro] Revision: 889cc4fc5c99a94cbb1e23107d8a763963b4dc0b

[repro] Ant options: -Dtests.multiplier=2 
-Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-BadApples-NightlyTests-8.x/test-data/enwiki.random.lines.txt
[repro] Repro line:  ant test  -Dtestcase=HdfsAutoAddReplicasIntegrationTest 
-Dtests.method=testSimple -Dtests.seed=7B0B922560875242 -Dtests.multiplier=2 
-Dtests.nightly=true -Dtests.slow=true -Dtests.badapples=true 
-Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-BadApples-NightlyTests-8.x/test-data/enwiki.random.lines.txt
 -Dtests.locale=bg -Dtests.timezone=America/Bahia -Dtests.asserts=true 
-Dtests.file.encoding=US-ASCII

[repro] Repro line:  ant test  -Dtestcase=RollingRestartTest 
-Dtests.method=test -Dtests.seed=7B0B922560875242 -Dtests.multiplier=2 
-Dtests.nightly=true -Dtests.slow=true -Dtests.badapples=true 
-Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-BadApples-NightlyTests-8.x/test-data/enwiki.random.lines.txt
 -Dtests.locale=sr-CS -Dtests.timezone=Brazil/West -Dtests.asserts=true 
-Dtests.file.encoding=US-ASCII

[repro] git rev-parse --abbrev-ref HEAD
[repro] git rev-parse HEAD
[repro] Initial local git branch/revision: 
18cb42ee80854e2159201fe550b13d894425a4f8
[repro] git fetch
[repro] git checkout 889cc4fc5c99a94cbb1e23107d8a763963b4dc0b

[...truncated 2 lines...]
[repro] git merge --ff-only

[...truncated 1 lines...]
[repro] ant clean

[...truncated 6 lines...]
[repro] Test suites by module:
[repro]solr/core
[repro]   RollingRestartTest
[repro]   HdfsAutoAddReplicasIntegrationTest
[repro] ant compile-test

[...truncated 3576 lines...]
[repro] ant test-nocompile -Dtests.dups=5 -Dtests.maxfailures=10 
-Dtests.class="*.RollingRestartTest|*.HdfsAutoAddReplicasIntegrationTest" 
-Dtests.showOutput=onerror -Dtests.multiplier=2 
-Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-BadApples-NightlyTests-8.x/test-data/enwiki.random.lines.txt
 -Dtests.seed=7B0B922560875242 -Dtests.multiplier=2 -Dtests.nightly=true 
-Dtests.slow=true -Dtests.badapples=true 
-Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-BadApples-NightlyTests-8.x/test-data/enwiki.random.lines.txt
 -Dtests.locale=sr-CS -Dtests.timezone=Brazil/West -Dtests.asserts=true 
-Dtests.file.encoding=US-ASCII

[...truncated 5374 lines...]
[repro] Setting last failure code to 256

[repro] Failures:
[repro]   0/5 failed: org.apache.solr.cloud.RollingRestartTest
[repro]   2/5 failed: 
org.apache.solr.cloud.autoscaling.HdfsAutoAddReplicasIntegrationTest
[repro] git checkout 18cb42ee80854e2159201fe550b13d894425a4f8

[...truncated 2 lines...]
[repro] Exiting with code 256

[...truncated 6 lines...]


[jira] [Updated] (LUCENE-8689) Boolean DocValues Codec Implementation

2019-05-19 Thread Dmitry Popov (JIRA)


 [ 
https://issues.apache.org/jira/browse/LUCENE-8689?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dmitry Popov updated LUCENE-8689:
-
Attachment: SynteticDocValuesBench80.java
LUCENE-8689.patch
benchmark_dense.png


[jira] [Assigned] (LUCENE-8793) Enhanced UI for CustomAnalyzer : show analysis steps

2019-05-19 Thread Tomoko Uchida (JIRA)


 [ 
https://issues.apache.org/jira/browse/LUCENE-8793?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Tomoko Uchida reassigned LUCENE-8793:
-

Assignee: Tomoko Uchida







[JENKINS] Lucene-Solr-BadApples-Tests-8.x - Build # 105 - Still Failing

2019-05-19 Thread Apache Jenkins Server
Build: https://builds.apache.org/job/Lucene-Solr-BadApples-Tests-8.x/105/

1 tests failed.
FAILED:  
org.apache.solr.client.solrj.io.stream.MathExpressionTest.testGammaDistribution

Error Message:
0.8270447626785014 0.8286288686448905

Stack Trace:
java.lang.AssertionError: 0.8270447626785014 0.8286288686448905
at 
__randomizedtesting.SeedInfo.seed([E3F8F49E99795575:DE82DF30BA01FF62]:0)
at org.junit.Assert.fail(Assert.java:88)
at org.junit.Assert.assertTrue(Assert.java:41)
at 
org.apache.solr.client.solrj.io.stream.MathExpressionTest.testGammaDistribution(MathExpressionTest.java:4590)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1750)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:938)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:974)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:988)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:947)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:832)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:883)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:894)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at java.lang.Thread.run(Thread.java:748)
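The assertion above compares two doubles that differ by roughly 1.6e-3, a typical symptom of a statistical test whose tolerance is tighter than the sampling noise. A minimal tolerance check over the two values from the log (the tolerances here are illustrative, not the test's actual ones):

```python
import math

observed = 0.8270447626785014
expected = 0.8286288686448905

print(abs(observed - expected))                        # ~0.00158
print(math.isclose(observed, expected, abs_tol=1e-2))  # True
print(math.isclose(observed, expected, abs_tol=1e-3))  # False
```

A gap this small between a sampled statistic and its expected value will pass or fail depending on the random seed unless the tolerance comfortably exceeds the sampling variance.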




Build Log:
[...truncated 16653 lines...]
   [junit4] Suite: org.apache.solr.client.solrj.io.stream.MathExpressionTest
   [junit4]   2> Creating dataDir: 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-BadApples-Tests-8.x/solr/build/solr-solrj/test/J0/temp/solr.client.solrj.io.stream.MathExpress

[JENKINS] Lucene-Solr-Tests-8.x - Build # 202 - Still Failing

2019-05-19 Thread Apache Jenkins Server
Build: https://builds.apache.org/job/Lucene-Solr-Tests-8.x/202/

All tests passed

Build Log:
[...truncated 65810 lines...]
[asciidoctor:convert] asciidoctor: ERROR: about-this-guide.adoc: line 1: 
invalid part, must have at least one section (e.g., chapter, appendix, etc.)
[asciidoctor:convert] asciidoctor: ERROR: solr-glossary.adoc: line 1: invalid 
part, must have at least one section (e.g., chapter, appendix, etc.)
 [java] Processed 2531 links (2070 relative) to 3359 anchors in 253 files
 [echo] Validated Links & Anchors via: 
/home/jenkins/jenkins-slave/workspace/Lucene-Solr-Tests-8.x/solr/build/solr-ref-guide/bare-bones-html/

-documentation-lint:
[jtidy] Checking for broken html (such as invalid tags)...
   [delete] Deleting directory 
/home/jenkins/jenkins-slave/workspace/Lucene-Solr-Tests-8.x/lucene/build/jtidy_tmp
 [echo] Checking for broken links...
 [exec] 
 [exec] Crawl/parse...
 [exec] 
 [exec] Verify...
 [echo] Checking for malformed docs...
 [exec] 
 [exec] 
/home/jenkins/jenkins-slave/workspace/Lucene-Solr-Tests-8.x/solr/build/docs/solr-solrj/overview-summary.html
 [exec]   missing description: org.noggit
 [exec] 
 [exec] Missing javadocs were found!

BUILD FAILED
/home/jenkins/jenkins-slave/workspace/Lucene-Solr-Tests-8.x/build.xml:634: The 
following error occurred while executing this line:
/home/jenkins/jenkins-slave/workspace/Lucene-Solr-Tests-8.x/build.xml:101: The 
following error occurred while executing this line:
/home/jenkins/jenkins-slave/workspace/Lucene-Solr-Tests-8.x/solr/build.xml:660: 
The following error occurred while executing this line:
/home/jenkins/jenkins-slave/workspace/Lucene-Solr-Tests-8.x/solr/build.xml:676: 
The following error occurred while executing this line:
/home/jenkins/jenkins-slave/workspace/Lucene-Solr-Tests-8.x/lucene/common-build.xml:2530:
 exec returned: 1

Total time: 228 minutes 52 seconds
Build step 'Invoke Ant' marked build as failure
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any


[JENKINS] Lucene-Solr-NightlyTests-master - Build # 1851 - Still Unstable

2019-05-19 Thread Apache Jenkins Server
Build: https://builds.apache.org/job/Lucene-Solr-NightlyTests-master/1851/

1 tests failed.
FAILED:  
org.apache.solr.cloud.autoscaling.HdfsAutoAddReplicasIntegrationTest.testSimple

Error Message:
Waiting for collection testSimple2
Timeout waiting to see state for collection=testSimple2:
DocCollection(testSimple2//collections/testSimple2/state.json/23)={
  "pullReplicas":"0",
  "replicationFactor":"2",
  "shards":{
    "shard1":{
      "range":"8000-",
      "state":"active",
      "replicas":{
        "core_node3":{
          "dataDir":"hdfs://localhost:39447/solr_hdfs_home/testSimple2/core_node3/data/",
          "base_url":"http://127.0.0.1:34335/solr",
          "node_name":"127.0.0.1:34335_solr",
          "type":"NRT",
          "force_set_state":"false",
          "ulogDir":"hdfs://localhost:39447/solr_hdfs_home/testSimple2/core_node3/data/tlog",
          "core":"testSimple2_shard1_replica_n1",
          "shared_storage":"true",
          "state":"down"},
        "core_node5":{
          "dataDir":"hdfs://localhost:39447/solr_hdfs_home/testSimple2/core_node5/data/",
          "base_url":"http://127.0.0.1:33057/solr",
          "node_name":"127.0.0.1:33057_solr",
          "type":"NRT",
          "force_set_state":"false",
          "ulogDir":"hdfs://localhost:39447/solr_hdfs_home/testSimple2/core_node5/data/tlog",
          "core":"testSimple2_shard1_replica_n2",
          "shared_storage":"true",
          "state":"active",
          "leader":"true"}}},
    "shard2":{
      "range":"0-7fff",
      "state":"active",
      "replicas":{
        "core_node7":{
          "dataDir":"hdfs://localhost:39447/solr_hdfs_home/testSimple2/core_node7/data/",
          "base_url":"http://127.0.0.1:34335/solr",
          "node_name":"127.0.0.1:34335_solr",
          "type":"NRT",
          "force_set_state":"false",
          "ulogDir":"hdfs://localhost:39447/solr_hdfs_home/testSimple2/core_node7/data/tlog",
          "core":"testSimple2_shard2_replica_n4",
          "shared_storage":"true",
          "state":"down"},
        "core_node8":{
          "dataDir":"hdfs://localhost:39447/solr_hdfs_home/testSimple2/core_node8/data/",
          "base_url":"http://127.0.0.1:33057/solr",
          "node_name":"127.0.0.1:33057_solr",
          "type":"NRT",
          "force_set_state":"false",
          "ulogDir":"hdfs://localhost:39447/solr_hdfs_home/testSimple2/core_node8/data/tlog",
          "core":"testSimple2_shard2_replica_n6",
          "shared_storage":"true",
          "state":"active",
          "leader":"true"}}}},
  "router":{"name":"compositeId"},
  "maxShardsPerNode":"2",
  "autoAddReplicas":"true",
  "nrtReplicas":"2",
  "tlogReplicas":"0"}
Live Nodes: [127.0.0.1:33057_solr, 127.0.0.1:40740_solr]
Last available state:
DocCollection(testSimple2//collections/testSimple2/state.json/23)={
  "pullReplicas":"0",
  "replicationFactor":"2",
  "shards":{
    "shard1":{
      "range":"8000-",
      "state":"active",
      "replicas":{
        "core_node3":{
          "dataDir":"hdfs://localhost:39447/solr_hdfs_home/testSimple2/core_node3/data/",
          "base_url":"http://127.0.0.1:34335/solr",
          "node_name":"127.0.0.1:34335_solr",
          "type":"NRT",
          "force_set_state":"false",
          "ulogDir":"hdfs://localhost:39447/solr_hdfs_home/testSimple2/core_node3/data/tlog",
          "core":"testSimple2_shard1_replica_n1",
          "shared_storage":"true",
          "state":"down"},
        "core_node5":{
          "dataDir":"hdfs://localhost:39447/solr_hdfs_home/testSimple2/core_node5/data/",
          "base_url":"http://127.0.0.1:33057/solr",
          "node_name":"127.0.0.1:33057_solr",
          "type":"NRT",
          "force_set_state":"false",
          "ulogDir":"hdfs://localhost:39447/solr_hdfs_home/testSimple2/core_node5/data/tlog",
          "core":"testSimple2_shard1_replica_n2",
          "shared_storage":"true",
          "state":"active",
          "leader":"true"}}},
    "shard2":{
      "range":"0-7fff",
      "state":"active",
      "replicas":{
        "core_node7":{
          "dataDir":"hdfs://localhost:39447/solr_hdfs_home/testSimple2/core_node7/data/",
          "base_url":"http://127.0.0.1:34335/solr",
          "node_name":"127.0.0.1:34335_solr",
          "type":"NRT",
          "force_set_state":"false",
          "ulogDir":"hdfs://localhost:39447/solr_hdfs_home/testSimple2/core_node7/data/tlog",
          "core":"testSimple2_shard2_replica_n4",
          "shared_storage":"true",
          "state":"down"},
        "core_node8":{
          "dataDir":"hdfs://localhost:39447/solr_hdfs_home/testSimple2/core_node8/data/",
          "base_url":"http://127.0.0.1:33057/solr",
          "node_name":"127.0.0.1:33057_solr",
          "type":"NRT",
          "force_set_state":"false",
          "ulogDir":"hdfs://localhost:39447/solr_hdfs_home/testSimple2/core_node8/dat
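Reading the state dump above: both replicas marked "down" (core_node3 and core_node7) live on node 127.0.0.1:34335_solr, which does not appear in the live-nodes list ([127.0.0.1:33057_solr, 127.0.0.1:40740_solr]). A sketch that extracts this from a miniature of the state (structure copied from the log, replica properties abbreviated):

```python
# Abbreviated copy of the DocCollection state from the failure message.
state = {
    "shards": {
        "shard1": {"replicas": {
            "core_node3": {"state": "down", "node_name": "127.0.0.1:34335_solr"},
            "core_node5": {"state": "active", "node_name": "127.0.0.1:33057_solr"}}},
        "shard2": {"replicas": {
            "core_node7": {"state": "down", "node_name": "127.0.0.1:34335_solr"},
            "core_node8": {"state": "active", "node_name": "127.0.0.1:33057_solr"}}}}}
live_nodes = {"127.0.0.1:33057_solr", "127.0.0.1:40740_solr"}

# Replicas that are not active, and whether their host node is live.
down = [(shard, name, rep["node_name"] in live_nodes)
        for shard, sh in state["shards"].items()
        for name, rep in sh["replicas"].items()
        if rep["state"] != "active"]
print(down)  # [('shard1', 'core_node3', False), ('shard2', 'core_node7', False)]
```

Since the down replicas' node is gone from live nodes, the collection can never reach the awaited state within the timeout, which is consistent with the autoAddReplicas scenario the test exercises.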

[JENKINS] Lucene-Solr-8.x-Linux (64bit/jdk-11.0.2) - Build # 585 - Unstable!

2019-05-19 Thread Policeman Jenkins Server
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-8.x-Linux/585/
Java: 64bit/jdk-11.0.2 -XX:+UseCompressedOops -XX:+UseSerialGC

1 tests failed.
FAILED:  org.apache.solr.cloud.HttpPartitionWithTlogReplicasTest.test

Error Message:
Error from server at http://127.0.0.1:41659: Underlying core creation failed 
while creating collection: c8n_1x2

Stack Trace:
org.apache.solr.client.solrj.impl.HttpSolrClient$RemoteSolrException: Error 
from server at http://127.0.0.1:41659: Underlying core creation failed while 
creating collection: c8n_1x2
at 
__randomizedtesting.SeedInfo.seed([DA151BD6D05D6707:5241240C7EA10AFF]:0)
at 
org.apache.solr.client.solrj.impl.HttpSolrClient.executeMethod(HttpSolrClient.java:649)
at 
org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:255)
at 
org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:244)
at org.apache.solr.client.solrj.SolrClient.request(SolrClient.java:1274)
at 
org.apache.solr.cloud.AbstractFullDistribZkTestBase.createCollection(AbstractFullDistribZkTestBase.java:1792)
at 
org.apache.solr.cloud.AbstractFullDistribZkTestBase.createCollection(AbstractFullDistribZkTestBase.java:1813)
at 
org.apache.solr.cloud.AbstractFullDistribZkTestBase.createCollection(AbstractFullDistribZkTestBase.java:1730)
at 
org.apache.solr.cloud.AbstractFullDistribZkTestBase.createCollectionRetry(AbstractFullDistribZkTestBase.java:2042)
at 
org.apache.solr.cloud.HttpPartitionTest.testRf2(HttpPartitionTest.java:214)
at 
org.apache.solr.cloud.HttpPartitionTest.test(HttpPartitionTest.java:135)
at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1750)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:938)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:974)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:988)
at 
org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsFixedStatement.callStatement(BaseDistributedSearchTestCase.java:1082)
at 
org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsStatement.evaluate(BaseDistributedSearchTestCase.java:1054)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:947)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:832)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:883)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:894)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.N

[jira] [Created] (SOLR-13480) Collection creation failure when using Kerberos authentication combined with rule-based authorization

2019-05-19 Thread mosh (JIRA)
mosh created SOLR-13480:
---

 Summary: Collection creation failure when using Kerberos 
authentication combined with rule-based authorization
 Key: SOLR-13480
 URL: https://issues.apache.org/jira/browse/SOLR-13480
 Project: Solr
  Issue Type: Bug
  Security Level: Public (Default Security Level. Issues are Public)
  Components: Authorization, security
Affects Versions: 7.7.1
Reporter: mosh


Creating a collection as an authorized user fails with the following error:
{code:java}
org.apache.solr.common.SolrException: Error getting replica locations : unable 
to get autoscaling policy session{code}
At first this may seem like a duplicate of SOLR-13355, since we are using the 
“all” permission, but the bug is specific to Kerberos (we tested basic auth 
and found it to work), and we verified the failure with a 7.7.2 snapshot that 
included the relevant patch.

+How to reproduce:+
1. Configure SolrCloud with the Kerberos authentication and rule-based 
authorization plugins, using the following security.json file:
{code:java}
{
"authentication":{
   "class":"org.apache.solr.security.KerberosPlugin"
},
"authorization":{
   "class":"solr.RuleBasedAuthorizationPlugin",
   "permissions":[
 {
   "name":"read",
   "role":"*"
 },
 {
   "name":"all",
   "role":"admin_user"
 }
   ],
   "user-role":{
 "admin_user@OUR_REALM":"admin_user"
   }
}}{code}
2. Create collection using an authorized user:
{code:java}
kinit admin_user@OUR_REALM

curl --negotiate -u : 
"http:///solr/admin/collections?action=CREATE&name=mycoll&numShards=1&collection.configName=_default"{code}
{color:#d04437}==> request fails with the error written above.{color}

3. Disable authorization by removing the _authorization_ section from 
security.json, so that the file looks as follows:
{code:java}
{
  "authentication":{
    "class":"org.apache.solr.security.KerberosPlugin"
  }
}{code}
4. Create collection again as in step 2.
{color:#14892c}==> request succeeds.{color}

5. Restore the authorization section to security.json (the file from step 1) 
and verify that authorization works as expected by inserting documents and 
executing search queries as different users.
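The repro toggles exactly one key in security.json between steps 1 and 3. A quick sanity check that the two variants differ only in the authorization section (plain dict comparison over the JSON shown above):

```python
import json

# security.json from step 1 (Kerberos auth + rule-based authorization).
step1 = json.loads('''{
  "authentication": {"class": "org.apache.solr.security.KerberosPlugin"},
  "authorization": {
    "class": "solr.RuleBasedAuthorizationPlugin",
    "permissions": [
      {"name": "read", "role": "*"},
      {"name": "all", "role": "admin_user"}
    ],
    "user-role": {"admin_user@OUR_REALM": "admin_user"}
  }
}''')

# security.json from step 3 (authorization section removed).
step3 = json.loads('''{
  "authentication": {"class": "org.apache.solr.security.KerberosPlugin"}
}''')

changed = {k for k in step1.keys() | step3.keys()
           if step1.get(k) != step3.get(k)}
print(changed)  # {'authorization'}
```

Since only the authorization section changes between the failing and succeeding runs, the failure is isolated to the RuleBasedAuthorizationPlugin interacting with Kerberos, as the issue describes.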


