[JENKINS] Lucene-Solr-repro - Build # 433 - Unstable

2018-04-04 Thread Apache Jenkins Server
Build: https://builds.apache.org/job/Lucene-Solr-repro/433/

[...truncated 28 lines...]
[repro] Jenkins log URL: 
https://builds.apache.org/job/Lucene-Solr-SmokeRelease-7.x/191/consoleText

[repro] Revision: 02fa5a0b059353ae5607e91211cea24974257980

[repro] Ant options: -DsmokeTestRelease.java9=/home/jenkins/tools/java/latest1.9
[repro] Repro line:  ant test  -Dtestcase=NodeAddedTriggerTest 
-Dtests.method=testRestoreState -Dtests.seed=5DB514D5480117D4 
-Dtests.multiplier=2 -Dtests.locale=ru -Dtests.timezone=Etc/GMT+6 
-Dtests.asserts=true -Dtests.file.encoding=UTF-8

[repro] Repro line:  ant test  -Dtestcase=OverseerRolesTest 
-Dtests.seed=5DB514D5480117D4 -Dtests.multiplier=2 -Dtests.locale=ru 
-Dtests.timezone=America/Indiana/Petersburg -Dtests.asserts=true 
-Dtests.file.encoding=UTF-8

[repro] git rev-parse --abbrev-ref HEAD
[repro] git rev-parse HEAD
[repro] Initial local git branch/revision: 
60ae7be40786d6f8a5c5c8393875bf986d2b8877
[repro] git fetch
[repro] git checkout 02fa5a0b059353ae5607e91211cea24974257980

[...truncated 2 lines...]
[repro] git merge --ff-only

[...truncated 1 lines...]
[repro] ant clean

[...truncated 6 lines...]
[repro] Test suites by module:
[repro]   solr/core
[repro]   NodeAddedTriggerTest
[repro]   OverseerRolesTest
[repro] ant compile-test

[...truncated 3315 lines...]
[repro] ant test-nocompile -Dtests.dups=5 -Dtests.maxfailures=10 
-Dtests.class="*.NodeAddedTriggerTest|*.OverseerRolesTest" 
-Dtests.showOutput=onerror 
-DsmokeTestRelease.java9=/home/jenkins/tools/java/latest1.9 
-Dtests.seed=5DB514D5480117D4 -Dtests.multiplier=2 -Dtests.locale=ru 
-Dtests.timezone=Etc/GMT+6 -Dtests.asserts=true -Dtests.file.encoding=UTF-8

[...truncated 1806 lines...]
[repro] Setting last failure code to 256

[repro] Failures:
[repro]   0/5 failed: org.apache.solr.cloud.OverseerRolesTest
[repro]   4/5 failed: org.apache.solr.cloud.autoscaling.NodeAddedTriggerTest
[repro] git checkout 60ae7be40786d6f8a5c5c8393875bf986d2b8877

[...truncated 2 lines...]
[repro] Exiting with code 256

[...truncated 5 lines...]

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org

[JENKINS] Lucene-Solr-BadApples-Tests-7.x - Build # 32 - Still unstable

2018-04-04 Thread Apache Jenkins Server
Build: https://builds.apache.org/job/Lucene-Solr-BadApples-Tests-7.x/32/

7 tests failed.
FAILED:  org.apache.solr.cloud.CreateRoutedAliasTest.testTimeStampWithMsFails

Error Message:
Error from server at http://127.0.0.1:41009/solr: Collection : 
testV1_2018-04-05_01 is part of alias testV1 remove or modify the alias before 
removing this collection.

Stack Trace:
org.apache.solr.client.solrj.impl.HttpSolrClient$RemoteSolrException: Error 
from server at http://127.0.0.1:41009/solr: Collection : testV1_2018-04-05_01 
is part of alias testV1 remove or modify the alias before removing this 
collection.
at 
__randomizedtesting.SeedInfo.seed([4F69BB1197F4CD09:1D71E879E85274EA]:0)
at 
org.apache.solr.client.solrj.impl.HttpSolrClient.executeMethod(HttpSolrClient.java:643)
at 
org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:255)
at 
org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:244)
at 
org.apache.solr.client.solrj.impl.LBHttpSolrClient.doRequest(LBHttpSolrClient.java:483)
at 
org.apache.solr.client.solrj.impl.LBHttpSolrClient.request(LBHttpSolrClient.java:413)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.sendRequest(CloudSolrClient.java:1106)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:886)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.request(CloudSolrClient.java:819)
at 
org.apache.solr.client.solrj.SolrRequest.process(SolrRequest.java:194)
at 
org.apache.solr.client.solrj.SolrRequest.process(SolrRequest.java:211)
at 
org.apache.solr.cloud.MiniSolrCloudCluster.deleteAllCollections(MiniSolrCloudCluster.java:451)
at 
org.apache.solr.cloud.CreateRoutedAliasTest.doBefore(CreateRoutedAliasTest.java:96)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1737)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:968)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:984)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:943)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:829)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:879)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:890)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
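The error message itself describes the fix the test's cleanup needs: remove the alias
before deleting the member collection. A minimal SolrJ sketch using the names from the
message above (the base URL is the test node's address from the failure; adjust as needed):

import org.apache.solr.client.solrj.SolrClient;
import org.apache.solr.client.solrj.impl.HttpSolrClient;
import org.apache.solr.client.solrj.request.CollectionAdminRequest;

public class DeleteAliasThenCollection {
    public static void main(String[] args) throws Exception {
        // Base URL taken from the failure above; adjust for your cluster.
        try (SolrClient client =
                 new HttpSolrClient.Builder("http://127.0.0.1:41009/solr").build()) {
            // Drop the alias first, then the collection it points at.
            CollectionAdminRequest.deleteAlias("testV1").process(client);
            CollectionAdminRequest.deleteCollection("testV1_2018-04-05_01").process(client);
        }
    }
}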

[JENKINS] Lucene-Solr-master-Solaris (64bit/jdk1.8.0) - Build # 1783 - Still Unstable!

2018-04-04 Thread Policeman Jenkins Server
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-master-Solaris/1783/
Java: 64bit/jdk1.8.0 -XX:+UseCompressedOops -XX:+UseSerialGC

1 tests failed.
FAILED:  
org.apache.solr.cloud.TestLeaderInitiatedRecoveryThread.testPublishDownState

Error Message:
expected:<27> but was:<28>

Stack Trace:
java.lang.AssertionError: expected:<27> but was:<28>
at 
__randomizedtesting.SeedInfo.seed([3D887904F615433A:63F5DBFA52080AC7]:0)
at org.junit.Assert.fail(Assert.java:93)
at org.junit.Assert.failNotEquals(Assert.java:647)
at org.junit.Assert.assertEquals(Assert.java:128)
at org.junit.Assert.assertEquals(Assert.java:472)
at org.junit.Assert.assertEquals(Assert.java:456)
at 
org.apache.solr.cloud.TestLeaderInitiatedRecoveryThread.testPublishDownState(TestLeaderInitiatedRecoveryThread.java:116)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1737)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:934)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:970)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:984)
at 
org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsFixedStatement.callStatement(BaseDistributedSearchTestCase.java:993)
at 
org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsStatement.evaluate(BaseDistributedSearchTestCase.java:968)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:943)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:829)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:879)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:890)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54)
at 

[jira] [Commented] (SOLR-9640) Support PKI authentication and SSL in standalone-mode master/slave auth with local security.json

2018-04-04 Thread Lucene/Solr QA (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-9640?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16426377#comment-16426377
 ] 

Lucene/Solr QA commented on SOLR-9640:
--

| (x) *{color:red}-1 overall{color}* |
\\
\\
|| Vote || Subsystem || Runtime || Comment ||
| {color:red}-1{color} | {color:red} patch {color} | {color:red}  0m  6s{color} 
| {color:red} SOLR-9640 does not apply to master. Rebase required? Wrong 
Branch? See 
https://wiki.apache.org/solr/HowToContribute#Creating_the_patch_file for help. 
{color} |
\\
\\
|| Subsystem || Report/Notes ||
| JIRA Issue | SOLR-9640 |
| JIRA Patch URL | 
https://issues.apache.org/jira/secure/attachment/12917531/SOLR-9640.patch |
| Console output | 
https://builds.apache.org/job/PreCommit-SOLR-Build/36/console |
| Powered by | Apache Yetus 0.7.0   http://yetus.apache.org |


This message was automatically generated.



> Support PKI authentication and SSL in standalone-mode master/slave auth with 
> local security.json
> 
>
> Key: SOLR-9640
> URL: https://issues.apache.org/jira/browse/SOLR-9640
> Project: Solr
>  Issue Type: New Feature
>  Security Level: Public(Default Security Level. Issues are Public) 
>  Components: security
>Reporter: Jan Høydahl
>Assignee: Jan Høydahl
>Priority: Major
>  Labels: authentication, pki
> Fix For: 7.4, master (8.0)
>
> Attachments: SOLR-9640.patch, SOLR-9640.patch, SOLR-9640.patch, 
> SOLR-9640.patch, SOLR-9640.patch, SOLR-9640.patch
>
>  Time Spent: 10m
>  Remaining Estimate: 0h
>
> While working with SOLR-9481 I managed to secure Solr standalone on a 
> single-node server. However, when adding 
> {{=localhost:8081/solr/foo,localhost:8082/solr/foo}} to the request, I 
> get a 401 error. This issue will fix PKI auth to work for standalone, which 
> should automatically make both sharding and master/slave index replication 
> work.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



Re: TestSSLRandomization is failing everytime

2018-04-04 Thread Joel Bernstein
Nice little program. I got the same result, so the digests are not the
issue:

SHA == SHA

SHA == SHA-1

SHA != SHA-256

SHA != SHA-512

SHA-1 == SHA

SHA-1 == SHA-1

SHA-1 != SHA-256

SHA-1 != SHA-512

SHA-256 != SHA

SHA-256 != SHA-1

SHA-256 == SHA-256

SHA-256 != SHA-512

SHA-512 != SHA

SHA-512 != SHA-1

SHA-512 != SHA-256

SHA-512 == SHA-512

So I got to thinking: if it's not the digest, then the password must be the
problem. So I checked my env and, what do you know:

SOLR_SSL_KEY_STORE_PASSWORD=joelbern

I unset this and the test passes. So the issue is that if you are using the
environment to test various SSL parameters on a live system, the setting leaks
over into the test cases. Perhaps we should stamp this out.
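
For what it's worth, a minimal sketch of the kind of guard a test setup could add,
assuming we just want to fail fast when that variable is exported (the class and
method names here are hypothetical, not existing test-framework hooks):

import org.junit.BeforeClass;

public class SSLEnvGuard {
    @BeforeClass
    public static void ensureCleanSslEnvironment() {
        // Hypothetical guard: a keystore password exported for manual SSL
        // experiments can leak into the randomized SSL tests and make the
        // test keystore fail its password check.
        if (System.getenv("SOLR_SSL_KEY_STORE_PASSWORD") != null) {
            throw new AssertionError("SOLR_SSL_KEY_STORE_PASSWORD is set in the "
                + "environment; unset it before running the SSL test suites.");
        }
    }
}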







Joel Bernstein
http://joelsolr.blogspot.com/

On Wed, Apr 4, 2018 at 7:10 PM, Chris Hostetter 
wrote:

>
> : I have not been able to get the actual SHA implementation (SHA-1,
> : SHA-256...) from the MessageDigest instance. If we could get that, I
> : suspect it would be different on my machine than yours.
>
> How about this...
>
> import java.security.MessageDigest;
> public final class Temp {
>   public static void main(String[] args) throws Exception {
> final byte[] INPUT = "How now brown Cow?".getBytes("UTF-8");
> final String[] ALGOS =  new String[]{"SHA", "SHA-1", "SHA-256",
> "SHA-512"};
> final byte[][] OUTPUT = new byte[ALGOS.length][];
> for (int a = 0; a < ALGOS.length; a++) {
>   final MessageDigest d = MessageDigest.getInstance(ALGOS[a]);
>   d.update(INPUT);
>   OUTPUT[a] = d.digest();
> }
> for (int x = 0; x < ALGOS.length; x++) {
>   for (int y = 0; y < ALGOS.length; y++) {
> System.out.println(ALGOS[x] +
>(MessageDigest.isEqual(OUTPUT[x], OUTPUT[y]) ?
> " == " : " != ") +
>ALGOS[y]);
>   }
> }
>   }
> }
>
>
> hossman@tray:~/tmp$ javac Temp.java
> hossman@tray:~/tmp$ java -ea Temp
> SHA == SHA
> SHA == SHA-1
> SHA != SHA-256
> SHA != SHA-512
> SHA-1 == SHA
> SHA-1 == SHA-1
> SHA-1 != SHA-256
> SHA-1 != SHA-512
> SHA-256 != SHA
> SHA-256 != SHA-1
> SHA-256 == SHA-256
> SHA-256 != SHA-512
> SHA-512 != SHA
> SHA-512 != SHA-1
> SHA-512 != SHA-256
> SHA-512 == SHA-512
>
>
>
>
>
> -Hoss
> http://www.lucidworks.com/
>
> -
> To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
> For additional commands, e-mail: dev-h...@lucene.apache.org
>
>


[jira] [Commented] (LUCENE-7935) Release .sha512 hash files with our artifacts

2018-04-04 Thread ASF subversion and git services (JIRA)

[ 
https://issues.apache.org/jira/browse/LUCENE-7935?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16426364#comment-16426364
 ] 

ASF subversion and git services commented on LUCENE-7935:
-

Commit dced5ae3742a747e96843055ece18a8f34f9b3d0 in lucene-solr's branch 
refs/heads/branch_7x from [~janhoy]
[ https://git-wip-us.apache.org/repos/asf?p=lucene-solr.git;h=dced5ae ]

LUCENE-7935: Keep md5/sha1 checksums for maven artifacts

(cherry picked from commit 60ae7be)


> Release .sha512 hash files with our artifacts
> -
>
> Key: LUCENE-7935
> URL: https://issues.apache.org/jira/browse/LUCENE-7935
> Project: Lucene - Core
>  Issue Type: Improvement
>  Components: general/build
>Reporter: Jan Høydahl
>Assignee: Jan Høydahl
>Priority: Major
> Fix For: 7.4, master (8.0)
>
> Attachments: LUCENE-7935.patch, LUCENE-7935.patch, 
> LUCENE-7935_smokefail.patch
>
>
> Currently we are only required to release {{.md5}} hashes with our artifacts, 
> and we also include {{.sha1}} files. It is expected that {{.sha512}} will be 
> required in the future (see 
> https://www.apache.org/dev/release-signing.html#sha1), so why not start 
> generating them right away?
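
(Generating the hash itself is trivial; a minimal JDK-only sketch, with an illustrative
artifact name rather than the build's actual checksum tooling:)

import java.nio.file.Files;
import java.nio.file.Paths;
import java.security.MessageDigest;

public class Sha512File {
    public static void main(String[] args) throws Exception {
        // Illustrative artifact name; any release file works the same way
        // (reads the whole file into memory, fine for a sketch).
        byte[] data = Files.readAllBytes(Paths.get("lucene-7.4.0-src.tgz"));
        byte[] digest = MessageDigest.getInstance("SHA-512").digest(data);
        StringBuilder hex = new StringBuilder();
        for (byte b : digest) {
            hex.append(String.format("%02x", b));
        }
        // The hex string is what a .sha512 sidecar file would contain.
        System.out.println(hex);
    }
}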



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (LUCENE-7935) Release .sha512 hash files with our artifacts

2018-04-04 Thread ASF subversion and git services (JIRA)

[ 
https://issues.apache.org/jira/browse/LUCENE-7935?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16426361#comment-16426361
 ] 

ASF subversion and git services commented on LUCENE-7935:
-

Commit 60ae7be40786d6f8a5c5c8393875bf986d2b8877 in lucene-solr's branch 
refs/heads/master from [~janhoy]
[ https://git-wip-us.apache.org/repos/asf?p=lucene-solr.git;h=60ae7be ]

LUCENE-7935: Keep md5/sha1 checksums for maven artifacts


> Release .sha512 hash files with our artifacts
> -
>
> Key: LUCENE-7935
> URL: https://issues.apache.org/jira/browse/LUCENE-7935
> Project: Lucene - Core
>  Issue Type: Improvement
>  Components: general/build
>Reporter: Jan Høydahl
>Assignee: Jan Høydahl
>Priority: Major
> Fix For: 7.4, master (8.0)
>
> Attachments: LUCENE-7935.patch, LUCENE-7935.patch, 
> LUCENE-7935_smokefail.patch
>
>
> Currently we are only required to release {{.md5}} hashes with our artifacts, 
> and we also include {{.sha1}} files. It is expected that {{.sha512}} will be 
> required in the future (see 
> https://www.apache.org/dev/release-signing.html#sha1), so why not start 
> generating them right away?



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-12185) Can't change Single Valued field to Multi Valued even by deleting/readding

2018-04-04 Thread Shawn Heisey (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-12185?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16426322#comment-16426322
 ] 

Shawn Heisey commented on SOLR-12185:
-

bq.  Solr can't purge anything when you delete a field, 

Followup on this point:

You can't just delete a field in a Lucene index.  Reindexing every document 
containing that field (and making sure that the field is NOT in the new 
document) would be required.  Reindexing a document involves marking the old 
copy as deleted and indexing a new copy, so you've got the same issue -- old 
versions with that field are STILL in the index, unless you force a merge of 
the entire index to purge deleted documents.

Solr uses the term "optimize" for what Lucene has renamed to forceMerge.  That 
operation can quite literally take hours, and the I/O required can affect 
performance drastically, so it's NOT something you want to happen automatically.
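
For reference, a forced merge only happens when it is requested explicitly; a minimal
SolrJ sketch (the base URL and core name are placeholders) looks like this:

import org.apache.solr.client.solrj.SolrClient;
import org.apache.solr.client.solrj.impl.HttpSolrClient;

public class ForceMergeExample {
    public static void main(String[] args) throws Exception {
        // Placeholder base URL and core name -- adjust for your install.
        try (SolrClient client =
                 new HttpSolrClient.Builder("http://localhost:8983/solr/mycore").build()) {
            // "optimize" is Solr's name for Lucene's forceMerge; on a large
            // index this can run for hours and is heavy on I/O, as noted above.
            client.optimize();
        }
    }
}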


> Can't change Single Valued field to Multi Valued even by deleting/readding
> --
>
> Key: SOLR-12185
> URL: https://issues.apache.org/jira/browse/SOLR-12185
> Project: Solr
>  Issue Type: Bug
>  Security Level: Public(Default Security Level. Issues are Public) 
>  Components: Schema and Analysis
>Affects Versions: 7.1
>Reporter: Cetra Free
>Priority: Major
>
> Changing a single-valued field to multi-valued field with doc values breaks 
> things.  This doesn't matter if you change the field or do a complete delete 
> and re-add of the field.  The only way I have found to "fix" this is to 
> delete the entire core from disk and re-add it.
> h2. Steps to replicate:
>  * Create a field, make it single valued with doc values
>  * Index a couple of docs
>  * Delete the field
>  * Add the field again with the same name, but change it to multiValued
>  * Try indexing a couple of docs
> h2. Expected result:
> The documents are indexed correctly and there are no issues
> h2. Actual outcome:
> The documents refuse to be indexed and you see this in the logs:
> {code:java}
> org.apache.solr.common.SolrException: Exception writing document id 
> 6a3226c8-c904-40d7-aecb-76c3515db7b8 to the index; possible analysis error: 
> cannot change DocValues type from SORTED to SORTED_SET for field 
> "example_field"
>     at 
> org.apache.solr.update.DirectUpdateHandler2.addDoc(DirectUpdateHandler2.java:221)
>     at 
> org.apache.solr.update.processor.RunUpdateProcessor.processAdd(RunUpdateProcessorFactory.java:67)
>     at 
> org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:55)
>     at 
> org.apache.solr.update.processor.DistributedUpdateProcessor.doLocalAdd(DistributedUpdateProcessor.java:991)
>     at 
> org.apache.solr.update.processor.DistributedUpdateProcessor.versionAdd(DistributedUpdateProcessor.java:1207)
>     at 
> org.apache.solr.update.processor.DistributedUpdateProcessor.processAdd(DistributedUpdateProcessor.java:753)
>     at 
> org.apache.solr.update.processor.LogUpdateProcessorFactory$LogUpdateProcessor.processAdd(LogUpdateProcessorFactory.java:103)
>     at 
> org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:55)
>     at 
> org.apache.solr.update.processor.AddSchemaFieldsUpdateProcessorFactory$AddSchemaFieldsUpdateProcessor.processAdd(AddSchemaFieldsUpdateProcessorFactory.java:474)
>     at 
> org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:55)
>     at 
> org.apache.solr.update.processor.FieldMutatingUpdateProcessor.processAdd(FieldMutatingUpdateProcessor.java:118)
>     at 
> org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:55)
>     at 
> org.apache.solr.update.processor.FieldMutatingUpdateProcessor.processAdd(FieldMutatingUpdateProcessor.java:118)
>     at 
> org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:55)
>     at 
> org.apache.solr.update.processor.FieldMutatingUpdateProcessor.processAdd(FieldMutatingUpdateProcessor.java:118)
>     at 
> org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:55)
>     at 
> org.apache.solr.update.processor.FieldMutatingUpdateProcessor.processAdd(FieldMutatingUpdateProcessor.java:118)
>     at 
> org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:55)
>     at 
> org.apache.solr.update.processor.FieldNameMutatingUpdateProcessorFactory$1.processAdd(FieldNameMutatingUpdateProcessorFactory.java:74)
>     at 
> org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:55)
>     at 
> 

[jira] [Commented] (SOLR-12185) Can't change Single Valued field to Multi Valued even by deleting/readding

2018-04-04 Thread Shawn Heisey (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-12185?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16426311#comment-16426311
 ] 

Shawn Heisey commented on SOLR-12185:
-

bq. It would be nice if you could fully purge a field when you delete it

Changes to the schema make ZERO changes to the Lucene index.  Lucene doesn't 
know about the concept of a schema.  Solr can't purge anything when you delete 
a field, because it NEVER changes the index based on schema changes.

Also, Solr isn't even aware that you've changed the schema.  When you restart 
Solr or reload indexes to make a schema change active, all information about 
the previous schema is gone.  It only knows about the schema that's present 
right at that moment.
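
To illustrate that the constraint lives in the Lucene index itself rather than in the
schema, here is a minimal stand-alone sketch (field name taken from the report; the
in-memory directory and analyzer are arbitrary choices) that reproduces the same error:

import org.apache.lucene.analysis.standard.StandardAnalyzer;
import org.apache.lucene.document.Document;
import org.apache.lucene.document.SortedDocValuesField;
import org.apache.lucene.document.SortedSetDocValuesField;
import org.apache.lucene.index.IndexWriter;
import org.apache.lucene.index.IndexWriterConfig;
import org.apache.lucene.store.RAMDirectory;
import org.apache.lucene.util.BytesRef;

public class DocValuesTypeConflict {
    public static void main(String[] args) throws Exception {
        try (RAMDirectory dir = new RAMDirectory();
             IndexWriter writer = new IndexWriter(dir,
                 new IndexWriterConfig(new StandardAnalyzer()))) {
            Document first = new Document();
            // Single-valued docValues field, as in the original schema.
            first.add(new SortedDocValuesField("example_field", new BytesRef("a")));
            writer.addDocument(first);
            writer.commit();

            Document second = new Document();
            // Multi-valued docValues field with the same name.
            second.add(new SortedSetDocValuesField("example_field", new BytesRef("b")));
            // Throws: cannot change DocValues type from SORTED to SORTED_SET
            writer.addDocument(second);
        }
    }
}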


> Can't change Single Valued field to Multi Valued even by deleting/readding
> --
>
> Key: SOLR-12185
> URL: https://issues.apache.org/jira/browse/SOLR-12185
> Project: Solr
>  Issue Type: Bug
>  Security Level: Public(Default Security Level. Issues are Public) 
>  Components: Schema and Analysis
>Affects Versions: 7.1
>Reporter: Cetra Free
>Priority: Major
>
> Changing a single-valued field to multi-valued field with doc values breaks 
> things.  This doesn't matter if you change the field or do a complete delete 
> and re-add of the field.  The only way I have found to "fix" this is to 
> delete the entire core from disk and re-add it.
> h2. Steps to replicate:
>  * Create a field, make it single valued with doc values
>  * Index a couple of docs
>  * Delete the field
>  * Add the field again with the same name, but change it to multiValued
>  * Try indexing a couple of docs
> h2. Expected result:
> The documents are indexed correctly and there are no issues
> h2. Actual outcome:
> The documents refuse to be indexed and you see this in the logs:
> {code:java}
> org.apache.solr.common.SolrException: Exception writing document id 
> 6a3226c8-c904-40d7-aecb-76c3515db7b8 to the index; possible analysis error: 
> cannot change DocValues type from SORTED to SORTED_SET for field 
> "example_field"
>     at 
> org.apache.solr.update.DirectUpdateHandler2.addDoc(DirectUpdateHandler2.java:221)
>     at 
> org.apache.solr.update.processor.RunUpdateProcessor.processAdd(RunUpdateProcessorFactory.java:67)
>     at 
> org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:55)
>     at 
> org.apache.solr.update.processor.DistributedUpdateProcessor.doLocalAdd(DistributedUpdateProcessor.java:991)
>     at 
> org.apache.solr.update.processor.DistributedUpdateProcessor.versionAdd(DistributedUpdateProcessor.java:1207)
>     at 
> org.apache.solr.update.processor.DistributedUpdateProcessor.processAdd(DistributedUpdateProcessor.java:753)
>     at 
> org.apache.solr.update.processor.LogUpdateProcessorFactory$LogUpdateProcessor.processAdd(LogUpdateProcessorFactory.java:103)
>     at 
> org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:55)
>     at 
> org.apache.solr.update.processor.AddSchemaFieldsUpdateProcessorFactory$AddSchemaFieldsUpdateProcessor.processAdd(AddSchemaFieldsUpdateProcessorFactory.java:474)
>     at 
> org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:55)
>     at 
> org.apache.solr.update.processor.FieldMutatingUpdateProcessor.processAdd(FieldMutatingUpdateProcessor.java:118)
>     at 
> org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:55)
>     at 
> org.apache.solr.update.processor.FieldMutatingUpdateProcessor.processAdd(FieldMutatingUpdateProcessor.java:118)
>     at 
> org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:55)
>     at 
> org.apache.solr.update.processor.FieldMutatingUpdateProcessor.processAdd(FieldMutatingUpdateProcessor.java:118)
>     at 
> org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:55)
>     at 
> org.apache.solr.update.processor.FieldMutatingUpdateProcessor.processAdd(FieldMutatingUpdateProcessor.java:118)
>     at 
> org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:55)
>     at 
> org.apache.solr.update.processor.FieldNameMutatingUpdateProcessorFactory$1.processAdd(FieldNameMutatingUpdateProcessorFactory.java:74)
>     at 
> org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:55)
>     at 
> org.apache.solr.update.processor.FieldMutatingUpdateProcessor.processAdd(FieldMutatingUpdateProcessor.java:118)
>     at 
> 

Re: TestSSLRandomization is failing everytime

2018-04-04 Thread Chris Hostetter

: I have not been able to get the actual SHA implementation (SHA-1,
: SHA-256...) from the MessageDigest instance. If we could get that, I
: suspect it would be different on my machine than yours.

How about this...

import java.security.MessageDigest;
public final class Temp {
  public static void main(String[] args) throws Exception {
final byte[] INPUT = "How now brown Cow?".getBytes("UTF-8");
final String[] ALGOS =  new String[]{"SHA", "SHA-1", "SHA-256", 
"SHA-512"};
final byte[][] OUTPUT = new byte[ALGOS.length][];
for (int a = 0; a < ALGOS.length; a++) {
  final MessageDigest d = MessageDigest.getInstance(ALGOS[a]);
  d.update(INPUT);
  OUTPUT[a] = d.digest();
}
for (int x = 0; x < ALGOS.length; x++) {
  for (int y = 0; y < ALGOS.length; y++) {
System.out.println(ALGOS[x] +
   (MessageDigest.isEqual(OUTPUT[x], OUTPUT[y]) ? " == " : " != ") +
   ALGOS[y]);
  }
}
  }
}


hossman@tray:~/tmp$ javac Temp.java 
hossman@tray:~/tmp$ java -ea Temp
SHA == SHA
SHA == SHA-1
SHA != SHA-256
SHA != SHA-512
SHA-1 == SHA
SHA-1 == SHA-1
SHA-1 != SHA-256
SHA-1 != SHA-512
SHA-256 != SHA
SHA-256 != SHA-1
SHA-256 == SHA-256
SHA-256 != SHA-512
SHA-512 != SHA
SHA-512 != SHA-1
SHA-512 != SHA-256
SHA-512 == SHA-512





-Hoss
http://www.lucidworks.com/

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



Re: TestSSLRandomization is failing everytime

2018-04-04 Thread Joel Bernstein
I get the same result:

SHA Message Digest from SUN, 


SHA

SUN version 1.8

I have not been able to get the actual SHA implementation (SHA-1,
SHA-256...) from the MessageDigest instance. If we could get that, I
suspect it would be different on my machine than yours.




Joel Bernstein
http://joelsolr.blogspot.com/

On Wed, Apr 4, 2018 at 6:36 PM, Chris Hostetter 
wrote:

> : http://grepcode.com/file_/repository.grepcode.com/java/
> root/jdk/openjdk/8u40-b25/sun/security/provider/
> JavaKeyStore.java/?v=source
> :
> : The getPreKeyedHash method is where MessageDigest.getInstance("SHA") is
> : called. From everything I've read this code is incorrect because SHA is
> not
> : a valid algorithm.
>
> Interesting... what exactly does your JVM produce if you run this code...
>
> public static void main(String[] args) throws Exception {
> java.security.MessageDigest x = java.security.MessageDigest.
> getInstance("SHA");
> System.out.println(x.toString());
> System.out.println(x.getAlgorithm());
> System.out.println(x.getProvider().toString());
> }
>
> On my system i get...
>
> ---
> hossman@tray:~/tmp$ java -ea Temp
> SHA Message Digest from SUN, 
>
> SHA
> SUN version 1.8
> ---
>
> It perplexes me that in the javadocs for MessageDigest the sample usage
> code shows 'MessageDigest.getInstance("SHA");' as the very first line, but
> then lower down it says...
>
> >> Every implementation of the Java platform is required to support the
> >> following standard MessageDigest algorithms:
> >>
> >>  * MD5
> >>  * SHA-1
> >>  * SHA-256
>
> ...and the linked to "Java Cryptography Architecture Standard Algorithm
> Name Documentation" doesn't mention "SHA" but does mention the various
> "SHA-n" specific impls...
>
> https://docs.oracle.com/javase/8/docs/api/java/security/MessageDigest.html
> https://docs.oracle.com/javase/8/docs/technotes/
> guides/security/StandardNames.html#MessageDigest
>
>
>
> -Hoss
> http://www.lucidworks.com/
>
> -
> To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
> For additional commands, e-mail: dev-h...@lucene.apache.org
>
>


[jira] [Commented] (SOLR-12185) Can't change Single Valued field to Multi Valued even by deleting/readding

2018-04-04 Thread Cetra Free (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-12185?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16426285#comment-16426285
 ] 

Cetra Free commented on SOLR-12185:
---

I've raised this as a ticket to see if it gets traction: 
https://issues.apache.org/jira/browse/LUCENE-8235

It would be nice if you could fully purge a field when you delete it, rather 
than facebook it and just mark it as deleted but keep it around :)

> Can't change Single Valued field to Multi Valued even by deleting/readding
> --
>
> Key: SOLR-12185
> URL: https://issues.apache.org/jira/browse/SOLR-12185
> Project: Solr
>  Issue Type: Bug
>  Security Level: Public(Default Security Level. Issues are Public) 
>  Components: Schema and Analysis
>Affects Versions: 7.1
>Reporter: Cetra Free
>Priority: Major
>
> Changing a single-valued field to multi-valued field with doc values breaks 
> things.  This doesn't matter if you change the field or do a complete delete 
> and re-add of the field.  The only way I have found to "fix" this is to 
> delete the entire core from disk and re-add it.
> h2. Steps to replicate:
>  * Create a field, make it single valued with doc values
>  * Index a couple of docs
>  * Delete the field
>  * Add the field again with the same name, but change it to multiValued
>  * Try indexing a couple of docs
> h2. Expected result:
> The documents are indexed correctly and there are no issues
> h2. Actual outcome:
> The documents refuse to be indexed and you see this in the logs:
> {code:java}
> org.apache.solr.common.SolrException: Exception writing document id 
> 6a3226c8-c904-40d7-aecb-76c3515db7b8 to the index; possible analysis error: 
> cannot change DocValues type from SORTED to SORTED_SET for field 
> "example_field"
>     at 
> org.apache.solr.update.DirectUpdateHandler2.addDoc(DirectUpdateHandler2.java:221)
>     at 
> org.apache.solr.update.processor.RunUpdateProcessor.processAdd(RunUpdateProcessorFactory.java:67)
>     at 
> org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:55)
>     at 
> org.apache.solr.update.processor.DistributedUpdateProcessor.doLocalAdd(DistributedUpdateProcessor.java:991)
>     at 
> org.apache.solr.update.processor.DistributedUpdateProcessor.versionAdd(DistributedUpdateProcessor.java:1207)
>     at 
> org.apache.solr.update.processor.DistributedUpdateProcessor.processAdd(DistributedUpdateProcessor.java:753)
>     at 
> org.apache.solr.update.processor.LogUpdateProcessorFactory$LogUpdateProcessor.processAdd(LogUpdateProcessorFactory.java:103)
>     at 
> org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:55)
>     at 
> org.apache.solr.update.processor.AddSchemaFieldsUpdateProcessorFactory$AddSchemaFieldsUpdateProcessor.processAdd(AddSchemaFieldsUpdateProcessorFactory.java:474)
>     at 
> org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:55)
>     at 
> org.apache.solr.update.processor.FieldMutatingUpdateProcessor.processAdd(FieldMutatingUpdateProcessor.java:118)
>     at 
> org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:55)
>     at 
> org.apache.solr.update.processor.FieldMutatingUpdateProcessor.processAdd(FieldMutatingUpdateProcessor.java:118)
>     at 
> org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:55)
>     at 
> org.apache.solr.update.processor.FieldMutatingUpdateProcessor.processAdd(FieldMutatingUpdateProcessor.java:118)
>     at 
> org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:55)
>     at 
> org.apache.solr.update.processor.FieldMutatingUpdateProcessor.processAdd(FieldMutatingUpdateProcessor.java:118)
>     at 
> org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:55)
>     at 
> org.apache.solr.update.processor.FieldNameMutatingUpdateProcessorFactory$1.processAdd(FieldNameMutatingUpdateProcessorFactory.java:74)
>     at 
> org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:55)
>     at 
> org.apache.solr.update.processor.FieldMutatingUpdateProcessor.processAdd(FieldMutatingUpdateProcessor.java:118)
>     at 
> org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:55)
>     at 
> org.apache.solr.update.processor.AbstractDefaultValueUpdateProcessorFactory$DefaultValueUpdateProcessor.processAdd(AbstractDefaultValueUpdateProcessorFactory.java:91)
>     at 
> org.apache.solr.handler.dataimport.SolrWriter.upload(SolrWriter.java:80)
>     at 
> 

Re: VOTE: Apache Solr Reference Guide for Solr 7.3 RC1

2018-04-04 Thread Karthik Ramachandran
Just saw the 7.3.0 Solr changes page.  In the version section, Apache Tika's
version is listed as 1.16, but with SOLR-11701 Tika was upgraded to 1.17.
Should it be updated to show the latest version?

On Wed, Apr 4, 2018 at 10:48 AM, Cassandra Targett 
wrote:

> Thanks everyone, this vote has passed. I'll start the release process this
> afternoon.
>
> On Tue, Apr 3, 2018 at 7:00 PM, Anshum Gupta 
> wrote:
>
>> +1
>>
>> On Tue, Apr 3, 2018 at 2:25 PM Tomas Fernandez Lobbe 
>> wrote:
>>
>>> +1
>>>
>>>
>>> On Apr 3, 2018, at 12:45 PM, Varun Thacker  wrote:
>>>
>>> +1
>>>
>>> On Tue, Apr 3, 2018 at 10:47 AM, Steve Rowe  wrote:
>>>
 +1

 --
 Steve
 www.lucidworks.com

 > On Apr 3, 2018, at 10:06 AM, Mikhail Khludnev 
 wrote:
 >
 > I've looked through recent changes in PDF. It seems good.
 >
 > On Tue, Apr 3, 2018 at 4:32 PM, Cassandra Targett <
 casstarg...@gmail.com> wrote:
 > Reminder about this.
 >
 > It looks like the Lucene/Solr release vote is going to pass, so we
 could have both released at about the same time.
 >
 > Thanks,
 > Cassandra
 >
 > On Thu, Mar 29, 2018 at 10:49 AM, Cassandra Targett <
 casstarg...@gmail.com> wrote:
 > Please vote to release the Apache Solr Reference Guide for Solr 7.3.
 >
 > The artifacts can be downloaded from:
 > https://dist.apache.org/repos/dist/dev/lucene/solr/ref-guide
 /apache-solr-ref-guide-7.3-RC1/
 >
 > $ cat apache-solr-ref-guide-7.3.pdf.sha1
 > 151f06d920d1ac41564f3c0ddabae3c2c36b6892
 apache-solr-ref-guide-7.3.pdf
 >
 > The HTML version has also been uploaded to the website:
 > https://lucene.apache.org/solr/guide/7_3/
 >
 > Here's my +1.
 >
 > If it happens that this vote passes before the vote for the final
 Lucene/Solr RC is complete, I'll hold release/announcement of the Ref Guide
 until the vote is complete and the release steps are finished.
 >
 > Thanks,
 > Cassandra
 >
 >
 >
 >
 > --
 > Sincerely yours
 > Mikhail Khludnev


 -
 To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
 For additional commands, e-mail: dev-h...@lucene.apache.org


>>>
>>>
>


-- 
With Thanks & Regards
Karthik Ramachandran

Please don't print this e-mail unless you really need to


Re: TestSSLRandomization is failing everytime

2018-04-04 Thread Chris Hostetter
: 
http://grepcode.com/file_/repository.grepcode.com/java/root/jdk/openjdk/8u40-b25/sun/security/provider/JavaKeyStore.java/?v=source
: 
: The getPreKeyedHash method is where MessageDigest.getInstance("SHA") is
: called. From everything I've read this code is incorrect because SHA is not
: a valid algorithm.

Interesting... what exactly does your JVM produce if you run this code...

public static void main(String[] args) throws Exception {
java.security.MessageDigest x = 
java.security.MessageDigest.getInstance("SHA");
System.out.println(x.toString());
System.out.println(x.getAlgorithm());
System.out.println(x.getProvider().toString());
}

On my system i get...

---
hossman@tray:~/tmp$ java -ea Temp
SHA Message Digest from SUN, 

SHA
SUN version 1.8
---

It perplexes me that in the javadocs for MessageDigest the sample usage 
code shows 'MessageDigest.getInstance("SHA");' as the very first line, but 
then lower down it says...

>> Every implementation of the Java platform is required to support the 
>> following standard MessageDigest algorithms:
>>
>>  * MD5
>>  * SHA-1
>>  * SHA-256

...and the linked to "Java Cryptography Architecture Standard Algorithm 
Name Documentation" doesn't mention "SHA" but does mention the various 
"SHA-n" specific impls...

https://docs.oracle.com/javase/8/docs/api/java/security/MessageDigest.html
https://docs.oracle.com/javase/8/docs/technotes/guides/security/StandardNames.html#MessageDigest



-Hoss
http://www.lucidworks.com/

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[JENKINS] Lucene-Solr-repro - Build # 428 - Unstable

2018-04-04 Thread Apache Jenkins Server
Build: https://builds.apache.org/job/Lucene-Solr-repro/428/

[...truncated 28 lines...]
[repro] Jenkins log URL: 
https://builds.apache.org/job/Lucene-Solr-BadApples-Tests-7.x/31/consoleText

[repro] Revision: 3c68f3d63769ec1e9c7400a0974837f051046a65

[repro] Repro line:  ant test  -Dtestcase=ComputePlanActionTest 
-Dtests.method=testNodeLost -Dtests.seed=AE5AB9149E50D967 -Dtests.multiplier=2 
-Dtests.slow=true -Dtests.badapples=true -Dtests.locale=ar-IQ 
-Dtests.timezone=America/Cayman -Dtests.asserts=true -Dtests.file.encoding=UTF-8

[repro] Repro line:  ant test  -Dtestcase=TestTriggerIntegration 
-Dtests.method=testSearchRate -Dtests.seed=AE5AB9149E50D967 
-Dtests.multiplier=2 -Dtests.slow=true -Dtests.badapples=true -Dtests.locale=pl 
-Dtests.timezone=America/Glace_Bay -Dtests.asserts=true 
-Dtests.file.encoding=UTF-8

[repro] Repro line:  ant test  -Dtestcase=TestTriggerIntegration 
-Dtests.method=testListeners -Dtests.seed=AE5AB9149E50D967 -Dtests.multiplier=2 
-Dtests.slow=true -Dtests.badapples=true -Dtests.locale=pl 
-Dtests.timezone=America/Glace_Bay -Dtests.asserts=true 
-Dtests.file.encoding=UTF-8

[repro] Repro line:  ant test  -Dtestcase=TestTlogReplica 
-Dtests.method=testRecovery -Dtests.seed=AE5AB9149E50D967 -Dtests.multiplier=2 
-Dtests.slow=true -Dtests.badapples=true -Dtests.locale=zh-HK 
-Dtests.timezone=America/Vancouver -Dtests.asserts=true 
-Dtests.file.encoding=UTF-8

[repro] Repro line:  ant test  -Dtestcase=TestReplicationHandler 
-Dtests.method=doTestIndexFetchOnMasterRestart -Dtests.seed=AE5AB9149E50D967 
-Dtests.multiplier=2 -Dtests.slow=true -Dtests.badapples=true 
-Dtests.locale=pt-BR -Dtests.timezone=Africa/Asmara -Dtests.asserts=true 
-Dtests.file.encoding=UTF-8

[repro] Repro line:  ant test  -Dtestcase=HttpPartitionTest -Dtests.method=test 
-Dtests.seed=AE5AB9149E50D967 -Dtests.multiplier=2 -Dtests.slow=true 
-Dtests.badapples=true -Dtests.locale=hr -Dtests.timezone=Asia/Kuwait 
-Dtests.asserts=true -Dtests.file.encoding=UTF-8

[repro] git rev-parse --abbrev-ref HEAD
[repro] git rev-parse HEAD
[repro] Initial local git branch/revision: 
2ace16cef7d329b5a2ac66945838a42e58311916
[repro] git fetch

[...truncated 2 lines...]
[repro] git checkout 3c68f3d63769ec1e9c7400a0974837f051046a65

[...truncated 2 lines...]
[repro] git merge --ff-only

[...truncated 1 lines...]
[repro] ant clean

[...truncated 6 lines...]
[repro] Test suites by module:
[repro]   solr/core
[repro]   TestReplicationHandler
[repro]   TestTlogReplica
[repro]   TestTriggerIntegration
[repro]   HttpPartitionTest
[repro]   ComputePlanActionTest
[repro] ant compile-test

[...truncated 3315 lines...]
[repro] ant test-nocompile -Dtests.dups=5 -Dtests.maxfailures=25 
-Dtests.class="*.TestReplicationHandler|*.TestTlogReplica|*.TestTriggerIntegration|*.HttpPartitionTest|*.ComputePlanActionTest"
 -Dtests.showOutput=onerror  -Dtests.seed=AE5AB9149E50D967 -Dtests.multiplier=2 
-Dtests.slow=true -Dtests.badapples=true -Dtests.locale=pt-BR 
-Dtests.timezone=Africa/Asmara -Dtests.asserts=true -Dtests.file.encoding=UTF-8

[...truncated 3826 lines...]
[repro] Setting last failure code to 256

[repro] Failures:
[repro]   0/5 failed: org.apache.solr.cloud.HttpPartitionTest
[repro]   0/5 failed: org.apache.solr.cloud.TestTlogReplica
[repro]   0/5 failed: org.apache.solr.cloud.autoscaling.ComputePlanActionTest
[repro]   0/5 failed: org.apache.solr.handler.TestReplicationHandler
[repro]   2/5 failed: 
org.apache.solr.cloud.autoscaling.sim.TestTriggerIntegration
[repro] git checkout 2ace16cef7d329b5a2ac66945838a42e58311916

[...truncated 2 lines...]
[repro] Exiting with code 256

[...truncated 6 lines...]

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org

[jira] [Assigned] (SOLR-11982) Add support for indicating preferred replica types for queries

2018-04-04 Thread JIRA

 [ 
https://issues.apache.org/jira/browse/SOLR-11982?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Tomás Fernández Löbbe reassigned SOLR-11982:


Assignee: Tomás Fernández Löbbe

> Add support for indicating preferred replica types for queries
> --
>
> Key: SOLR-11982
> URL: https://issues.apache.org/jira/browse/SOLR-11982
> Project: Solr
>  Issue Type: Improvement
>  Security Level: Public(Default Security Level. Issues are Public) 
>  Components: SolrCloud
>Affects Versions: 7.4, master (8.0)
>Reporter: Ere Maijala
>Assignee: Tomás Fernández Löbbe
>Priority: Minor
>  Labels: patch-available, patch-with-test
> Attachments: SOLR-11982-preferReplicaTypes.patch, 
> SOLR-11982-preferReplicaTypes.patch, SOLR-11982.patch, SOLR-11982.patch, 
> SOLR-11982.patch, SOLR-11982.patch, SOLR-11982.patch, SOLR-11982.patch, 
> SOLR-11982.patch, SOLR-11982.patch
>
>
> It would be nice to have the possibility to easily sort the shards in the 
> preferred order e.g. by replica type. The attached patch adds support for 
> {{shards.sort}} parameter that allows one to sort e.g. PULL and TLOG replicas 
> first with \{{shards.sort=replicaType:PULL|TLOG }}(which would mean that NRT 
> replicas wouldn't be hit with queries unless they're the only ones available) 
> and/or to sort by replica location (like preferLocalShards=true but more 
> versatile).
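
A minimal SolrJ sketch of how a client would request this, assuming the {{shards.sort}}
parameter from the attached patch (it is a proposed parameter, not a released API):

import org.apache.solr.client.solrj.SolrQuery;

public class PreferReplicaTypeQuery {
    public static void main(String[] args) {
        SolrQuery q = new SolrQuery("*:*");
        // "shards.sort" is the parameter proposed in the attached patch;
        // this value prefers PULL, then TLOG, replicas for the query.
        q.set("shards.sort", "replicaType:PULL|TLOG");
        System.out.println(q);
    }
}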



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (LUCENE-2899) Add OpenNLP Analysis capabilities as a module

2018-04-04 Thread Lance Norskog (JIRA)

[ 
https://issues.apache.org/jira/browse/LUCENE-2899?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16426256#comment-16426256
 ] 

Lance Norskog commented on LUCENE-2899:
---

I'm so cheered up that [~steve_rowe] picked this up and added it to Solr!

> Add OpenNLP Analysis capabilities as a module
> -
>
> Key: LUCENE-2899
> URL: https://issues.apache.org/jira/browse/LUCENE-2899
> Project: Lucene - Core
>  Issue Type: New Feature
>  Components: modules/analysis
>Reporter: Grant Ingersoll
>Assignee: Steve Rowe
>Priority: Minor
> Fix For: 7.3, master (8.0)
>
> Attachments: LUCENE-2899-6.1.0.patch, LUCENE-2899-RJN.patch, 
> LUCENE-2899.patch, LUCENE-2899.patch, LUCENE-2899.patch, LUCENE-2899.patch, 
> LUCENE-2899.patch, LUCENE-2899.patch, OpenNLPFilter.java, 
> OpenNLPTokenizer.java
>
>
> Now that OpenNLP is an ASF project and has a nice license, it would be nice 
> to have a submodule (under analysis) that exposed capabilities for it. Drew 
> Farris, Tom Morton and I have code that does:
> * Sentence Detection as a Tokenizer (could also be a TokenFilter, although it 
> would have to change slightly to buffer tokens)
> * NamedEntity recognition as a TokenFilter
> We are also planning a Tokenizer/TokenFilter that can put parts of speech as 
> either payloads (PartOfSpeechAttribute?) on a token or at the same position.
> I'd propose it go under:
> modules/analysis/opennlp



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (LUCENE-2899) Add OpenNLP Analysis capabilities as a module

2018-04-04 Thread Lance Norskog (JIRA)

[ 
https://issues.apache.org/jira/browse/LUCENE-2899?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16426251#comment-16426251
 ] 

Lance Norskog commented on LUCENE-2899:
---

The last time I read up on ZK, files are limited to 1 MB. The ZK "file system" 
is intended for small configuration files. NLP models can be many megabytes, 
so you might need an alternate path (scp, or SMB file sharing on Windows) to 
distribute NLP models.

> Add OpenNLP Analysis capabilities as a module
> -
>
> Key: LUCENE-2899
> URL: https://issues.apache.org/jira/browse/LUCENE-2899
> Project: Lucene - Core
>  Issue Type: New Feature
>  Components: modules/analysis
>Reporter: Grant Ingersoll
>Assignee: Steve Rowe
>Priority: Minor
> Fix For: 7.3, master (8.0)
>
> Attachments: LUCENE-2899-6.1.0.patch, LUCENE-2899-RJN.patch, 
> LUCENE-2899.patch, LUCENE-2899.patch, LUCENE-2899.patch, LUCENE-2899.patch, 
> LUCENE-2899.patch, LUCENE-2899.patch, OpenNLPFilter.java, 
> OpenNLPTokenizer.java
>
>
> Now that OpenNLP is an ASF project and has a nice license, it would be nice 
> to have a submodule (under analysis) that exposed capabilities for it. Drew 
> Farris, Tom Morton and I have code that does:
> * Sentence Detection as a Tokenizer (could also be a TokenFilter, although it 
> would have to change slightly to buffer tokens)
> * NamedEntity recognition as a TokenFilter
> We are also planning a Tokenizer/TokenFilter that can put parts of speech as 
> either payloads (PartOfSpeechAttribute?) on a token or at the same position.
> I'd propose it go under:
> modules/analysis/opennlp



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Updated] (SOLR-12155) Solr 7.2.1 deadlock in UnInvertedField.getUnInvertedField()

2018-04-04 Thread Mikhail Khludnev (JIRA)

 [ 
https://issues.apache.org/jira/browse/SOLR-12155?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Mikhail Khludnev updated SOLR-12155:

Attachment: SOLR-12155.patch

> Solr 7.2.1 deadlock in UnInvertedField.getUnInvertedField() 
> 
>
> Key: SOLR-12155
> URL: https://issues.apache.org/jira/browse/SOLR-12155
> Project: Solr
>  Issue Type: Bug
>  Security Level: Public(Default Security Level. Issues are Public) 
>Affects Versions: 7.2.1
>Reporter: Kishor gandham
>Priority: Major
> Attachments: SOLR-12155.patch, SOLR-12155.patch, stack.txt
>
>
> I am attaching a stack trace from our production Solr (7.2.1). Occasionally, 
> we are seeing SOLR becoming unresponsive. We are then forced to kill the JVM 
> and start solr again.
> We have a lot of facet queries and our index has approximately 15 million 
> documents. We have recently started using json.facet queries and some of the 
> facet fields use DocValues.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Updated] (SOLR-12155) Solr 7.2.1 deadlock in UnInvertedField.getUnInvertedField()

2018-04-04 Thread Mikhail Khludnev (JIRA)

 [ 
https://issues.apache.org/jira/browse/SOLR-12155?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Mikhail Khludnev updated SOLR-12155:

Attachment: (was: SOLR-12155.patch)

> Solr 7.2.1 deadlock in UnInvertedField.getUnInvertedField() 
> 
>
> Key: SOLR-12155
> URL: https://issues.apache.org/jira/browse/SOLR-12155
> Project: Solr
>  Issue Type: Bug
>  Security Level: Public(Default Security Level. Issues are Public) 
>Affects Versions: 7.2.1
>Reporter: Kishor gandham
>Priority: Major
> Attachments: SOLR-12155.patch, stack.txt
>
>
> I am attaching a stack trace from our production Solr (7.2.1). Occasionally, 
> we are seeing SOLR becoming unresponsive. We are then forced to kill the JVM 
> and start solr again.
> We have a lot of facet queries and our index has approximately 15 million 
> documents. We have recently started using json.facet queries and some of the 
> facet fields use DocValues.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



Re: TestSSLRandomization is failing everytime

2018-04-04 Thread Joel Bernstein
The code:

public static void main(String[] args) throws Exception {
    System.out.println(javax.net.ssl.SSLContext.getDefault().getProvider());
    System.out.println(javax.net.ssl.SSLContext.getDefault().getProtocol());
}

returns this from the command line:

SunJSSE version 1.8

Default

I believe this won't trip the issue I'm seeing, because the error is coming
from the password check, which is not occurring from the command line. The SSL
test cases do cause a password check, and that's where the error is generated
when comparing the digests.
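
For reference, the same failure can be reproduced outside the test framework by loading
any JKS keystore with a password other than the one it was created with; a minimal
sketch (the keystore path is a placeholder):

import java.io.FileInputStream;
import java.security.KeyStore;

public class WrongPasswordDemo {
    public static void main(String[] args) throws Exception {
        KeyStore ks = KeyStore.getInstance("JKS");
        // Placeholder path: any existing JKS keystore will do.
        try (FileInputStream in = new FileInputStream("/path/to/test.keystore")) {
            // A wrong password makes JavaKeyStore.engineLoad's digest comparison
            // fail with "Keystore was tampered with, or password was incorrect".
            ks.load(in, "not-the-real-password".toCharArray());
        }
    }
}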





Joel Bernstein
http://joelsolr.blogspot.com/

On Wed, Apr 4, 2018 at 5:26 PM, Martin Gainty  wrote:

> the other way to determine your default provider is to dump java.security:
>
> cat $JAVA_HOME/jre/lib/security/java.security
>
>
> hoss suggests you determine the default provider and default protocol with
> this simple test:
>
>  public static void main(String[] args) throws Exception {
>      System.out.println(javax.net.ssl.SSLContext.getDefault().getProvider());
>      System.out.println(javax.net.ssl.SSLContext.getDefault().getProtocol());
>  }
>
> what do you see for either test?
>
>
> Martin
> __
>
>
>
>
> --
> *From:* Joel Bernstein 
> *Sent:* Wednesday, April 4, 2018 3:35 PM
> *To:* lucene dev
> *Subject:* Re: TestSSLRandomization is failing everytime
>
> The code below ran fine from the command line and from a basic test case:
> System.out.println(javax.net.ssl.SSLContext.getDefault().getProvider());
> System.out.println(javax.net.ssl.SSLContext.getDefault().getProtocol());
>
> The source code that throws the exception in JavaKeyStore.engineLoad is:
>
> if (password != null) {
>     byte computed[], actual[];
>     computed = md.digest();
>     actual = new byte[computed.length];
>     dis.readFully(actual);
>     for (int i = 0; i < computed.length; i++) {
>         if (computed[i] != actual[i]) {
>             Throwable t = new UnrecoverableKeyException
>                 ("Password verification failed");
>             throw (IOException)new IOException
>                 ("Keystore was tampered with, or "
>                  + "password was incorrect").initCause(t);
>         }
>     }
> }
>
>
> Notice that it's simply comparing the bytes from two digests.
>
> The digests are prepared using a SHA digest; notice that it just specifies
> SHA, which must choose the default SHA digest for the system it's on. If
> it chooses a different SHA digest, the password would not match. My best bet
> right now is that I've changed my default SHA digest to be something
> other than what was used to create the passwords for the test framework:
>
>
> /**
>  * To guard against tampering with the keystore, we append a keyed
>  * hash with a bit of whitener.
>  */
> private MessageDigest getPreKeyedHash(char[] password)
>     throws NoSuchAlgorithmException, UnsupportedEncodingException
> {
>     int i, j;
>
>     MessageDigest md = MessageDigest.getInstance("SHA");
>     byte[] passwdBytes = new byte[password.length * 2];
>     for (i=0, j=0; i < password.length; i++) {
>         passwdBytes[j++] = (byte)(password[i] >> 8);
>         passwdBytes[j++] = (byte)password[i];
>     }
>     md.update(passwdBytes);
>     for (i=0; i < passwdBytes.length; i++)
>         passwdBytes[i] = 0;
>     md.update("Mighty Aphrodite".getBytes("UTF8"));
>     return md;
> }
>
>
>
>
>
>
> Joel Bernstein
> http://joelsolr.blogspot.com/
>
> On Wed, Apr 4, 2018 at 2:11 PM, Joel Bernstein  wrote:
>
> Thanks Hoss, I will give this a try.
>
> Joel Bernstein
> http://joelsolr.blogspot.com/
>
> On Wed, Apr 4, 2018 at 1:42 PM, Chris Hostetter 
> wrote:
>
>
> : I suspect I hosed something to do with my root certs on my local machine.
> : Fairly recently I was playing around with these certs while doing some
> SSL
> : work for Alfresco. This should be fun to fix...
>
> if that's your suspicion, i would start by testing out a simple java app
> that does nothing but...
>
> public static void main(String[] args) throws Exception {
> System.out.println(javax.net.ssl.SSLContext.getDefault().getProvider());
> System.out.println(javax.net.ssl.SSLContext.getDefault().getProtocol());
> }
>
> ...if that fails on the commandline, then you definitely hosed your
> machine.
>
> If that *does* work on the commandline, then 

[jira] [Comment Edited] (LUCENE-7935) Release .sha512 hash files with our artifacts

2018-04-04 Thread JIRA

[ 
https://issues.apache.org/jira/browse/LUCENE-7935?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16426248#comment-16426248
 ] 

Jan Høydahl edited comment on LUCENE-7935 at 4/4/18 10:17 PM:
--

Ok, that was what I suspected, as I could not find a way to generate any hashes 
in the maven parts of the build.

Uploaded a patch [^LUCENE-7935_smokefail.patch] which reverts the changes to 
smokeTester's {{verifyMavenDigests(artifacts)}} method, i.e. it will look for 
md5 and sha1 there. Looks good so far, running a full local smoketest and will 
commit if it succeeds.
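
For reference, producing the content of a {{.sha512}} sidecar file only needs the 
standard MessageDigest API. The sketch below is not the project's actual Ant/smoke-tester 
code; the artifact name is a placeholder and the on-disk format (hex digest, two spaces, 
file name) is an assumption here.

import java.nio.file.Files;
import java.nio.file.Paths;
import java.security.MessageDigest;

public class Sha512Sidecar {
    public static void main(String[] args) throws Exception {
        String artifact = "lucene-7.4.0-src.tgz"; // placeholder; the build loops over all release files
        byte[] digest = MessageDigest.getInstance("SHA-512")
                .digest(Files.readAllBytes(Paths.get(artifact)));
        StringBuilder hex = new StringBuilder();
        for (byte b : digest) {
            hex.append(String.format("%02x", b));
        }
        System.out.println(hex + "  " + artifact); // assumed sidecar format
    }
}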


was (Author: janhoy):
Ok, that was what I suspected, as I could not find a way to generate any hashes 
in the maven parts of the build.

Uploaded a patch LUCENE-7935_smokefail.patch which reverts the changes to 
smokeTester's {{verifyMavenDigests(artifacts)}} method, i.e. it will look for 
md5 and sha1 there. Looks good so far, running a full local smoketest and will 
commit if it succeeds.

> Release .sha512 hash files with our artifacts
> -
>
> Key: LUCENE-7935
> URL: https://issues.apache.org/jira/browse/LUCENE-7935
> Project: Lucene - Core
>  Issue Type: Improvement
>  Components: general/build
>Reporter: Jan Høydahl
>Assignee: Jan Høydahl
>Priority: Major
> Fix For: 7.4, master (8.0)
>
> Attachments: LUCENE-7935.patch, LUCENE-7935.patch, 
> LUCENE-7935_smokefail.patch
>
>
> Currently we are only required to release {{.md5}} hashes with our artifacts, 
> and we also include {{.sha1}} files. It is expected that {{.sha512}} will be 
> required in the future (see 
> https://www.apache.org/dev/release-signing.html#sha1), so why not start 
> generating them right away?



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (LUCENE-7935) Release .sha512 hash files with our artifacts

2018-04-04 Thread JIRA

[ 
https://issues.apache.org/jira/browse/LUCENE-7935?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16426248#comment-16426248
 ] 

Jan Høydahl commented on LUCENE-7935:
-

Ok, that was what I suspected, as I could not find a way to generate any hashes 
in the maven parts of the build.

Uploaded a patch LUCENE-7935_smokefail.patch which reverts the changes to 
smokeTester's {{verifyMavenDigests(artifacts)}} method, i.e. it will look for 
md5 and sha1 there. Looks good so far, running a full local smoketest and will 
commit if it succeeds.

> Release .sha512 hash files with our artifacts
> -
>
> Key: LUCENE-7935
> URL: https://issues.apache.org/jira/browse/LUCENE-7935
> Project: Lucene - Core
>  Issue Type: Improvement
>  Components: general/build
>Reporter: Jan Høydahl
>Assignee: Jan Høydahl
>Priority: Major
> Fix For: 7.4, master (8.0)
>
> Attachments: LUCENE-7935.patch, LUCENE-7935.patch, 
> LUCENE-7935_smokefail.patch
>
>
> Currently we are only required to release {{.md5}} hashes with our artifacts, 
> and we also include {{.sha1}} files. It is expected that {{.sha512}} will be 
> required in the future (see 
> https://www.apache.org/dev/release-signing.html#sha1), so why not start 
> generating them right away?



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[JENKINS] Lucene-Solr-Tests-master - Build # 2471 - Unstable

2018-04-04 Thread Apache Jenkins Server
Build: https://builds.apache.org/job/Lucene-Solr-Tests-master/2471/

1 tests failed.
FAILED:  org.apache.solr.handler.admin.AutoscalingHistoryHandlerTest.testHistory

Error Message:
expected:<8> but was:<10>

Stack Trace:
java.lang.AssertionError: expected:<8> but was:<10>
at 
__randomizedtesting.SeedInfo.seed([9929ADB97916BF7C:F4D50944C35E407B]:0)
at org.junit.Assert.fail(Assert.java:93)
at org.junit.Assert.failNotEquals(Assert.java:647)
at org.junit.Assert.assertEquals(Assert.java:128)
at org.junit.Assert.assertEquals(Assert.java:472)
at org.junit.Assert.assertEquals(Assert.java:456)
at 
org.apache.solr.handler.admin.AutoscalingHistoryHandlerTest.testHistory(AutoscalingHistoryHandlerTest.java:355)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1737)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:934)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:970)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:984)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:943)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:829)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:879)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:890)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at java.lang.Thread.run(Thread.java:748)




Build Log:
[...truncated 1897 lines...]
   [junit4] JVM J0: stdout was not empty, see: 

[jira] [Updated] (LUCENE-7935) Release .sha512 hash files with our artifacts

2018-04-04 Thread JIRA

 [ 
https://issues.apache.org/jira/browse/LUCENE-7935?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jan Høydahl updated LUCENE-7935:

Attachment: LUCENE-7935_smokefail.patch

> Release .sha512 hash files with our artifacts
> -
>
> Key: LUCENE-7935
> URL: https://issues.apache.org/jira/browse/LUCENE-7935
> Project: Lucene - Core
>  Issue Type: Improvement
>  Components: general/build
>Reporter: Jan Høydahl
>Assignee: Jan Høydahl
>Priority: Major
> Fix For: 7.4, master (8.0)
>
> Attachments: LUCENE-7935.patch, LUCENE-7935.patch, 
> LUCENE-7935_smokefail.patch
>
>
> Currently we are only required to release {{.md5}} hashes with our artifacts, 
> and we also include {{.sha1}} files. It is expected that {{.sha512}} will be 
> required in the future (see 
> https://www.apache.org/dev/release-signing.html#sha1), so why not start 
> generating them right away?



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (LUCENE-7935) Release .sha512 hash files with our artifacts

2018-04-04 Thread Steve Rowe (JIRA)

[ 
https://issues.apache.org/jira/browse/LUCENE-7935?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16426238#comment-16426238
 ] 

Steve Rowe commented on LUCENE-7935:


{quote}I suppose we want to discontinue md5 also for all maven artifacts, or is 
it required?
{quote}
This page, which lists requirements for Maven Central hosting, doesn't mention 
md5 or any other hash; only PGP signatures are mentioned: 
[http://central.sonatype.org/pages/requirements.html]

The Maven distribution section on the ASF page on publishing releases 
([http://www.apache.org/dev/release-publishing.html#distribution_maven]) says:
{quote}don't try to publish .sha256, .sha512 files yet; Nexus doesn't handle 
them (as of March 2018)
{quote}
The ASF page on publishing Maven artifacts 
([http://www.apache.org/dev/publishing-maven-artifacts.html]) says we don't 
need to provide MD5 or SHA1 files:
{quote}Nexus will create MD5 and SHA1 checksums on the fly
{quote}

> Release .sha512 hash files with our artifacts
> -
>
> Key: LUCENE-7935
> URL: https://issues.apache.org/jira/browse/LUCENE-7935
> Project: Lucene - Core
>  Issue Type: Improvement
>  Components: general/build
>Reporter: Jan Høydahl
>Assignee: Jan Høydahl
>Priority: Major
> Fix For: 7.4, master (8.0)
>
> Attachments: LUCENE-7935.patch, LUCENE-7935.patch
>
>
> Currently we are only required to release {{.md5}} hashes with our artifacts, 
> and we also include {{.sha1}} files. It is expected that {{.sha512}} will be 
> required in the future (see 
> https://www.apache.org/dev/release-signing.html#sha1), so why not start 
> generating them right away?



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (LUCENE-7960) NGram filters -- add option to keep short terms

2018-04-04 Thread Ingomar Wesp (JIRA)

[ 
https://issues.apache.org/jira/browse/LUCENE-7960?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16426221#comment-16426221
 ] 

Ingomar Wesp commented on LUCENE-7960:
--

Ok, I just added the same parameters to the NGramTokenFilter and updated the 
pull request. In the long term, it probably makes sense to move all the logic 
into the NGramTokenFilter and turn EdgeNGramTokenFilter into a simple wrapper. 
EdgeNGramTokenizer is already implemented this way.

I presume it also makes sense to extend NGramTokenizer and EdgeNGramTokenizer 
accordingly?
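
To make the behaviour under discussion concrete, here is a minimal sketch against the 
existing 7.x API (the proposed option for keeping short terms is deliberately absent, 
since it does not exist yet); terms shorter than minGramSize simply emit no grams:

import java.io.StringReader;

import org.apache.lucene.analysis.TokenStream;
import org.apache.lucene.analysis.core.WhitespaceTokenizer;
import org.apache.lucene.analysis.ngram.NGramTokenFilter;
import org.apache.lucene.analysis.tokenattributes.CharTermAttribute;

public class ShortTermDrop {
    public static void main(String[] args) throws Exception {
        WhitespaceTokenizer tokenizer = new WhitespaceTokenizer();
        tokenizer.setReader(new StringReader("to be or not to be searching"));

        // With minGram = 3, "to", "be" and "or" are shorter than minGram and disappear entirely.
        TokenStream grams = new NGramTokenFilter(tokenizer, 3, 5);
        CharTermAttribute term = grams.addAttribute(CharTermAttribute.class);
        grams.reset();
        while (grams.incrementToken()) {
            System.out.println(term.toString());
        }
        grams.end();
        grams.close();
    }
}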

> NGram filters -- add option to keep short terms
> ---
>
> Key: LUCENE-7960
> URL: https://issues.apache.org/jira/browse/LUCENE-7960
> Project: Lucene - Core
>  Issue Type: Improvement
>  Components: modules/analysis
>Reporter: Shawn Heisey
>Priority: Major
>  Time Spent: 10m
>  Remaining Estimate: 0h
>
> When ngram or edgengram filters are used, any terms that are shorter than the 
> minGramSize are completely removed from the token stream.
> This is probably 100% what was intended, but I've seen it cause a lot of 
> problems for users.  I am not suggesting that the default behavior be 
> changed.  That would be far too disruptive to the existing user base.
> I do think there should be a new boolean option, with a name like 
> keepShortTerms, that defaults to false, to allow the short terms to be 
> preserved.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



Re: TestSSLRandomization is failing everytime

2018-04-04 Thread Martin Gainty
the other way to determine your default provider is to dump java.security:

cat $JAVA_HOME/jre/lib/security/java.security


hoss suggests you determine the default provider and default protocol with this 
simple test:

 public static void main(String[] args) throws Exception {

System.out.println(javax.net.ssl.SSLContext.getDefault().getProvider());

System.out.println(javax.net.ssl.SSLContext.getDefault().getProtocol());
}


what do you see for either test?


Martin
__




From: Joel Bernstein 
Sent: Wednesday, April 4, 2018 3:35 PM
To: lucene dev
Subject: Re: TestSSLRandomization is failing everytime

The code below ran fine from the command line and from a basic test case:
System.out.println(javax.net.ssl.SSLContext.getDefault().getProvider());
System.out.println(javax.net.ssl.SSLContext.getDefault().getProtocol());

The source code that throws the exception in JavaKeyStore.engineLoad is:


if (password != null) {
byte computed[], actual[];
computed = md.digest();
actual = new byte[computed.length];
dis.readFully(actual);
for (int i = 0; i < computed.length; i++) {
if (computed[i] != actual[i]) {
Throwable t = new UnrecoverableKeyException
("Password verification failed");
throw (IOException)new IOException
("Keystore was tampered with, or "
+ "password was incorrect").initCause(t);
}
}
}

Notice that it's simply comparing the bytes from two digests.

The digests are prepared using a SHA digest, notice that it just specifies SHA, 
which must choose the default SHA digest for the system it's on. If
it chooses a different SHA digest, the password would not match. My best bet 
right now is that I've changed my default SHA digest to be something
other than what was used to create the passwords for the test framework:



/**
 * To guard against tampering with the keystore, we append a keyed
 * hash with a bit of whitener.
 */
private MessageDigest getPreKeyedHash(char[] password)
throws NoSuchAlgorithmException, UnsupportedEncodingException
{
int i, j;

MessageDigest md = MessageDigest.getInstance("SHA");
byte[] passwdBytes = new byte[password.length * 2];
for (i=0, j=0; i < password.length; i++) {
    passwdBytes[j++] = (byte)(password[i] >> 8);
    passwdBytes[j++] = (byte)password[i];
}
md.update(passwdBytes);
for (i=0; i < passwdBytes.length; i++)
    passwdBytes[i] = 0;
md.update("Mighty Aphrodite".getBytes("UTF8"));
return md;
}

On Wed, Apr 4, 2018 at 2:11 PM, Joel Bernstein  wrote:
Thanks Hoss, I will give this a try.

Joel Bernstein
http://joelsolr.blogspot.com/

On Wed, Apr 4, 2018 at 1:42 PM, Chris Hostetter 
> wrote:

: I suspect I hosed something to do with my root certs on my local machine.
: Fairly recently I was playing around with these certs while doing some SSL
: work for Alfresco. This should be fun to fix...

if that's your suspicion, i would start by testing out a simple java app
that does nothing but...

public static void main(String[] args) throws Exception {

System.out.println(javax.net.ssl.SSLContext.getDefault().getProvider());

System.out.println(javax.net.ssl.SSLContext.getDefault().getProtocol());
}

...if that fails on the commandline, then you definitely hosed your
machine.

If that *does* work on the commandline, then try the same code in a
trivial Junit test that just subclasses LuceneTestCase -- NOT
SolrTestCaseJ4 -- to see if the problem is somewhere in our ant/lucene
build setup, independent of any Solr SSL randomization.

:
: Joel Bernstein
: http://joelsolr.blogspot.com/
:
: On Wed, Apr 4, 2018 at 12:29 PM, Joel Bernstein 
> wrote:
:
: > Ok, so it does sounds like a local problem then. Nothing much has changed
: > locally. I'm still using the same Mac and Java version:
: >
: > defaultuildsMBP:clone2 joelbernstein$ java -version
: >
: > java version "1.8.0_40"
: >
: > Java(TM) SE Runtime Environment (build 1.8.0_40-b27)
: >
: > Java HotSpot(TM) 64-Bit Server VM (build 25.40-b25, mixed mode)
: >
: > 

[ANNOUNCE] Apache Solr 7.3.0 released

2018-04-04 Thread Alan Woodward
4th April 2018, Apache Solr™ 7.3.0 available

The Lucene PMC is pleased to announce the release of Apache Solr 7.3.0

Solr is the popular, blazing fast, open source NoSQL search platform from
the Apache Lucene project. Its major features include powerful full-text
search, hit highlighting, faceted search and analytics, rich document
parsing, geospatial search, extensive REST APIs as well as parallel SQL.
Solr is enterprise grade, secure and highly scalable, providing fault
tolerant distributed search and indexing, and powers the search and
navigation features of many of the world's largest internet sites.

This release includes the following changes since the 7.2.0 release:

- A new update request processor supports OpenNLP-based entity extraction
and language detection
- Support for automatic time-based collection creation
- Multivalued primitive fields can now be used in sorting
- A new SortableTextField allows both indexing and sorting/faceting on free
text
- Several new stream evaluators
- Improvements around leader-initiated recovery
- New autoscaling features: triggers can perform operations based on any
metric available from the Metrics API, based on a defined schedule, or in
response to a query rate over a 1-minute average. A new screen in the Admin
UI will show suggested autoscaling actions.
- Metrics can now be exported to Prometheus
- {!parent} and {!child} support filtering with exclusions via new local
parameters
- Introducing {!filters} query parser for referencing filter queries and
excluding them
- Support for running Solr with Java 10
- A new contrib/ltr NeuralNetworkModel class

Furthermore, this release includes Apache Lucene 7.3.0 which includes
several changes since the 7.2.0 release

The release is available for immediate download at:

http://www.apache.org/dyn/closer.lua/lucene/solr/7.3.0

Please read CHANGES.txt for a detailed list of changes:

https://lucene.apache.org/solr/7_3_0/changes/Changes.html

Please report any feedback to the mailing lists
(http://lucene.apache.org/solr/discussion.html)

Note: The Apache Software Foundation uses an extensive mirroring network
for distributing releases. It is possible that the mirror you are using may
not have replicated the release yet. If that is the case, please try
another mirror. This also goes for Maven access.


[ANNOUNCE] Apache Lucene 7.3.0 released

2018-04-04 Thread Alan Woodward
4 April 2018, Apache Lucene™ 7.3.0 available

The Lucene PMC is pleased to announce the release of Apache Lucene 7.3.0

Apache Lucene is a high-performance, full-featured text search engine
library
 written entirely in Java. It is a technology suitable for nearly any
application
 that requires full-text search, especially cross-platform.

This release contains numerous bug fixes, optimizations, and improvements,
some
 of which are highlighted below. The release is available for immediate
download at:

  

Lucene 7.3.0 changes include:
- Performance improvements when running on Java 9 or later
- A new OpenNLP analysis module allows tokenization, part-of-speech
tagging, lemmatization and phrase chunking using OpenNLP tools.
- Shapes may now be indexed using Google S2 geometry
- IndexWriter can opt out of flushing on indexing threads
- Better relevancy for highlight passages containing phrases
- Confirmed support for Java 10

Please read CHANGES.txt for a full list of changes:

  

Please report any feedback to the mailing lists
(http://lucene.apache.org/core/discussion.html)

Note: The Apache Software Foundation uses an extensive mirroring network
for distributing releases. It is possible that the mirror you are using may
not have replicated the release yet. If that is the case, please try
another mirror. This also applies to Maven access.


[jira] [Commented] (SOLR-9241) Rebalance API for SolrCloud

2018-04-04 Thread Nitin Sharma (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-9241?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16426189#comment-16426189
 ] 

Nitin Sharma commented on SOLR-9241:


Sorry, I haven't kept this patch up to date with the latest Solr releases; the last was 6.1. 
 [~noble.paul] There is a basic version of resharding in this patch, but it has 
not been ported yet. It reduces/increases the number of shards based on the 
following commands. 

 

Auto Shard - Dynamically shard a collection to any size.
Smart Merge - Distributed Mode - Helps merge data from a larger shard setup 
into a smaller one (the source should be divisible by the destination).

> Rebalance API for SolrCloud
> ---
>
> Key: SOLR-9241
> URL: https://issues.apache.org/jira/browse/SOLR-9241
> Project: Solr
>  Issue Type: New Feature
>  Components: SolrCloud
>Affects Versions: 6.1
> Environment: Ubuntu, Mac OsX
>Reporter: Nitin Sharma
>Priority: Major
>  Labels: Cluster, SolrCloud
> Fix For: 6.1
>
> Attachments: Redistribute_After.jpeg, Redistribute_Before.jpeg, 
> Redistribute_call.jpeg, Replace_After.jpeg, Replace_Before.jpeg, 
> Replace_Call.jpeg, SOLR-9241-4.6.patch, SOLR-9241-6.1.patch
>
>   Original Estimate: 2,016h
>  Remaining Estimate: 2,016h
>
> This is the v1 of the patch for Solrcloud Rebalance api (as described in 
> http://engineering.bloomreach.com/solrcloud-rebalance-api/) , built at 
> Bloomreach by Nitin Sharma and Suruchi Shah. The goal of the API  is to 
> provide a zero downtime mechanism to perform data manipulation and  efficient 
> core allocation in solrcloud. This API was envisioned to be the base layer 
> that enables Solrcloud to be an auto scaling platform. (and work in unison 
> with other complementing monitoring and scaling features).
> Patch Status:
> ===
> The patch is work in progress and incremental. We have done a few rounds of 
> code clean up. We wanted to get the patch going first to get initial feed 
> back.  We will continue to work on making it more open source friendly and 
> easily testable.
>  Deployment Status:
> 
> The platform is deployed in production at bloomreach and has been battle 
> tested for large scale load. (millions of documents and hundreds of 
> collections).
>  Internals:
> =
> The internals of the API and performance : 
> http://engineering.bloomreach.com/solrcloud-rebalance-api/
> It is built on top of the admin collections API as an action (with various 
> flavors). At a high level, the rebalance api provides 2 constructs:
> Scaling Strategy:  Decides how to move the data.  Every flavor has multiple 
> options which can be reviewed in the api spec.
> Re-distribute  - Move around data in the cluster based on capacity/allocation.
> Auto Shard  - Dynamically shard a collection to any size.
> Smart Merge - Distributed Mode - Helps merging data from a larger shard setup 
> into smaller one.  (the source should be divisible by destination)
> Scale up -  Add replicas on the fly
> Scale Down - Remove replicas on the fly
> Allocation Strategy:  Decides where to put the data.  (Nodes with least 
> cores, Nodes that do not have this collection etc). Custom implementations 
> can be built on top as well. One other example is Availability Zone aware. 
> Distribute data such that every replica is placed on different availability 
> zone to support HA.
>  Detailed API Spec:
> 
>   https://github.com/bloomreach/solrcloud-rebalance-api
>  Contributors:
> =
>   Nitin Sharma
>   Suruchi Shah
>  Questions/Comments:
> =
>   You can reach me at nitin...@gmail.com



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-12155) Solr 7.2.1 deadlock in UnInvertedField.getUnInvertedField()

2018-04-04 Thread Mikhail Khludnev (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-12155?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16426182#comment-16426182
 ] 

Mikhail Khludnev commented on SOLR-12155:
-

Attaching a sort of solution [^SOLR-12155.patch]. At least it unlocks the 
concurrently loading thread, at the price of repeating hopeless initialisation 
attempts or causing an NPE in calling code. After test injection is switched on, 
UIF is successfully initialised with a single instance. Are there any vetoes 
or votes?

> Solr 7.2.1 deadlock in UnInvertedField.getUnInvertedField() 
> 
>
> Key: SOLR-12155
> URL: https://issues.apache.org/jira/browse/SOLR-12155
> Project: Solr
>  Issue Type: Bug
>  Security Level: Public(Default Security Level. Issues are Public) 
>Affects Versions: 7.2.1
>Reporter: Kishor gandham
>Priority: Major
> Attachments: SOLR-12155.patch, SOLR-12155.patch, stack.txt
>
>
> I am attaching a stack trace from our production Solr (7.2.1). Occasionally, 
> we are seeing SOLR becoming unresponsive. We are then forced to kill the JVM 
> and start solr again.
> We have a lot of facet queries and our index has approximately 15 million 
> documents. We have recently started using json.facet queries and some of the 
> facet fields use DocValues.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Assigned] (LUCENE-8238) WordDelimiterFilter javadocs reference nonexistent parameters

2018-04-04 Thread Mike Sokolov (JIRA)

 [ 
https://issues.apache.org/jira/browse/LUCENE-8238?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Mike Sokolov reassigned LUCENE-8238:


Assignee: (was: Mike Sokolov)

> WordDelimiterFilter javadocs reference nonexistent parameters
> -
>
> Key: LUCENE-8238
> URL: https://issues.apache.org/jira/browse/LUCENE-8238
> Project: Lucene - Core
>  Issue Type: Bug
>  Components: modules/analysis
>Reporter: Mike Sokolov
>Priority: Minor
> Fix For: trunk
>
> Attachments: WDGF.patch
>
>
> The javadocs for both WDF and WDGF include a pretty detailed discussion about 
> the proper use of the "combinations" parameter, but no such parameter exists. 
> I don't know the history here, but it sounds as if the docs might be 
> referring to some previous incarnation of this filter, perhaps in the context 
> of some (now-defunct) Solr configuration.
> The docs should be updated to reference the actual option names that are 
> provided by the class today.
>  
> I've attached a patch



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Created] (LUCENE-8238) WordDelimiterFilter javadocs reference nonexistent parameters

2018-04-04 Thread Mike Sokolov (JIRA)
Mike Sokolov created LUCENE-8238:


 Summary: WordDelimiterFilter javadocs reference nonexistent 
parameters
 Key: LUCENE-8238
 URL: https://issues.apache.org/jira/browse/LUCENE-8238
 Project: Lucene - Core
  Issue Type: Bug
  Components: modules/analysis
Reporter: Mike Sokolov
Assignee: Mike Sokolov
 Fix For: trunk
 Attachments: WDGF.patch

The javadocs for both WDF and WDGF include a pretty detailed discussion about 
the proper use of the "combinations" parameter, but no such parameter exists. I 
don't know the history here, but it sounds as if the docs might be referring to 
some previous incarnation of this filter, perhaps in the context of some 
(now-defunct) Solr configuration.

The docs should be updated to reference the actual option names that are 
provided by the class today.
 
I've attached a patch



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Updated] (SOLR-12155) Solr 7.2.1 deadlock in UnInvertedField.getUnInvertedField()

2018-04-04 Thread Mikhail Khludnev (JIRA)

 [ 
https://issues.apache.org/jira/browse/SOLR-12155?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Mikhail Khludnev updated SOLR-12155:

Attachment: SOLR-12155.patch

> Solr 7.2.1 deadlock in UnInvertedField.getUnInvertedField() 
> 
>
> Key: SOLR-12155
> URL: https://issues.apache.org/jira/browse/SOLR-12155
> Project: Solr
>  Issue Type: Bug
>  Security Level: Public(Default Security Level. Issues are Public) 
>Affects Versions: 7.2.1
>Reporter: Kishor gandham
>Priority: Major
> Attachments: SOLR-12155.patch, SOLR-12155.patch, stack.txt
>
>
> I am attaching a stack trace from our production Solr (7.2.1). Occasionally, 
> we are seeing SOLR becoming unresponsive. We are then forced to kill the JVM 
> and start solr again.
> We have a lot of facet queries and our index has approximately 15 million 
> documents. We have recently started using json.facet queries and some of the 
> facet fields use DocValues.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-12185) Can't change Single Valued field to Multi Valued even by deleting/readding

2018-04-04 Thread Shawn Heisey (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-12185?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16426167#comment-16426167
 ] 

Shawn Heisey commented on SOLR-12185:
-

I've attempted to ask on the #lucene IRC channel whether this is something they 
could fix.  I didn't get an answer.  You could try and get it fixed on the 
Lucene side, but I think there's a good chance that they'll say it *can't* be 
fixed, because you already have data in the index with one docValues 
designation.  I bet they'll say that you can't add more data with a different 
designation until that previous data is entirely gone.  Not just deleted -- but 
GONE.  Deleted docs are only *marked* as deleted.
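
The restriction described above is enforced inside Lucene itself. This minimal sketch 
(plain Lucene, not Solr's update path) triggers the same exception by writing SORTED 
and then SORTED_SET doc values under one field name:

import org.apache.lucene.analysis.core.WhitespaceAnalyzer;
import org.apache.lucene.document.Document;
import org.apache.lucene.document.SortedDocValuesField;
import org.apache.lucene.document.SortedSetDocValuesField;
import org.apache.lucene.index.IndexWriter;
import org.apache.lucene.index.IndexWriterConfig;
import org.apache.lucene.store.RAMDirectory;
import org.apache.lucene.util.BytesRef;

public class DocValuesTypeConflict {
    public static void main(String[] args) throws Exception {
        try (RAMDirectory dir = new RAMDirectory();
             IndexWriter writer = new IndexWriter(dir, new IndexWriterConfig(new WhitespaceAnalyzer()))) {

            // First document: single-valued doc values (SORTED) for "example_field".
            Document d1 = new Document();
            d1.add(new SortedDocValuesField("example_field", new BytesRef("a")));
            writer.addDocument(d1);

            // Second document: multi-valued doc values (SORTED_SET) for the same field name.
            Document d2 = new Document();
            d2.add(new SortedSetDocValuesField("example_field", new BytesRef("b")));
            try {
                writer.addDocument(d2);
            } catch (IllegalArgumentException e) {
                // "cannot change DocValues type from SORTED to SORTED_SET for field ..."
                System.out.println(e.getMessage());
            }
        }
    }
}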

> Can't change Single Valued field to Multi Valued even by deleting/readding
> --
>
> Key: SOLR-12185
> URL: https://issues.apache.org/jira/browse/SOLR-12185
> Project: Solr
>  Issue Type: Bug
>  Security Level: Public(Default Security Level. Issues are Public) 
>  Components: Schema and Analysis
>Affects Versions: 7.1
>Reporter: Cetra Free
>Priority: Major
>
> Changing a single-valued field to multi-valued field with doc values breaks 
> things.  This doesn't matter if you change the field or do a complete delete 
> and re-add of the field.  The only way I have found to "fix" this is to 
> delete the entire core from disk and re-add it.
> h2. Steps to replicate:
>  * Create a field, make it single valued with doc values
>  * Index a couple of docs
>  * Delete the field
>  * Add the field again with the same name, but change it to multiValued
>  * Try indexing a couple of docs
> h2. Expected result:
> The documents are indexed correctly and there are no issues
> h2. Actual outcome:
> The documents refuse to be indexed and you see this in the logs:
> {code:java}
> org.apache.solr.common.SolrException: Exception writing document id 
> 6a3226c8-c904-40d7-aecb-76c3515db7b8 to the index; possible analysis error: 
> cannot change DocValues type from SORTED to SORTED_SET for field 
> "example_field"
>     at 
> org.apache.solr.update.DirectUpdateHandler2.addDoc(DirectUpdateHandler2.java:221)
>     at 
> org.apache.solr.update.processor.RunUpdateProcessor.processAdd(RunUpdateProcessorFactory.java:67)
>     at 
> org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:55)
>     at 
> org.apache.solr.update.processor.DistributedUpdateProcessor.doLocalAdd(DistributedUpdateProcessor.java:991)
>     at 
> org.apache.solr.update.processor.DistributedUpdateProcessor.versionAdd(DistributedUpdateProcessor.java:1207)
>     at 
> org.apache.solr.update.processor.DistributedUpdateProcessor.processAdd(DistributedUpdateProcessor.java:753)
>     at 
> org.apache.solr.update.processor.LogUpdateProcessorFactory$LogUpdateProcessor.processAdd(LogUpdateProcessorFactory.java:103)
>     at 
> org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:55)
>     at 
> org.apache.solr.update.processor.AddSchemaFieldsUpdateProcessorFactory$AddSchemaFieldsUpdateProcessor.processAdd(AddSchemaFieldsUpdateProcessorFactory.java:474)
>     at 
> org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:55)
>     at 
> org.apache.solr.update.processor.FieldMutatingUpdateProcessor.processAdd(FieldMutatingUpdateProcessor.java:118)
>     at 
> org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:55)
>     at 
> org.apache.solr.update.processor.FieldMutatingUpdateProcessor.processAdd(FieldMutatingUpdateProcessor.java:118)
>     at 
> org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:55)
>     at 
> org.apache.solr.update.processor.FieldMutatingUpdateProcessor.processAdd(FieldMutatingUpdateProcessor.java:118)
>     at 
> org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:55)
>     at 
> org.apache.solr.update.processor.FieldMutatingUpdateProcessor.processAdd(FieldMutatingUpdateProcessor.java:118)
>     at 
> org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:55)
>     at 
> org.apache.solr.update.processor.FieldNameMutatingUpdateProcessorFactory$1.processAdd(FieldNameMutatingUpdateProcessorFactory.java:74)
>     at 
> org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:55)
>     at 
> org.apache.solr.update.processor.FieldMutatingUpdateProcessor.processAdd(FieldMutatingUpdateProcessor.java:118)
>     at 
> org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:55)
>     at 
> 

[JENKINS] Lucene-Solr-NightlyTests-master - Build # 1521 - Unstable

2018-04-04 Thread Apache Jenkins Server
Build: https://builds.apache.org/job/Lucene-Solr-NightlyTests-master/1521/

1 tests failed.
FAILED:  
org.apache.lucene.index.TestSoftDeletesRetentionMergePolicy.testKeepAllDocsAcrossMerges

Error Message:
expected:<3> but was:<2>

Stack Trace:
java.lang.AssertionError: expected:<3> but was:<2>
at 
__randomizedtesting.SeedInfo.seed([3EE6351609FCFBAA:EC4836981595B0DB]:0)
at org.junit.Assert.fail(Assert.java:93)
at org.junit.Assert.failNotEquals(Assert.java:647)
at org.junit.Assert.assertEquals(Assert.java:128)
at org.junit.Assert.assertEquals(Assert.java:472)
at org.junit.Assert.assertEquals(Assert.java:456)
at 
org.apache.lucene.index.TestSoftDeletesRetentionMergePolicy.testKeepAllDocsAcrossMerges(TestSoftDeletesRetentionMergePolicy.java:189)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1737)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:934)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:970)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:984)
at 
org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:943)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:829)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:879)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:890)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at java.lang.Thread.run(Thread.java:748)




Build Log:
[...truncated 405 lines...]
   [junit4] Suite: org.apache.lucene.index.TestSoftDeletesRetentionMergePolicy
   [junit4]   2> NOTE: download the large Jenkins line-docs file by running 
'ant get-jenkins-line-docs' in the lucene directory.
   [junit4]   2> NOTE: reproduce with: ant test  
-Dtestcase=TestSoftDeletesRetentionMergePolicy 
-Dtests.method=testKeepAllDocsAcrossMerges -Dtests.seed=3EE6351609FCFBAA 
-Dtests.multiplier=2 -Dtests.nightly=true 

[jira] [Commented] (SOLR-11982) Add support for indicating preferred replica types for queries

2018-04-04 Thread JIRA

[ 
https://issues.apache.org/jira/browse/SOLR-11982?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16426135#comment-16426135
 ] 

Tomás Fernández Löbbe commented on SOLR-11982:
--

[~emaijala] Patch looks really good (it seems to have an unwanted change in 
{{TimeOut.java}}, I'll just skip that). I'll run some tests locally but I think 
it's ready to commit regardless of the final naming decision. I can just change 
the param name to whatever we agree on.

Thanks for your input [~houstonputman], I like {{shards.routing}}. I'll rename 
to that if nobody disagrees.

> Add support for indicating preferred replica types for queries
> --
>
> Key: SOLR-11982
> URL: https://issues.apache.org/jira/browse/SOLR-11982
> Project: Solr
>  Issue Type: Improvement
>  Security Level: Public(Default Security Level. Issues are Public) 
>  Components: SolrCloud
>Affects Versions: 7.4, master (8.0)
>Reporter: Ere Maijala
>Priority: Minor
>  Labels: patch-available, patch-with-test
> Attachments: SOLR-11982-preferReplicaTypes.patch, 
> SOLR-11982-preferReplicaTypes.patch, SOLR-11982.patch, SOLR-11982.patch, 
> SOLR-11982.patch, SOLR-11982.patch, SOLR-11982.patch, SOLR-11982.patch, 
> SOLR-11982.patch, SOLR-11982.patch
>
>
> It would be nice to have the possibility to easily sort the shards in the 
> preferred order e.g. by replica type. The attached patch adds support for 
> {{shards.sort}} parameter that allows one to sort e.g. PULL and TLOG replicas 
> first with \{{shards.sort=replicaType:PULL|TLOG }}(which would mean that NRT 
> replicas wouldn't be hit with queries unless they're the only ones available) 
> and/or to sort by replica location (like preferLocalShards=true but more 
> versatile).
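
For illustration only, a SolrJ query that sets the parameter from the attached patch 
might look like the sketch below; the ZooKeeper address and collection name are 
placeholders, and the parameter name itself may still change per the naming discussion 
in the comments.

import org.apache.solr.client.solrj.SolrQuery;
import org.apache.solr.client.solrj.impl.CloudSolrClient;
import org.apache.solr.client.solrj.response.QueryResponse;

public class PreferPullReplicas {
    public static void main(String[] args) throws Exception {
        try (CloudSolrClient client = new CloudSolrClient.Builder()
                .withZkHost("localhost:9983")            // placeholder ZooKeeper address
                .build()) {
            client.setDefaultCollection("techproducts"); // placeholder collection

            SolrQuery q = new SolrQuery("*:*");
            // Prefer PULL, then TLOG replicas; NRT replicas are only queried if nothing else is up.
            q.set("shards.sort", "replicaType:PULL|TLOG");

            QueryResponse rsp = client.query(q);
            System.out.println("hits: " + rsp.getResults().getNumFound());
        }
    }
}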



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-12036) factor out DefaultStreamFactory class

2018-04-04 Thread Christine Poerschke (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-12036?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16426106#comment-16426106
 ] 

Christine Poerschke commented on SOLR-12036:


Attached updated patch, building upon [~joel.bernstein]'s SOLR-12174 
refactoring.

> factor out DefaultStreamFactory class
> -
>
> Key: SOLR-12036
> URL: https://issues.apache.org/jira/browse/SOLR-12036
> Project: Solr
>  Issue Type: Task
>  Components: streaming expressions
>Reporter: Christine Poerschke
>Priority: Minor
> Attachments: SOLR-12036.patch, SOLR-12036.patch
>
>
> Motivation for the proposed class is to reduce the need for 
> {{withFunctionName}} method calls in client code.
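
For context, this is the kind of registration boilerplate the proposed class would 
reduce; a rough sketch with placeholder collection and ZooKeeper names, registering 
just one of the many functions an expression might use:

import org.apache.solr.client.solrj.io.stream.CloudSolrStream;
import org.apache.solr.client.solrj.io.stream.TupleStream;
import org.apache.solr.client.solrj.io.stream.expr.StreamFactory;

public class StreamFactorySetup {
    public static void main(String[] args) throws Exception {
        // Every streaming function currently has to be registered by hand before parsing.
        StreamFactory factory = new StreamFactory()
                .withCollectionZkHost("techproducts", "localhost:9983") // placeholders
                .withFunctionName("search", CloudSolrStream.class);
        // ...repeat withFunctionName() for every other function the expressions need.

        TupleStream stream = factory.constructStream(
                "search(techproducts, q=\"*:*\", fl=\"id\", sort=\"id asc\")");
        System.out.println(stream.getClass().getSimpleName());
    }
}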



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Updated] (SOLR-12036) factor out DefaultStreamFactory class

2018-04-04 Thread Christine Poerschke (JIRA)

 [ 
https://issues.apache.org/jira/browse/SOLR-12036?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Christine Poerschke updated SOLR-12036:
---
Attachment: SOLR-12036.patch

> factor out DefaultStreamFactory class
> -
>
> Key: SOLR-12036
> URL: https://issues.apache.org/jira/browse/SOLR-12036
> Project: Solr
>  Issue Type: Task
>  Components: streaming expressions
>Reporter: Christine Poerschke
>Priority: Minor
> Attachments: SOLR-12036.patch, SOLR-12036.patch
>
>
> Motivation for the proposed class is to reduce the need for 
> {{withFunctionName}} method calls in client code.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[JENKINS-EA] Lucene-Solr-7.3-Linux (64bit/jdk-11-ea+5) - Build # 121 - Unstable!

2018-04-04 Thread Policeman Jenkins Server
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-7.3-Linux/121/
Java: 64bit/jdk-11-ea+5 -XX:-UseCompressedOops -XX:+UseParallelGC

1 tests failed.
FAILED:  
org.apache.solr.cloud.TestCloudRecovery.leaderRecoverFromLogOnStartupTest

Error Message:
expected:<4> but was:<2>

Stack Trace:
java.lang.AssertionError: expected:<4> but was:<2>
at 
__randomizedtesting.SeedInfo.seed([A07A1A56521AF59C:D48AFB13163EB213]:0)
at org.junit.Assert.fail(Assert.java:93)
at org.junit.Assert.failNotEquals(Assert.java:647)
at org.junit.Assert.assertEquals(Assert.java:128)
at org.junit.Assert.assertEquals(Assert.java:472)
at org.junit.Assert.assertEquals(Assert.java:456)
at 
org.apache.solr.cloud.TestCloudRecovery.leaderRecoverFromLogOnStartupTest(TestCloudRecovery.java:102)
at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:564)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1737)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:934)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:970)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:984)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:943)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:829)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:879)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:890)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at java.base/java.lang.Thread.run(Thread.java:841)




Build Log:
[...truncated 13197 lines...]
   [junit4] Suite: org.apache.solr.cloud.TestCloudRecovery
   [junit4]   2> 871188 

Re: TestSSLRandomization is failing everytime

2018-04-04 Thread Joel Bernstein
This looks like a bug in JavaKeyStore. The source code is here:

http://grepcode.com/file_/repository.grepcode.com/java/root/jdk/openjdk/8u40-b25/sun/security/provider/JavaKeyStore.java/?v=source

The getPreKeyedHash method is where MessageDigest.getInstance("SHA") is
called. From everything I've read this code is incorrect because SHA is not
a valid algorithm.
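
One way to check that on a given JVM is to list the MessageDigest algorithm names
the installed providers actually register, e.g.:

import java.security.Security;
import java.util.Set;

public class ShaNameCheck {
    public static void main(String[] args) {
        // Algorithm names registered for the MessageDigest service across all installed providers.
        Set<String> names = Security.getAlgorithms("MessageDigest");
        System.out.println("contains \"SHA\": " + names.contains("SHA"));
        System.out.println(names);
    }
}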

Joel Bernstein
http://joelsolr.blogspot.com/

On Wed, Apr 4, 2018 at 3:35 PM, Joel Bernstein  wrote:

> The code below ran fine from the command line and from a basic test case:
> System.out.println(javax.net.ssl.SSLContext.getDefault().getProvider());
> System.out.println(javax.net.ssl.SSLContext.getDefault().getProtocol());
>
> The source code that throws the exception in JavaKeyStore.engineLoad is:
>
> if (password != null) {
> byte computed[], actual[];
> computed = md.digest();
> actual = new byte[computed.length];
> dis.readFully(actual);
> for (int i = 0; i < computed.length; i++) {
> if (computed[i] != actual[i]) {
> Throwable t = new UnrecoverableKeyException
> ("Password verification failed");
> throw (IOException)new IOException
> ("Keystore was tampered with, or "
> + "password was incorrect").initCause(t);
> }
> }
> }
>
>
> Notice that it's simply comparing the bytes from two digests.
>
> The digests are prepared using a SHA digest, notice that it just specifies
> SHA, which must choose the default SHA digest for the system it's on. If
> it chooses a different SHA digest, the password would not match. My best bet
> right now is that I've changed my default SHA digest to be something
> other than what was used to create the passwords for the test framework:
>
>
> /**
>  * To guard against tampering with the keystore, we append a keyed
>  * hash with a bit of whitener.
>  */
> private MessageDigest getPreKeyedHash(char[] password)
> throws NoSuchAlgorithmException, UnsupportedEncodingException
> {
> int i, j;
>
> MessageDigest md = MessageDigest.getInstance("SHA");
> byte[] passwdBytes = new byte[password.length * 2];
> for (i=0, j=0; i < password.length; i++) {
>     passwdBytes[j++] = (byte)(password[i] >> 8);
>     passwdBytes[j++] = (byte)password[i];
> }
> md.update(passwdBytes);
> for (i=0; i < passwdBytes.length; i++)
>     passwdBytes[i] = 0;
> md.update("Mighty Aphrodite".getBytes("UTF8"));
> return md;
> }
>
>
>
>
>
>
> Joel Bernstein
> http://joelsolr.blogspot.com/
>
> On Wed, Apr 4, 2018 at 2:11 PM, Joel Bernstein  wrote:
>
>> Thanks Hoss, I will give this a try.
>>
>> Joel Bernstein
>> http://joelsolr.blogspot.com/
>>
>> On Wed, Apr 4, 2018 at 1:42 PM, Chris Hostetter > > wrote:
>>
>>>
>>> : I suspect I hosed something to do with my root certs on my local
>>> machine.
>>> : Fairly recently I was playing around with these certs while doing some
>>> SSL
>>> : work for Alfresco. This should be fun to fix...
>>>
>>> if that's your suspicion, i would start by testing out a simple java app
>>> that does nothing but...
>>>
>>> public static void main(String[] args) throws Exception {
>>> System.out.println(javax.net.ssl.SSLContext.getDefault().getProvider());
>>> System.out.println(javax.net.ssl.SSLContext.getDefault().getProtocol());
>>> }
>>>
>>> ...if that fails on the commandline, then you definitely hosed your
>>> machine.
>>>
>>> If that *does* work on the commandline, then try the same code in a
>>> trivial Junit test that just subclasses LuceneTestCase -- NOT
>>> SolrTestCaseJ4 -- to see if the problem is somewhere in our ant/lucene
>>> build setup, independent of any Solr SSL randomization.
>>>
>>> :
>>> : Joel Bernstein
>>> : http://joelsolr.blogspot.com/
>>> :
>>> : On Wed, Apr 4, 2018 at 12:29 PM, Joel Bernstein 
>>> wrote:
>>> :
>>> : > Ok, so it does sounds like a local problem then. Nothing much has
>>> changed
>>> : > locally. I'm still using the same Mac and Java version:
>>> : >
>>> : > defaultuildsMBP:clone2 joelbernstein$ java -version
>>> : >
>>> : > java version "1.8.0_40"
>>> : >
>>> : > Java(TM) SE Runtime Environment (build 1.8.0_40-b27)
>>> : >
>>> : > Java HotSpot(TM) 64-Bit Server VM (build 25.40-b25, mixed mode)
>>> : >
>>> : > I'll try running on a newer version of Java.
>>> : >
>>> : >
>>> : >
>>> : > Joel Bernstein
>>> : > http://joelsolr.blogspot.com/
>>> : >
>>> : > On Wed, Apr 4, 2018 at 12:19 PM, Chris Hostetter <
>>> hossman_luc...@fucit.org
>>> : > > wrote:
>>> : >
>>> : >>
>>> : >> : Subject: Re: TestSSLRandomization is failing everytime
>>> : >>
>>> : >> : When I run locally I get 

Re: TestSSLRandomization is failing everytime

2018-04-04 Thread Joel Bernstein
The code below ran fine from the command line and from a basic test case:
System.out.println(javax.net.ssl.SSLContext.getDefault().getProvider());
System.out.println(javax.net.ssl.SSLContext.getDefault().getProtocol());

The source code that throws the exception in JavaKeyStore.engineLoad is:

if (password != null) {
byte computed[], actual[];
computed = md.digest();
actual = new byte[computed.length];
dis.readFully(actual);
for (int i = 0; i < computed.length; i++) {
if (computed[i] != actual[i]) {
Throwable t = new UnrecoverableKeyException
("Password verification failed");
throw (IOException)new IOException
("Keystore was tampered with, or "
+ "password was incorrect").initCause(t);
}
}
}


Notice that it's simply comparing the bytes from two digests.

The digests are prepared using a SHA digest, notice that it just specifies
SHA, which must choose the default SHA digest for the system it's on. If
it chooses a different SHA digest, the password would not match. My best bet 
right now is that I've changed my default SHA digest to be something
other than what was used to create the passwords for the test framework:


/**
 * To guard against tampering with the keystore, we append a keyed
 * hash with a bit of whitener.
 */
private MessageDigest getPreKeyedHash(char[] password)
throws NoSuchAlgorithmException, UnsupportedEncodingException
{
int i, j;

MessageDigest md = MessageDigest.getInstance("SHA");
byte[] passwdBytes = new byte[password.length * 2];
for (i=0, j=0; i < password.length; i++) {
    passwdBytes[j++] = (byte)(password[i] >> 8);
    passwdBytes[j++] = (byte)password[i];
}
md.update(passwdBytes);
for (i=0; i < passwdBytes.length; i++)
    passwdBytes[i] = 0;
md.update("Mighty Aphrodite".getBytes("UTF8"));
return md;
}

On Wed, Apr 4, 2018 at 2:11 PM, Joel Bernstein  wrote:

> Thanks Hoss, I will give this a try.
>
> Joel Bernstein
> http://joelsolr.blogspot.com/
>
> On Wed, Apr 4, 2018 at 1:42 PM, Chris Hostetter 
> wrote:
>
>>
>> : I suspect I hosed something to do with my root certs on my local
>> machine.
>> : Fairly recently I was playing around with these certs while doing some
>> SSL
>> : work for Alfresco. This should be fun to fix...
>>
>> if that's your suspicion, i would start by testing out a simple java app
>> that does nothing but...
>>
>> public static void main(String[] args) throws Exception {
>> System.out.println(javax.net.ssl.SSLContext.getDefault().getProvider());
>> System.out.println(javax.net.ssl.SSLContext.getDefault().getProtocol());
>> }
>>
>> ...if that fails on the commandline, then you definitely hosed your
>> machine.
>>
>> If that *does* work on the commandline, then try the same code in a
>> trivial Junit test that just subclasses LuceneTestCase -- NOT
>> SolrTestCaseJ4 -- to see if the problem is somewhere in our ant/lucene
>> build setup, independent of any Solr SSL randomization.
>>
>> :
>> : Joel Bernstein
>> : http://joelsolr.blogspot.com/
>> :
>> : On Wed, Apr 4, 2018 at 12:29 PM, Joel Bernstein 
>> wrote:
>> :
>> : > Ok, so it does sounds like a local problem then. Nothing much has
>> changed
>> : > locally. I'm still using the same Mac and Java version:
>> : >
>> : > defaultuildsMBP:clone2 joelbernstein$ java -version
>> : >
>> : > java version "1.8.0_40"
>> : >
>> : > Java(TM) SE Runtime Environment (build 1.8.0_40-b27)
>> : >
>> : > Java HotSpot(TM) 64-Bit Server VM (build 25.40-b25, mixed mode)
>> : >
>> : > I'll try running on a newer version of Java.
>> : >
>> : >
>> : >
>> : > Joel Bernstein
>> : > http://joelsolr.blogspot.com/
>> : >
>> : > On Wed, Apr 4, 2018 at 12:19 PM, Chris Hostetter <
>> hossman_luc...@fucit.org
>> : > > wrote:
>> : >
>> : >>
>> : >> : Subject: Re: TestSSLRandomization is failing everytime
>> : >>
>> : >> : When I run locally I get this stack trace:
>> : >>
>> : >> would be helpful to konw the branch, and the GIT SHA ... and if you
>> can
>> : >> reproduce if you checkout an older branch/SHA where you know you
>> didn't
>> : >> see this failure in the past (ex: the last SHA you committed, where
>> you
>> : >> should have run all tests to be certain you didn't break anything)
>> : >>
>> : >> Personally I can't reproduce on master/8e276b90f520d ...
>> : >>
>> : >> Let's look at the exception...
>> : >>
>> : >> :[junit4]> Caused by: java.lang.RuntimeException: Unable to
>> : >> : initialize 'Default' SSLContext Algorithm, JVM is borked
>> : >> :
>> : >> :[junit4]> at
>> : 

[jira] [Commented] (LUCENE-7935) Release .sha512 hash files with our artifacts

2018-04-04 Thread JIRA

[ 
https://issues.apache.org/jira/browse/LUCENE-7935?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16426070#comment-16426070
 ] 

Jan Høydahl commented on LUCENE-7935:
-

Looks like a mismatch here: the maven artifacts build still produces md5 and 
sha1 files, while the smoke tester tries to verify sha512.

I suppose we want to discontinue md5 for all maven artifacts as well, or is it 
still required?
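
For reference, a {{.sha512}} sidecar is just the hex-encoded SHA-512 of the 
artifact bytes. A minimal sketch of producing one (my illustration, not the 
smoke tester's or the build's actual code):
{noformat}
import java.nio.file.Files;
import java.nio.file.Paths;
import java.security.MessageDigest;

public class Sha512OfArtifact {
    public static void main(String[] args) throws Exception {
        // args[0] is the artifact to hash, e.g. a released .tgz or .jar
        byte[] digest = MessageDigest.getInstance("SHA-512")
            .digest(Files.readAllBytes(Paths.get(args[0])));
        StringBuilder hex = new StringBuilder();
        for (byte b : digest) {
            hex.append(String.format("%02x", b));
        }
        // This hex string is what the .sha512 sidecar file carries
        System.out.println(hex);
    }
}
{noformat}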

> Release .sha512 hash files with our artifacts
> -
>
> Key: LUCENE-7935
> URL: https://issues.apache.org/jira/browse/LUCENE-7935
> Project: Lucene - Core
>  Issue Type: Improvement
>  Components: general/build
>Reporter: Jan Høydahl
>Assignee: Jan Høydahl
>Priority: Major
> Fix For: 7.4, master (8.0)
>
> Attachments: LUCENE-7935.patch, LUCENE-7935.patch
>
>
> Currently we are only required to release {{.md5}} hashes with our artifacts, 
> and we also include {{.sha1}} files. It is expected that {{.sha512}} will be 
> required in the future (see 
> https://www.apache.org/dev/release-signing.html#sha1), so why not start 
> generating them right away?



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Updated] (SOLR-10616) use more ant variables in ref guide pages: particular for javadoc & third-party lib versions

2018-04-04 Thread Hoss Man (JIRA)

 [ 
https://issues.apache.org/jira/browse/SOLR-10616?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Hoss Man updated SOLR-10616:

Attachment: SOLR-10616.patch

> use more ant variables in ref guide pages: particular for javadoc & 
> third-party lib versions
> 
>
> Key: SOLR-10616
> URL: https://issues.apache.org/jira/browse/SOLR-10616
> Project: Solr
>  Issue Type: Sub-task
>  Security Level: Public(Default Security Level. Issues are Public) 
>  Components: documentation
>Reporter: Hoss Man
>Priority: Major
> Attachments: SOLR-10616.patch
>
>
> we already use ant variables for the lucene/solr version when building 
> lucene/solr javadoc links, but it would be nice if we could slurp in the JDK 
> javadoc URLs for the current java version & the versions.properties values 
> for all third-party deps as well, so that links to things like the zookeeper 
> guide, or the tika guide, or the javadocs for DateInstance would always be 
> "current"



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-10616) use more ant variables in ref guide pages: particular for javadoc & third-party lib versions

2018-04-04 Thread Hoss Man (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10616?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16426015#comment-16426015
 ] 

Hoss Man commented on SOLR-10616:
-

I've worked up a patch that:
* adds a new {{java-javadocs}} variable for use in our {{*.adoc}} files
* replaces existing hardcoded {{docs.oracle.com/...}} javadoc urls with this 
variable
* removes a lot of unnecessarily specific mentions of "java 8" and just refers 
to "java" ...

The more I looked into references to "java 8" or "java 1.8" the less convinced 
i was that a variable would be useful here.  In every case i found, i felt like 
either the docs were improved by being less specific, *OR* using a variable 
could easily lead to confusion/mistakes if/when we change our minimum version 
to java 9 ... in the latter cases, i made a note to revisit those specific 
sections in LUCENE-8154.



> use more ant variables in ref guide pages: particular for javadoc & 
> third-party lib versions
> 
>
> Key: SOLR-10616
> URL: https://issues.apache.org/jira/browse/SOLR-10616
> Project: Solr
>  Issue Type: Sub-task
>  Security Level: Public(Default Security Level. Issues are Public) 
>  Components: documentation
>Reporter: Hoss Man
>Priority: Major
> Attachments: SOLR-10616.patch
>
>
> we already use ant variables for the lucene/solr version when building 
> lucene/solr javadoc links, but it would be nice if we could slurp in the JDK 
> javadoc URLs for the current java version & the versions.properties values 
> for all third-party deps as well, so that links to things like the zookeeper 
> guide, or the tika guide, or the javadocs for DateInstance would always be 
> "current"



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-7887) Upgrade Solr to use log4j2 -- log4j 1 now officially end of life

2018-04-04 Thread Steve Rowe (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-7887?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16426004#comment-16426004
 ] 

Steve Rowe commented on SOLR-7887:
--

I'll leave the issue open until Jenkins Maven builds succeed.

> Upgrade Solr to use log4j2 -- log4j 1 now officially end of life
> 
>
> Key: SOLR-7887
> URL: https://issues.apache.org/jira/browse/SOLR-7887
> Project: Solr
>  Issue Type: Task
>Reporter: Shawn Heisey
>Assignee: Erick Erickson
>Priority: Major
> Fix For: 7.4
>
> Attachments: SOLR-7887-WIP.patch, SOLR-7887-eoe-review.patch, 
> SOLR-7887-eoe-review.patch, SOLR-7887-fix-maven-compilation.patch, 
> SOLR-7887-followup_1.patch, SOLR-7887.patch, SOLR-7887.patch, 
> SOLR-7887.patch, SOLR-7887.patch, SOLR-7887.patch, SOLR-7887.patch, 
> SOLR-7887.patch, SOLR-7887.patch, SOLR-7887.patch, SOLR-7887.patch, 
> SOLR-7887.patch, SOLR-7887.patch, SOLR-7887.patch, 
> SOLR-7887_followup_2.patch, SOLR-7887_followup_2.patch
>
>
> The logging services project has officially announced the EOL of log4j 1:
> https://blogs.apache.org/foundation/entry/apache_logging_services_project_announces
> In the official binary jetty deployment, we use use log4j 1.2 as our final 
> logging destination, so the admin UI has a log watcher that actually uses 
> log4j and java.util.logging classes.  That will need to be extended to add 
> log4j2.  I think that might be the largest pain point to this upgrade.
> There is some crossover between log4j2 and slf4j.  Figuring out exactly which 
> jars need to be in the lib/ext directory will take some research.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-7887) Upgrade Solr to use log4j2 -- log4j 1 now officially end of life

2018-04-04 Thread ASF subversion and git services (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-7887?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16426001#comment-16426001
 ] 

ASF subversion and git services commented on SOLR-7887:
---

Commit ef902f9d8e5808cb874604990c7df1230e51f28c in lucene-solr's branch 
refs/heads/master from [~steve_rowe]
[ https://git-wip-us.apache.org/repos/asf?p=lucene-solr.git;h=ef902f9 ]

SOLR-7887: fix maven compilation by turning off annotation processing


> Upgrade Solr to use log4j2 -- log4j 1 now officially end of life
> 
>
> Key: SOLR-7887
> URL: https://issues.apache.org/jira/browse/SOLR-7887
> Project: Solr
>  Issue Type: Task
>Reporter: Shawn Heisey
>Assignee: Erick Erickson
>Priority: Major
> Fix For: 7.4
>
> Attachments: SOLR-7887-WIP.patch, SOLR-7887-eoe-review.patch, 
> SOLR-7887-eoe-review.patch, SOLR-7887-fix-maven-compilation.patch, 
> SOLR-7887-followup_1.patch, SOLR-7887.patch, SOLR-7887.patch, 
> SOLR-7887.patch, SOLR-7887.patch, SOLR-7887.patch, SOLR-7887.patch, 
> SOLR-7887.patch, SOLR-7887.patch, SOLR-7887.patch, SOLR-7887.patch, 
> SOLR-7887.patch, SOLR-7887.patch, SOLR-7887.patch, 
> SOLR-7887_followup_2.patch, SOLR-7887_followup_2.patch
>
>
> The logging services project has officially announced the EOL of log4j 1:
> https://blogs.apache.org/foundation/entry/apache_logging_services_project_announces
> In the official binary jetty deployment, we use use log4j 1.2 as our final 
> logging destination, so the admin UI has a log watcher that actually uses 
> log4j and java.util.logging classes.  That will need to be extended to add 
> log4j2.  I think that might be the largest pain point to this upgrade.
> There is some crossover between log4j2 and slf4j.  Figuring out exactly which 
> jars need to be in the lib/ext directory will take some research.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-7887) Upgrade Solr to use log4j2 -- log4j 1 now officially end of life

2018-04-04 Thread ASF subversion and git services (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-7887?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16426000#comment-16426000
 ] 

ASF subversion and git services commented on SOLR-7887:
---

Commit 27e5c8dd31c027e358cd1a780ba155ed7e5822bb in lucene-solr's branch 
refs/heads/branch_7x from [~steve_rowe]
[ https://git-wip-us.apache.org/repos/asf?p=lucene-solr.git;h=27e5c8d ]

SOLR-7887: fix maven compilation by turning off annotation processing


> Upgrade Solr to use log4j2 -- log4j 1 now officially end of life
> 
>
> Key: SOLR-7887
> URL: https://issues.apache.org/jira/browse/SOLR-7887
> Project: Solr
>  Issue Type: Task
>Reporter: Shawn Heisey
>Assignee: Erick Erickson
>Priority: Major
> Fix For: 7.4
>
> Attachments: SOLR-7887-WIP.patch, SOLR-7887-eoe-review.patch, 
> SOLR-7887-eoe-review.patch, SOLR-7887-fix-maven-compilation.patch, 
> SOLR-7887-followup_1.patch, SOLR-7887.patch, SOLR-7887.patch, 
> SOLR-7887.patch, SOLR-7887.patch, SOLR-7887.patch, SOLR-7887.patch, 
> SOLR-7887.patch, SOLR-7887.patch, SOLR-7887.patch, SOLR-7887.patch, 
> SOLR-7887.patch, SOLR-7887.patch, SOLR-7887.patch, 
> SOLR-7887_followup_2.patch, SOLR-7887_followup_2.patch
>
>
> The logging services project has officially announced the EOL of log4j 1:
> https://blogs.apache.org/foundation/entry/apache_logging_services_project_announces
> In the official binary jetty deployment, we use use log4j 1.2 as our final 
> logging destination, so the admin UI has a log watcher that actually uses 
> log4j and java.util.logging classes.  That will need to be extended to add 
> log4j2.  I think that might be the largest pain point to this upgrade.
> There is some crossover between log4j2 and slf4j.  Figuring out exactly which 
> jars need to be in the lib/ext directory will take some research.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-7887) Upgrade Solr to use log4j2 -- log4j 1 now officially end of life

2018-04-04 Thread Steve Rowe (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-7887?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16425998#comment-16425998
 ] 

Steve Rowe commented on SOLR-7887:
--

I pulled out the yetus audience-annotations bits from the last patch on this 
issue that had them (from March 2nd), then tested that maven compilation worked 
with it, and that precommit was happy. All fine.

But something in the patch caught my eye:
{noformat}
diff --git a/lucene/common-build.xml b/lucene/common-build.xml
index 4fa59ac936..e13f09bd6d 100644
--- a/lucene/common-build.xml
+++ b/lucene/common-build.xml
@@ -187,7 +187,8 @@
   
 
   
-  
+  
+  
   
   
   
{noformat}
This explains why Ant compilation works now (and didn't in earlier versions of 
the patch without this change): {{-proc:none}} turns off annotation processing, 
and the compilation failures are due to ZooKeeper's use of (Yetus) annotations.

So in [^SOLR-7887-fix-maven-compilation.patch] I added this arg to the 
{{maven-compiler-plugin}} config, and now Maven compilation succeeds too.

I'll commit this patch shortly.

> Upgrade Solr to use log4j2 -- log4j 1 now officially end of life
> 
>
> Key: SOLR-7887
> URL: https://issues.apache.org/jira/browse/SOLR-7887
> Project: Solr
>  Issue Type: Task
>Reporter: Shawn Heisey
>Assignee: Erick Erickson
>Priority: Major
> Fix For: 7.4
>
> Attachments: SOLR-7887-WIP.patch, SOLR-7887-eoe-review.patch, 
> SOLR-7887-eoe-review.patch, SOLR-7887-fix-maven-compilation.patch, 
> SOLR-7887-followup_1.patch, SOLR-7887.patch, SOLR-7887.patch, 
> SOLR-7887.patch, SOLR-7887.patch, SOLR-7887.patch, SOLR-7887.patch, 
> SOLR-7887.patch, SOLR-7887.patch, SOLR-7887.patch, SOLR-7887.patch, 
> SOLR-7887.patch, SOLR-7887.patch, SOLR-7887.patch, 
> SOLR-7887_followup_2.patch, SOLR-7887_followup_2.patch
>
>
> The logging services project has officially announced the EOL of log4j 1:
> https://blogs.apache.org/foundation/entry/apache_logging_services_project_announces
> In the official binary jetty deployment, we use use log4j 1.2 as our final 
> logging destination, so the admin UI has a log watcher that actually uses 
> log4j and java.util.logging classes.  That will need to be extended to add 
> log4j2.  I think that might be the largest pain point to this upgrade.
> There is some crossover between log4j2 and slf4j.  Figuring out exactly which 
> jars need to be in the lib/ext directory will take some research.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (LUCENE-8154) TODO List when upgrading to Java 9 as minimum requirement

2018-04-04 Thread Hoss Man (JIRA)

[ 
https://issues.apache.org/jira/browse/LUCENE-8154?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16425997#comment-16425997
 ] 

Hoss Man commented on LUCENE-8154:
--

Some ref-guide related stuff:
 * {{solr-system-requirements.adoc}} - misc changes
 * {{aws-solrcloud-tutorial.adoc}} – refer to the new java 9 yum package names
 * {{kerberos-authentication-plugin.adoc}} – we can completely remove the info 
about downloading & updating {{local_policy.jar}} to support unlimited crypto – 
that's included by default in Java 9 (a quick verification sketch is below)
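
A minimal sketch (my own check, not something from the ref guide) to confirm 
unlimited-strength crypto is active on a given JVM:
{noformat}
import javax.crypto.Cipher;

public class CheckCryptoPolicy {
    public static void main(String[] args) throws Exception {
        // Integer.MAX_VALUE means the unlimited policy is in effect;
        // 128 means the old limited policy files are still installed.
        System.out.println("Max AES key length: "
            + Cipher.getMaxAllowedKeyLength("AES"));
    }
}
{noformat}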

> TODO List when upgrading to Java 9 as minimum requirement
> -
>
> Key: LUCENE-8154
> URL: https://issues.apache.org/jira/browse/LUCENE-8154
> Project: Lucene - Core
>  Issue Type: Task
>Reporter: Uwe Schindler
>Priority: Major
>  Labels: Java9
>
> This issue is just a placeholder to record stuff that needs to be done when 
> we upgrade to Java 9 as minimum requirement for running Lucene/Solr:
> - Remove {{FutureArrays}} and {{FutureObjects}} from source tree and change 
> code to use Java 9 native methods. Disable MR-JAR building (maybe only 
> disable so we can reuse at later stages)
> - Remove Java 8 bytebuffer unmapping code
> Final stuff:
> - When upgrading to Java 9, don't delete the Java 9 specific stuff for 
> Multi-Release testing from build files or smoke tester! Keep it alive, maybe 
> migrate to later Java (e.g. LTS-Java)



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Updated] (SOLR-7887) Upgrade Solr to use log4j2 -- log4j 1 now officially end of life

2018-04-04 Thread Steve Rowe (JIRA)

 [ 
https://issues.apache.org/jira/browse/SOLR-7887?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Steve Rowe updated SOLR-7887:
-
Attachment: SOLR-7887-fix-maven-compilation.patch

> Upgrade Solr to use log4j2 -- log4j 1 now officially end of life
> 
>
> Key: SOLR-7887
> URL: https://issues.apache.org/jira/browse/SOLR-7887
> Project: Solr
>  Issue Type: Task
>Reporter: Shawn Heisey
>Assignee: Erick Erickson
>Priority: Major
> Fix For: 7.4
>
> Attachments: SOLR-7887-WIP.patch, SOLR-7887-eoe-review.patch, 
> SOLR-7887-eoe-review.patch, SOLR-7887-fix-maven-compilation.patch, 
> SOLR-7887-followup_1.patch, SOLR-7887.patch, SOLR-7887.patch, 
> SOLR-7887.patch, SOLR-7887.patch, SOLR-7887.patch, SOLR-7887.patch, 
> SOLR-7887.patch, SOLR-7887.patch, SOLR-7887.patch, SOLR-7887.patch, 
> SOLR-7887.patch, SOLR-7887.patch, SOLR-7887.patch, 
> SOLR-7887_followup_2.patch, SOLR-7887_followup_2.patch
>
>
> The logging services project has officially announced the EOL of log4j 1:
> https://blogs.apache.org/foundation/entry/apache_logging_services_project_announces
> In the official binary jetty deployment, we use use log4j 1.2 as our final 
> logging destination, so the admin UI has a log watcher that actually uses 
> log4j and java.util.logging classes.  That will need to be extended to add 
> log4j2.  I think that might be the largest pain point to this upgrade.
> There is some crossover between log4j2 and slf4j.  Figuring out exactly which 
> jars need to be in the lib/ext directory will take some research.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



Re: TestSSLRandomization is failing everytime

2018-04-04 Thread Joel Bernstein
Thanks Hoss, I will give this a try.

Joel Bernstein
http://joelsolr.blogspot.com/

On Wed, Apr 4, 2018 at 1:42 PM, Chris Hostetter 
wrote:

>
> : I suspect I hosed something to do with my root certs on my local machine.
> : Fairly recently I was playing around with these certs while doing some
> SSL
> : work for Alfresco. This should be fun to fix...
>
> if that's your suspicion, i would start by testing out a simple java app
> that does nothing but...
>
> public static void main(String[] args) throws Exception {
> System.out.println(javax.net.ssl.SSLContext.getDefault().
> getProvider());
> System.out.println(javax.net.ssl.SSLContext.getDefault().
> getProtocol());
> }
>
> ...if thta fails on the commandline, then ou definitely hozed your
> machine.
>
> If that *does* work on the commandline, then try the same code in a
> trivial Junit test that just subclasses LuceneTestCase -- NOT
> SolrTestCaseJ4 -- to see if the problem is somewhere in our ant/lucene
> build setup, independent of any Solr SSL randomization.
>
> :
> : Joel Bernstein
> : http://joelsolr.blogspot.com/
> :
> : On Wed, Apr 4, 2018 at 12:29 PM, Joel Bernstein 
> wrote:
> :
> : > Ok, so it does sounds like a local problem then. Nothing much has
> changed
> : > locally. I'm still using the same Mac and Java version:
> : >
> : > defaultuildsMBP:clone2 joelbernstein$ java -version
> : >
> : > java version "1.8.0_40"
> : >
> : > Java(TM) SE Runtime Environment (build 1.8.0_40-b27)
> : >
> : > Java HotSpot(TM) 64-Bit Server VM (build 25.40-b25, mixed mode)
> : >
> : > I'll try running on a newer version of Java.
> : >
> : >
> : >
> : > Joel Bernstein
> : > http://joelsolr.blogspot.com/
> : >
> : > On Wed, Apr 4, 2018 at 12:19 PM, Chris Hostetter <
> hossman_luc...@fucit.org
> : > > wrote:
> : >
> : >>
> : >> : Subject: Re: TestSSLRandomization is failing everytime
> : >>
> : >> : When I run locally I get this stack trace:
> : >>
> : >> would be helpful to konw the branch, and the GIT SHA ... and if you
> can
> : >> reproduce if you checkout an older branch/SHA where you know you
> didn't
> : >> see this failure in the past (ex: the last SHA you committed, where
> you
> : >> should have run all tests to be certain you didn't break anything)
> : >>
> : >> Personally I can't reproduce on master/8e276b90f520d ...
> : >>
> : >> Let's look at the exception...
> : >>
> : >> :[junit4]> Caused by: java.lang.RuntimeException: Unable to
> : >> : initialize 'Default' SSLContext Algorithm, JVM is borked
> : >> :
> : >> :[junit4]> at
> : >> : org.apache.solr.cloud.TestMiniSolrCloudClusterSSL.(T
> : >> estMiniSolrCloudClusterSSL.java:67)
> : >> :
> : >> :[junit4]> ... 40 more
> : >> :
> : >> :[junit4]> Caused by: java.security.NoSuchAlgorithmException:
> : >> Error
> : >> : constructing implementation (algorithm: Default, provider: SunJSSE,
> : >> class:
> : >> : sun.security.ssl.SSLContextImpl$DefaultSSLContext)
> : >>
> : >> At first glance, it sounds like your JVM is completley jacked and
> doesn't
> : >> have any SSL support?
> : >>
> : >> The code throwing that exception is litterally...
> : >>
> : >>   DEFAULT_SSL_CONTEXT = SSLContext.getDefault();
> : >>
> : >> ...ie: your JVM is saying the *default* SSL Algorithm, as choosen by
> the
> : >> JVM config, doens't exist ... but if we look farther down...
> : >>
> : >> :[junit4]> Caused by: java.io.IOException: Keystore was
> tampered
> : >> : with, or password was incorrect
> : >> ...
> : >> :[junit4]> Caused by: java.security.
> UnrecoverableKeyException:
> : >> : Password verification failed
> : >>
> : >> ...well that's interesting.  We do provide our own keystore when using
> : >> SSLTestConfig, but I honestly don't know off the top of my head if
> that's
> : >> even in use when this code is running?
> : >>
> : >> Can you tell us *ANYTHING* about the machine/jvm where you are running
> : >> this, and or what might have changed on your end since hte last time
> you
> : >> ran all tests w/o failure?  what OS? new laptop? new java install? if
> you
> : >> "git co releases/lucene-solr/7.2.0" does this test pass? if so can you
> : >> "git bisect" to track down when it starts failing? etc...
> : >>
> : >>
> : >>
> : >> -Hoss
> : >> http://www.lucidworks.com/
> : >>
> : >> -
> : >> To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
> : >> For additional commands, e-mail: dev-h...@lucene.apache.org
> : >>
> : >>
> : >
> :
>
> -Hoss
> http://www.lucidworks.com/
>
> -
> To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
> For additional commands, e-mail: dev-h...@lucene.apache.org
>
>


[JENKINS] Lucene-Solr-SmokeRelease-7.x - Build # 191 - Still Failing

2018-04-04 Thread Apache Jenkins Server
Build: https://builds.apache.org/job/Lucene-Solr-SmokeRelease-7.x/191/

No tests ran.

Build Log:
[...truncated 30148 lines...]
prepare-release-no-sign:
[mkdir] Created dir: 
/home/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-7.x/lucene/build/smokeTestRelease/dist
 [copy] Copying 491 files to 
/home/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-7.x/lucene/build/smokeTestRelease/dist/lucene
 [copy] Copying 230 files to 
/home/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-7.x/lucene/build/smokeTestRelease/dist/solr
   [smoker] Java 1.8 JAVA_HOME=/home/jenkins/tools/java/latest1.8
   [smoker] Java 9 JAVA_HOME=/home/jenkins/tools/java/latest1.9
   [smoker] NOTE: output encoding is UTF-8
   [smoker] 
   [smoker] Load release URL 
"file:/home/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-7.x/lucene/build/smokeTestRelease/dist/"...
   [smoker] 
   [smoker] Test Lucene...
   [smoker]   test basics...
   [smoker]   get KEYS
   [smoker] 0.2 MB in 0.01 sec (32.7 MB/sec)
   [smoker]   check changes HTML...
   [smoker]   download lucene-7.4.0-src.tgz...
   [smoker] 32.0 MB in 0.03 sec (1191.5 MB/sec)
   [smoker] verify sha1/sha512 digests
   [smoker]   download lucene-7.4.0.tgz...
   [smoker] 74.3 MB in 0.06 sec (1173.6 MB/sec)
   [smoker] verify sha1/sha512 digests
   [smoker]   download lucene-7.4.0.zip...
   [smoker] 84.8 MB in 0.07 sec (1168.5 MB/sec)
   [smoker] verify sha1/sha512 digests
   [smoker]   unpack lucene-7.4.0.tgz...
   [smoker] verify JAR metadata/identity/no javax.* or java.* classes...
   [smoker] test demo with 1.8...
   [smoker]   got 6324 hits for query "lucene"
   [smoker] checkindex with 1.8...
   [smoker] test demo with 9...
   [smoker]   got 6324 hits for query "lucene"
   [smoker] checkindex with 9...
   [smoker] check Lucene's javadoc JAR
   [smoker]   unpack lucene-7.4.0.zip...
   [smoker] verify JAR metadata/identity/no javax.* or java.* classes...
   [smoker] test demo with 1.8...
   [smoker]   got 6324 hits for query "lucene"
   [smoker] checkindex with 1.8...
   [smoker] test demo with 9...
   [smoker]   got 6324 hits for query "lucene"
   [smoker] checkindex with 9...
   [smoker] check Lucene's javadoc JAR
   [smoker]   unpack lucene-7.4.0-src.tgz...
   [smoker] make sure no JARs/WARs in src dist...
   [smoker] run "ant validate"
   [smoker] run tests w/ Java 8 and testArgs='-Dtests.badapples=false 
-Dtests.slow=false'...
   [smoker] test demo with 1.8...
   [smoker]   got 219 hits for query "lucene"
   [smoker] checkindex with 1.8...
   [smoker] generate javadocs w/ Java 8...
   [smoker] 
   [smoker] Crawl/parse...
   [smoker] 
   [smoker] Verify...
   [smoker] run tests w/ Java 9 and testArgs='-Dtests.badapples=false 
-Dtests.slow=false'...
   [smoker] test demo with 9...
   [smoker]   got 219 hits for query "lucene"
   [smoker] checkindex with 9...
   [smoker]   confirm all releases have coverage in TestBackwardsCompatibility
   [smoker] find all past Lucene releases...
   [smoker] run TestBackwardsCompatibility..
   [smoker] success!
   [smoker] 
   [smoker] Test Solr...
   [smoker]   test basics...
   [smoker]   get KEYS
   [smoker] 0.2 MB in 0.00 sec (264.8 MB/sec)
   [smoker]   check changes HTML...
   [smoker]   download solr-7.4.0-src.tgz...
   [smoker] 55.5 MB in 0.06 sec (913.1 MB/sec)
   [smoker] verify sha1/sha512 digests
   [smoker]   download solr-7.4.0.tgz...
   [smoker] 158.0 MB in 0.18 sec (884.4 MB/sec)
   [smoker] verify sha1/sha512 digests
   [smoker]   download solr-7.4.0.zip...
   [smoker] 159.0 MB in 0.19 sec (855.6 MB/sec)
   [smoker] verify sha1/sha512 digests
   [smoker]   unpack solr-7.4.0.tgz...
   [smoker] verify JAR metadata/identity/no javax.* or java.* classes...
   [smoker] unpack lucene-7.4.0.tgz...
   [smoker]   **WARNING**: skipping check of 
/home/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-7.x/lucene/build/smokeTestRelease/tmp/unpack/solr-7.4.0/contrib/dataimporthandler-extras/lib/javax.mail-1.5.1.jar:
 it has javax.* classes
   [smoker]   **WARNING**: skipping check of 
/home/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-7.x/lucene/build/smokeTestRelease/tmp/unpack/solr-7.4.0/contrib/dataimporthandler-extras/lib/activation-1.1.1.jar:
 it has javax.* classes
   [smoker] copying unpacked distribution for Java 8 ...
   [smoker] test solr example w/ Java 8...
   [smoker]   start Solr instance 
(log=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-7.x/lucene/build/smokeTestRelease/tmp/unpack/solr-7.4.0-java8/solr-example.log)...
   [smoker] No process found for Solr node running on port 8983
   [smoker]   Running techproducts example on port 8983 from 

Re: TestSSLRandomization is failing everytime

2018-04-04 Thread Chris Hostetter

: I suspect I hosed something to do with my root certs on my local machine.
: Fairly recently I was playing around with these certs while doing some SSL
: work for Alfresco. This should be fun to fix...

if that's your suspicion, i would start by testing out a simple java app 
that does nothing but...

public static void main(String[] args) throws Exception {
System.out.println(javax.net.ssl.SSLContext.getDefault().getProvider());
System.out.println(javax.net.ssl.SSLContext.getDefault().getProtocol());
}

...if that fails on the commandline, then you definitely hosed your 
machine.

If that *does* work on the commandline, then try the same code in a 
trivial Junit test that just subclasses LuceneTestCase -- NOT 
SolrTestCaseJ4 -- to see if the problem is somewhere in our ant/lucene 
build setup, independent of any Solr SSL randomization.
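
Something like this (a rough sketch, class and method names are mine):

import org.apache.lucene.util.LuceneTestCase;
import org.junit.Test;

public class TestDefaultSslContext extends LuceneTestCase {
    @Test
    public void testDefaultSslContext() throws Exception {
        // Same two prints as the standalone app, but run under the
        // Lucene test framework (no Solr SSL randomization involved).
        System.out.println(javax.net.ssl.SSLContext.getDefault().getProvider());
        System.out.println(javax.net.ssl.SSLContext.getDefault().getProtocol());
    }
}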

: 
: Joel Bernstein
: http://joelsolr.blogspot.com/
: 
: On Wed, Apr 4, 2018 at 12:29 PM, Joel Bernstein  wrote:
: 
: > Ok, so it does sounds like a local problem then. Nothing much has changed
: > locally. I'm still using the same Mac and Java version:
: >
: > defaultuildsMBP:clone2 joelbernstein$ java -version
: >
: > java version "1.8.0_40"
: >
: > Java(TM) SE Runtime Environment (build 1.8.0_40-b27)
: >
: > Java HotSpot(TM) 64-Bit Server VM (build 25.40-b25, mixed mode)
: >
: > I'll try running on a newer version of Java.
: >
: >
: >
: > Joel Bernstein
: > http://joelsolr.blogspot.com/
: >
: > On Wed, Apr 4, 2018 at 12:19 PM, Chris Hostetter  > wrote:
: >
: >>
: >> : Subject: Re: TestSSLRandomization is failing everytime
: >>
: >> : When I run locally I get this stack trace:
: >>
: >> would be helpful to konw the branch, and the GIT SHA ... and if you can
: >> reproduce if you checkout an older branch/SHA where you know you didn't
: >> see this failure in the past (ex: the last SHA you committed, where you
: >> should have run all tests to be certain you didn't break anything)
: >>
: >> Personally I can't reproduce on master/8e276b90f520d ...
: >>
: >> Let's look at the exception...
: >>
: >> :[junit4]> Caused by: java.lang.RuntimeException: Unable to
: >> : initialize 'Default' SSLContext Algorithm, JVM is borked
: >> :
: >> :[junit4]> at
: >> : org.apache.solr.cloud.TestMiniSolrCloudClusterSSL.(T
: >> estMiniSolrCloudClusterSSL.java:67)
: >> :
: >> :[junit4]> ... 40 more
: >> :
: >> :[junit4]> Caused by: java.security.NoSuchAlgorithmException:
: >> Error
: >> : constructing implementation (algorithm: Default, provider: SunJSSE,
: >> class:
: >> : sun.security.ssl.SSLContextImpl$DefaultSSLContext)
: >>
: >> At first glance, it sounds like your JVM is completley jacked and doesn't
: >> have any SSL support?
: >>
: >> The code throwing that exception is litterally...
: >>
: >>   DEFAULT_SSL_CONTEXT = SSLContext.getDefault();
: >>
: >> ...ie: your JVM is saying the *default* SSL Algorithm, as choosen by the
: >> JVM config, doens't exist ... but if we look farther down...
: >>
: >> :[junit4]> Caused by: java.io.IOException: Keystore was tampered
: >> : with, or password was incorrect
: >> ...
: >> :[junit4]> Caused by: java.security.UnrecoverableKeyException:
: >> : Password verification failed
: >>
: >> ...well that's interesting.  We do provide our own keystore when using
: >> SSLTestConfig, but I honestly don't know off the top of my head if that's
: >> even in use when this code is running?
: >>
: >> Can you tell us *ANYTHING* about the machine/jvm where you are running
: >> this, and or what might have changed on your end since hte last time you
: >> ran all tests w/o failure?  what OS? new laptop? new java install? if you
: >> "git co releases/lucene-solr/7.2.0" does this test pass? if so can you
: >> "git bisect" to track down when it starts failing? etc...
: >>
: >>
: >>
: >> -Hoss
: >> http://www.lucidworks.com/
: >>
: >> -
: >> To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
: >> For additional commands, e-mail: dev-h...@lucene.apache.org
: >>
: >>
: >
: 

-Hoss
http://www.lucidworks.com/

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



Re: TestSSLRandomization is failing everytime

2018-04-04 Thread Joel Bernstein
I suspect I hosed something to do with my root certs on my local machine.
Fairly recently I was playing around with these certs while doing some SSL
work for Alfresco. This should be fun to fix...

Joel Bernstein
http://joelsolr.blogspot.com/

On Wed, Apr 4, 2018 at 12:29 PM, Joel Bernstein  wrote:

> Ok, so it does sounds like a local problem then. Nothing much has changed
> locally. I'm still using the same Mac and Java version:
>
> defaultuildsMBP:clone2 joelbernstein$ java -version
>
> java version "1.8.0_40"
>
> Java(TM) SE Runtime Environment (build 1.8.0_40-b27)
>
> Java HotSpot(TM) 64-Bit Server VM (build 25.40-b25, mixed mode)
>
> I'll try running on a newer version of Java.
>
>
>
> Joel Bernstein
> http://joelsolr.blogspot.com/
>
> On Wed, Apr 4, 2018 at 12:19 PM, Chris Hostetter  > wrote:
>
>>
>> : Subject: Re: TestSSLRandomization is failing everytime
>>
>> : When I run locally I get this stack trace:
>>
>> would be helpful to konw the branch, and the GIT SHA ... and if you can
>> reproduce if you checkout an older branch/SHA where you know you didn't
>> see this failure in the past (ex: the last SHA you committed, where you
>> should have run all tests to be certain you didn't break anything)
>>
>> Personally I can't reproduce on master/8e276b90f520d ...
>>
>> Let's look at the exception...
>>
>> :[junit4]> Caused by: java.lang.RuntimeException: Unable to
>> : initialize 'Default' SSLContext Algorithm, JVM is borked
>> :
>> :[junit4]> at
>> : org.apache.solr.cloud.TestMiniSolrCloudClusterSSL.(T
>> estMiniSolrCloudClusterSSL.java:67)
>> :
>> :[junit4]> ... 40 more
>> :
>> :[junit4]> Caused by: java.security.NoSuchAlgorithmException:
>> Error
>> : constructing implementation (algorithm: Default, provider: SunJSSE,
>> class:
>> : sun.security.ssl.SSLContextImpl$DefaultSSLContext)
>>
>> At first glance, it sounds like your JVM is completley jacked and doesn't
>> have any SSL support?
>>
>> The code throwing that exception is litterally...
>>
>>   DEFAULT_SSL_CONTEXT = SSLContext.getDefault();
>>
>> ...ie: your JVM is saying the *default* SSL Algorithm, as choosen by the
>> JVM config, doens't exist ... but if we look farther down...
>>
>> :[junit4]> Caused by: java.io.IOException: Keystore was tampered
>> : with, or password was incorrect
>> ...
>> :[junit4]> Caused by: java.security.UnrecoverableKeyException:
>> : Password verification failed
>>
>> ...well that's interesting.  We do provide our own keystore when using
>> SSLTestConfig, but I honestly don't know off the top of my head if that's
>> even in use when this code is running?
>>
>> Can you tell us *ANYTHING* about the machine/jvm where you are running
>> this, and or what might have changed on your end since hte last time you
>> ran all tests w/o failure?  what OS? new laptop? new java install? if you
>> "git co releases/lucene-solr/7.2.0" does this test pass? if so can you
>> "git bisect" to track down when it starts failing? etc...
>>
>>
>>
>> -Hoss
>> http://www.lucidworks.com/
>>
>> -
>> To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
>> For additional commands, e-mail: dev-h...@lucene.apache.org
>>
>>
>


[jira] [Comment Edited] (SOLR-7887) Upgrade Solr to use log4j2 -- log4j 1 now officially end of life

2018-04-04 Thread Steve Rowe (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-7887?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16425862#comment-16425862
 ] 

Steve Rowe edited comment on SOLR-7887 at 4/4/18 5:02 PM:
--

Reopening to address Jenkins Maven builds' compilation failures for solr-core 
module, e.g. from [https://builds.apache.org/job/Lucene-Solr-Maven-7.x/172/]:

{noformat}
 [mvn] [INFO] Error for project: Apache Solr Core (during install)
 [mvn] [INFO] 

 [mvn] [INFO] Compilation failure
 [mvn] cannot access org.apache.yetus.audience.InterfaceAudience
 [mvn]   class file for org.apache.yetus.audience.InterfaceAudience not 
found
 [mvn] 
 [mvn] [INFO] 

 [mvn] [INFO] For more information, run Maven with the -e switch
 [mvn] [INFO] 

 [mvn] [INFO] BUILD ERRORS
 [mvn] [INFO] 

 [mvn] [INFO] Total time: 53 seconds
 [mvn] [INFO] Finished at: Wed Apr 04 00:54:42 UTC 2018
 [mvn] [INFO] Final Memory: 297M/1626M
 [mvn] [INFO] 


BUILD FAILED
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-Maven-7.x/build.xml:679: The 
following error occurred while executing this line:
: Java returned: 1
{noformat}


was (Author: steve_rowe):
Reopening to address Jenkins Maven builds' compilation failures for solr-core 
module, e.g.:

{noformat}
 [mvn] [INFO] Error for project: Apache Solr Core (during install)
 [mvn] [INFO] 

 [mvn] [INFO] Compilation failure
 [mvn] cannot access org.apache.yetus.audience.InterfaceAudience
 [mvn]   class file for org.apache.yetus.audience.InterfaceAudience not 
found
 [mvn] 
 [mvn] [INFO] 

 [mvn] [INFO] For more information, run Maven with the -e switch
 [mvn] [INFO] 

 [mvn] [INFO] BUILD ERRORS
 [mvn] [INFO] 

 [mvn] [INFO] Total time: 53 seconds
 [mvn] [INFO] Finished at: Wed Apr 04 00:54:42 UTC 2018
 [mvn] [INFO] Final Memory: 297M/1626M
 [mvn] [INFO] 


BUILD FAILED
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-Maven-7.x/build.xml:679: The 
following error occurred while executing this line:
: Java returned: 1
{noformat}

> Upgrade Solr to use log4j2 -- log4j 1 now officially end of life
> 
>
> Key: SOLR-7887
> URL: https://issues.apache.org/jira/browse/SOLR-7887
> Project: Solr
>  Issue Type: Task
>Reporter: Shawn Heisey
>Assignee: Erick Erickson
>Priority: Major
> Fix For: 7.4
>
> Attachments: SOLR-7887-WIP.patch, SOLR-7887-eoe-review.patch, 
> SOLR-7887-eoe-review.patch, SOLR-7887-followup_1.patch, SOLR-7887.patch, 
> SOLR-7887.patch, SOLR-7887.patch, SOLR-7887.patch, SOLR-7887.patch, 
> SOLR-7887.patch, SOLR-7887.patch, SOLR-7887.patch, SOLR-7887.patch, 
> SOLR-7887.patch, SOLR-7887.patch, SOLR-7887.patch, SOLR-7887.patch, 
> SOLR-7887_followup_2.patch, SOLR-7887_followup_2.patch
>
>
> The logging services project has officially announced the EOL of log4j 1:
> https://blogs.apache.org/foundation/entry/apache_logging_services_project_announces
> In the official binary jetty deployment, we use use log4j 1.2 as our final 
> logging destination, so the admin UI has a log watcher that actually uses 
> log4j and java.util.logging classes.  That will need to be extended to add 
> log4j2.  I think that might be the largest pain point to this upgrade.
> There is some crossover between log4j2 and slf4j.  Figuring out exactly which 
> jars need to be in the lib/ext directory will take some research.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Reopened] (SOLR-7887) Upgrade Solr to use log4j2 -- log4j 1 now officially end of life

2018-04-04 Thread Steve Rowe (JIRA)

 [ 
https://issues.apache.org/jira/browse/SOLR-7887?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Steve Rowe reopened SOLR-7887:
--

Reopening to address Jenkins Maven builds' compilation failures for solr-core 
module, e.g.:

{noformat}
 [mvn] [INFO] Error for project: Apache Solr Core (during install)
 [mvn] [INFO] 

 [mvn] [INFO] Compilation failure
 [mvn] cannot access org.apache.yetus.audience.InterfaceAudience
 [mvn]   class file for org.apache.yetus.audience.InterfaceAudience not 
found
 [mvn] 
 [mvn] [INFO] 

 [mvn] [INFO] For more information, run Maven with the -e switch
 [mvn] [INFO] 

 [mvn] [INFO] BUILD ERRORS
 [mvn] [INFO] 

 [mvn] [INFO] Total time: 53 seconds
 [mvn] [INFO] Finished at: Wed Apr 04 00:54:42 UTC 2018
 [mvn] [INFO] Final Memory: 297M/1626M
 [mvn] [INFO] 


BUILD FAILED
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-Maven-7.x/build.xml:679: The 
following error occurred while executing this line:
: Java returned: 1
{noformat}

> Upgrade Solr to use log4j2 -- log4j 1 now officially end of life
> 
>
> Key: SOLR-7887
> URL: https://issues.apache.org/jira/browse/SOLR-7887
> Project: Solr
>  Issue Type: Task
>Reporter: Shawn Heisey
>Assignee: Erick Erickson
>Priority: Major
> Fix For: 7.4
>
> Attachments: SOLR-7887-WIP.patch, SOLR-7887-eoe-review.patch, 
> SOLR-7887-eoe-review.patch, SOLR-7887-followup_1.patch, SOLR-7887.patch, 
> SOLR-7887.patch, SOLR-7887.patch, SOLR-7887.patch, SOLR-7887.patch, 
> SOLR-7887.patch, SOLR-7887.patch, SOLR-7887.patch, SOLR-7887.patch, 
> SOLR-7887.patch, SOLR-7887.patch, SOLR-7887.patch, SOLR-7887.patch, 
> SOLR-7887_followup_2.patch, SOLR-7887_followup_2.patch
>
>
> The logging services project has officially announced the EOL of log4j 1:
> https://blogs.apache.org/foundation/entry/apache_logging_services_project_announces
> In the official binary jetty deployment, we use use log4j 1.2 as our final 
> logging destination, so the admin UI has a log watcher that actually uses 
> log4j and java.util.logging classes.  That will need to be extended to add 
> log4j2.  I think that might be the largest pain point to this upgrade.
> There is some crossover between log4j2 and slf4j.  Figuring out exactly which 
> jars need to be in the lib/ext directory will take some research.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Created] (SOLR-12188) Inconsistent behavior with CREATE collection API

2018-04-04 Thread Munendra S N (JIRA)
Munendra S N created SOLR-12188:
---

 Summary: Inconsistent behavior with CREATE collection API
 Key: SOLR-12188
 URL: https://issues.apache.org/jira/browse/SOLR-12188
 Project: Solr
  Issue Type: Bug
  Security Level: Public (Default Security Level. Issues are Public)
  Components: Admin UI, config-api
Affects Versions: 7.2
Reporter: Munendra S N


If collection.configName is not specified during collection creation, the 
_default configSet is used to create a mutable configSet (with the AUTOCREATED 
suffix).
* In the Admin UI, it is mandatory to specify a configSet. This behavior is 
inconsistent with the CREATE collection API (where it is not mandatory).
* Both in the Admin UI and the CREATE API, when _default is specified as the 
configSet, no mutable configSet is created, so config changes made for one 
collection are reflected in the others (see the sketch below).
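
A SolrJ sketch of the two creation paths being compared (collection names, 
counts and the base URL are placeholders):
{noformat}
import org.apache.solr.client.solrj.impl.HttpSolrClient;
import org.apache.solr.client.solrj.request.CollectionAdminRequest;

public class CreateCollectionPaths {
    public static void main(String[] args) throws Exception {
        try (HttpSolrClient client =
                 new HttpSolrClient.Builder("http://localhost:8983/solr").build()) {
            // No configSet given: falls back to _default and copies it into a
            // mutable, collection-specific configSet (AUTOCREATED suffix).
            CollectionAdminRequest.createCollection("coll_a", 1, 1).process(client);

            // configSet explicitly set to _default: no mutable copy is made, so
            // config changes are shared by every collection created this way.
            CollectionAdminRequest.createCollection("coll_b", "_default", 1, 1).process(client);
        }
    }
}
{noformat}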



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[JENKINS] Lucene-Solr-NightlyTests-7.x - Build # 191 - Failure

2018-04-04 Thread Apache Jenkins Server
Build: https://builds.apache.org/job/Lucene-Solr-NightlyTests-7.x/191/

7 tests failed.
FAILED:  org.apache.solr.uninverting.TestDocTermOrds.testTriggerUnInvertLimit

Error Message:
Java heap space

Stack Trace:
java.lang.OutOfMemoryError: Java heap space
at 
__randomizedtesting.SeedInfo.seed([A361676B624767C5:90D34FAF6FF0BD72]:0)
at org.apache.lucene.store.RAMFile.newBuffer(RAMFile.java:78)
at org.apache.lucene.store.RAMFile.addBuffer(RAMFile.java:51)
at 
org.apache.lucene.store.RAMOutputStream.switchCurrentBuffer(RAMOutputStream.java:164)
at 
org.apache.lucene.store.RAMOutputStream.writeBytes(RAMOutputStream.java:150)
at 
org.apache.lucene.store.MockIndexOutputWrapper.writeBytes(MockIndexOutputWrapper.java:139)
at 
org.apache.lucene.codecs.lucene50.Lucene50PostingsWriter.addPosition(Lucene50PostingsWriter.java:291)
at 
org.apache.lucene.codecs.PushPostingsWriterBase.writeTerm(PushPostingsWriterBase.java:156)
at 
org.apache.lucene.codecs.blocktree.BlockTreeTermsWriter$TermsWriter.write(BlockTreeTermsWriter.java:864)
at 
org.apache.lucene.codecs.blocktree.BlockTreeTermsWriter.write(BlockTreeTermsWriter.java:343)
at 
org.apache.lucene.codecs.FieldsConsumer.merge(FieldsConsumer.java:105)
at 
org.apache.lucene.codecs.perfield.PerFieldPostingsFormat$FieldsWriter.merge(PerFieldPostingsFormat.java:164)
at 
org.apache.lucene.index.SegmentMerger.mergeTerms(SegmentMerger.java:230)
at org.apache.lucene.index.SegmentMerger.merge(SegmentMerger.java:115)
at 
org.apache.lucene.index.IndexWriter.mergeMiddle(IndexWriter.java:4480)
at org.apache.lucene.index.IndexWriter.merge(IndexWriter.java:4141)
at 
org.apache.lucene.index.SerialMergeScheduler.merge(SerialMergeScheduler.java:40)
at org.apache.lucene.index.IndexWriter.maybeMerge(IndexWriter.java:2335)
at 
org.apache.lucene.index.IndexWriter.processEvents(IndexWriter.java:5134)
at 
org.apache.lucene.index.IndexWriter.updateDocument(IndexWriter.java:1779)
at 
org.apache.lucene.index.IndexWriter.addDocument(IndexWriter.java:1468)
at 
org.apache.lucene.index.RandomIndexWriter.addDocument(RandomIndexWriter.java:185)
at 
org.apache.solr.uninverting.TestDocTermOrds.testTriggerUnInvertLimit(TestDocTermOrds.java:167)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1737)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:934)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:970)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:984)
at 
org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)


FAILED:  org.apache.solr.uninverting.TestDocTermOrds.testRandom

Error Message:
Java heap space

Stack Trace:
java.lang.OutOfMemoryError: Java heap space
at 
__randomizedtesting.SeedInfo.seed([A361676B624767C5:D12D4264D327D1B6]:0)
at org.apache.lucene.util.fst.BytesStore.writeByte(BytesStore.java:89)
at org.apache.lucene.util.fst.FST.(FST.java:265)
at org.apache.lucene.util.fst.Builder.(Builder.java:157)
at 
org.apache.lucene.codecs.blocktree.BlockTreeTermsWriter$PendingBlock.compileIndex(BlockTreeTermsWriter.java:456)
at 
org.apache.lucene.codecs.blocktree.BlockTreeTermsWriter$TermsWriter.writeBlocks(BlockTreeTermsWriter.java:633)
at 
org.apache.lucene.codecs.blocktree.BlockTreeTermsWriter$TermsWriter.finish(BlockTreeTermsWriter.java:934)
at 
org.apache.lucene.codecs.blocktree.BlockTreeTermsWriter.write(BlockTreeTermsWriter.java:346)
at 
org.apache.lucene.codecs.perfield.PerFieldPostingsFormat$FieldsWriter.write(PerFieldPostingsFormat.java:140)
at 
org.apache.lucene.index.FreqProxTermsWriter.flush(FreqProxTermsWriter.java:108)
at 
org.apache.lucene.index.DefaultIndexingChain.flush(DefaultIndexingChain.java:163)
at 
org.apache.lucene.index.DocumentsWriterPerThread.flush(DocumentsWriterPerThread.java:463)
at 
org.apache.lucene.index.DocumentsWriter.doFlush(DocumentsWriter.java:556)
at 
org.apache.lucene.index.DocumentsWriter.postUpdate(DocumentsWriter.java:416)
at 
org.apache.lucene.index.DocumentsWriter.updateDocuments(DocumentsWriter.java:473)
at 
org.apache.lucene.index.IndexWriter.updateDocuments(IndexWriter.java:1539)
at 

Re: [JENKINS] Lucene-Solr-SmokeRelease-master - Build # 995 - Still Failing

2018-04-04 Thread Jan Høydahl
I’ll dig..

Sent from my iPhone

> On 4 Apr 2018, at 17:58, Chris Hostetter  wrote:
> 
> 
> This seems to be realted to LUCENE-7935?
> 
> I've re-opened & commented there about this failed smoketester.
> 
> 
> : Date: Wed, 4 Apr 2018 10:14:36 + (UTC)
> : From: Apache Jenkins Server 
> : Reply-To: dev@lucene.apache.org
> : To: dev@lucene.apache.org
> : Subject: [JENKINS] Lucene-Solr-SmokeRelease-master - Build # 995 - Still
> : Failing
> : 
> : Build: https://builds.apache.org/job/Lucene-Solr-SmokeRelease-master/995/
> : 
> : No tests ran.
> : 
> : Build Log:
> : [...truncated 23735 lines...]
> : [asciidoctor:convert] asciidoctor: ERROR: about-this-guide.adoc: line 1: 
> invalid part, must have at least one section (e.g., chapter, appendix, etc.)
> : [asciidoctor:convert] asciidoctor: ERROR: solr-glossary.adoc: line 1: 
> invalid part, must have at least one section (e.g., chapter, appendix, etc.)
> :  [java] Processed 2188 links (1744 relative) to 3004 anchors in 243 
> files
> :  [echo] Validated Links & Anchors via: 
> /x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/solr/build/solr-ref-guide/bare-bones-html/
> : 
> : -dist-changes:
> :  [copy] Copying 4 files to 
> /x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/solr/package/changes
> : 
> : -dist-keys:
> :   [get] Getting: http://home.apache.org/keys/group/lucene.asc
> :   [get] To: 
> /x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/solr/package/KEYS
> : 
> : package:
> : 
> : -unpack-solr-tgz:
> : 
> : -ensure-solr-tgz-exists:
> : [mkdir] Created dir: 
> /x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/solr/build/solr.tgz.unpacked
> : [untar] Expanding: 
> /x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/solr/package/solr-8.0.0.tgz
>  into 
> /x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/solr/build/solr.tgz.unpacked
> : 
> : generate-maven-artifacts:
> : 
> : resolve:
> : 
> : resolve:
> : 
> : ivy-availability-check:
> : [loadresource] Do not set property disallowed.ivy.jars.list as its length 
> is 0.
> : 
> : -ivy-fail-disallowed-ivy-version:
> : 
> : ivy-fail:
> : 
> : ivy-configure:
> : [ivy:configure] :: loading settings :: file = 
> /x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/lucene/top-level-ivy-settings.xml
> : 
> : resolve:
> : 
> : ivy-availability-check:
> : [loadresource] Do not set property disallowed.ivy.jars.list as its length 
> is 0.
> : 
> : -ivy-fail-disallowed-ivy-version:
> : 
> : ivy-fail:
> : 
> : ivy-configure:
> : [ivy:configure] :: loading settings :: file = 
> /x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/lucene/top-level-ivy-settings.xml
> : 
> : resolve:
> : 
> : resolve:
> : 
> : ivy-availability-check:
> : [loadresource] Do not set property disallowed.ivy.jars.list as its length 
> is 0.
> : 
> : -ivy-fail-disallowed-ivy-version:
> : 
> : ivy-fail:
> : 
> : ivy-configure:
> : [ivy:configure] :: loading settings :: file = 
> /x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/lucene/top-level-ivy-settings.xml
> : 
> : resolve:
> : 
> : ivy-availability-check:
> : [loadresource] Do not set property disallowed.ivy.jars.list as its length 
> is 0.
> : 
> : -ivy-fail-disallowed-ivy-version:
> : 
> : ivy-fail:
> : 
> : ivy-configure:
> : [ivy:configure] :: loading settings :: file = 
> /x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/lucene/top-level-ivy-settings.xml
> : 
> : resolve:
> : 
> : ivy-availability-check:
> : [loadresource] Do not set property disallowed.ivy.jars.list as its length 
> is 0.
> : 
> : -ivy-fail-disallowed-ivy-version:
> : 
> : ivy-fail:
> : 
> : ivy-configure:
> : [ivy:configure] :: loading settings :: file = 
> /x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/lucene/top-level-ivy-settings.xml
> : 
> : resolve:
> : 
> : ivy-availability-check:
> : [loadresource] Do not set property disallowed.ivy.jars.list as its length 
> is 0.
> : 
> : -ivy-fail-disallowed-ivy-version:
> : 
> : ivy-fail:
> : 
> : ivy-configure:
> : [ivy:configure] :: loading settings :: file = 
> /x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/lucene/top-level-ivy-settings.xml
> : 
> : resolve:
> : 
> : ivy-availability-check:
> : [loadresource] Do not set property disallowed.ivy.jars.list as its length 
> is 0.
> : 
> : -ivy-fail-disallowed-ivy-version:
> : 
> : ivy-fail:
> : 
> : ivy-configure:
> : [ivy:configure] :: loading settings :: file = 
> /x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/lucene/top-level-ivy-settings.xml
> : 
> : resolve:
> : 
> : ivy-availability-check:
> : [loadresource] Do not set property disallowed.ivy.jars.list as its length 
> is 0.
> : 
> : -ivy-fail-disallowed-ivy-version:
> : 
> : ivy-fail:
> : 
> : ivy-configure:
> : [ivy:configure] :: loading settings 

[jira] [Commented] (SOLR-11982) Add support for indicating preferred replica types for queries

2018-04-04 Thread Houston Putman (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-11982?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16425820#comment-16425820
 ] 

Houston Putman commented on SOLR-11982:
---

Just my two cents, but I think that {{shards.preference}} is a lot clearer than 
{{shards.sort}} for this kind of option. {{sort}} is a defined parameter in 
Solr, and I think using it in this context would muddy the term elsewhere in 
Solr as well as add confusion here. This is a new type of behavior for Solr and 
therefore requires a new parameter name. I think {{shards.preference}} would 
work well in differentiating it as a new feature.

{{shards.routing}} or {{shards.route}} may be even clearer names as they 
explicitly show that they are options for routing a request. The words 
{{preference}} and {{sort}} don't inherently relate to routing, though 
{{preference}} does come closer.
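
Whichever name wins, here is a SolrJ sketch of how the current patch would be 
used (the parameter name and value syntax are taken from the issue description 
and may still change):
{noformat}
import org.apache.solr.client.solrj.SolrQuery;
import org.apache.solr.client.solrj.impl.HttpSolrClient;

public class PreferPullReplicas {
    public static void main(String[] args) throws Exception {
        try (HttpSolrClient client = new HttpSolrClient.Builder(
                "http://localhost:8983/solr/techproducts").build()) {
            SolrQuery query = new SolrQuery("*:*");
            // Route to PULL replicas first, then TLOG, before falling back to NRT
            query.set("shards.sort", "replicaType:PULL|TLOG");
            System.out.println(client.query(query).getResults().getNumFound());
        }
    }
}
{noformat}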

> Add support for indicating preferred replica types for queries
> --
>
> Key: SOLR-11982
> URL: https://issues.apache.org/jira/browse/SOLR-11982
> Project: Solr
>  Issue Type: Improvement
>  Security Level: Public(Default Security Level. Issues are Public) 
>  Components: SolrCloud
>Affects Versions: 7.4, master (8.0)
>Reporter: Ere Maijala
>Priority: Minor
>  Labels: patch-available, patch-with-test
> Attachments: SOLR-11982-preferReplicaTypes.patch, 
> SOLR-11982-preferReplicaTypes.patch, SOLR-11982.patch, SOLR-11982.patch, 
> SOLR-11982.patch, SOLR-11982.patch, SOLR-11982.patch, SOLR-11982.patch, 
> SOLR-11982.patch, SOLR-11982.patch
>
>
> It would be nice to have the possibility to easily sort the shards in the 
> preferred order e.g. by replica type. The attached patch adds support for 
> {{shards.sort}} parameter that allows one to sort e.g. PULL and TLOG replicas 
> first with \{{shards.sort=replicaType:PULL|TLOG }}(which would mean that NRT 
> replicas wouldn't be hit with queries unless they're the only ones available) 
> and/or to sort by replica location (like preferLocalShards=true but more 
> versatile).



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



Re: TestSSLRandomization is failing everytime

2018-04-04 Thread Joel Bernstein
Ok, so it does sound like a local problem then. Nothing much has changed
locally. I'm still using the same Mac and Java version:

defaultuildsMBP:clone2 joelbernstein$ java -version

java version "1.8.0_40"

Java(TM) SE Runtime Environment (build 1.8.0_40-b27)

Java HotSpot(TM) 64-Bit Server VM (build 25.40-b25, mixed mode)

I'll try running on a newer version of Java.



Joel Bernstein
http://joelsolr.blogspot.com/

On Wed, Apr 4, 2018 at 12:19 PM, Chris Hostetter 
wrote:

>
> : Subject: Re: TestSSLRandomization is failing everytime
>
> : When I run locally I get this stack trace:
>
> would be helpful to konw the branch, and the GIT SHA ... and if you can
> reproduce if you checkout an older branch/SHA where you know you didn't
> see this failure in the past (ex: the last SHA you committed, where you
> should have run all tests to be certain you didn't break anything)
>
> Personally I can't reproduce on master/8e276b90f520d ...
>
> Let's look at the exception...
>
> :[junit4]> Caused by: java.lang.RuntimeException: Unable to
> : initialize 'Default' SSLContext Algorithm, JVM is borked
> :
> :[junit4]> at
> : org.apache.solr.cloud.TestMiniSolrCloudClusterSSL.(
> TestMiniSolrCloudClusterSSL.java:67)
> :
> :[junit4]> ... 40 more
> :
> :[junit4]> Caused by: java.security.NoSuchAlgorithmException:
> Error
> : constructing implementation (algorithm: Default, provider: SunJSSE,
> class:
> : sun.security.ssl.SSLContextImpl$DefaultSSLContext)
>
> At first glance, it sounds like your JVM is completley jacked and doesn't
> have any SSL support?
>
> The code throwing that exception is litterally...
>
>   DEFAULT_SSL_CONTEXT = SSLContext.getDefault();
>
> ...ie: your JVM is saying the *default* SSL Algorithm, as choosen by the
> JVM config, doens't exist ... but if we look farther down...
>
> :[junit4]> Caused by: java.io.IOException: Keystore was tampered
> : with, or password was incorrect
> ...
> :[junit4]> Caused by: java.security.UnrecoverableKeyException:
> : Password verification failed
>
> ...well that's interesting.  We do provide our own keystore when using
> SSLTestConfig, but I honestly don't know off the top of my head if that's
> even in use when this code is running?
>
> Can you tell us *ANYTHING* about the machine/jvm where you are running
> this, and or what might have changed on your end since hte last time you
> ran all tests w/o failure?  what OS? new laptop? new java install? if you
> "git co releases/lucene-solr/7.2.0" does this test pass? if so can you
> "git bisect" to track down when it starts failing? etc...
>
>
>
> -Hoss
> http://www.lucidworks.com/
>
> -
> To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
> For additional commands, e-mail: dev-h...@lucene.apache.org
>
>


Re: TestSSLRandomization is failing everytime

2018-04-04 Thread Chris Hostetter

: Subject: Re: TestSSLRandomization is failing everytime

: When I run locally I get this stack trace:

would be helpful to know the branch, and the GIT SHA ... and if you can 
reproduce if you checkout an older branch/SHA where you know you didn't 
see this failure in the past (ex: the last SHA you committed, where you 
should have run all tests to be certain you didn't break anything)

Personally I can't reproduce on master/8e276b90f520d ...

Let's look at the exception...

:[junit4]> Caused by: java.lang.RuntimeException: Unable to
: initialize 'Default' SSLContext Algorithm, JVM is borked
: 
:[junit4]> at
: 
org.apache.solr.cloud.TestMiniSolrCloudClusterSSL.(TestMiniSolrCloudClusterSSL.java:67)
: 
:[junit4]> ... 40 more
: 
:[junit4]> Caused by: java.security.NoSuchAlgorithmException: Error
: constructing implementation (algorithm: Default, provider: SunJSSE, class:
: sun.security.ssl.SSLContextImpl$DefaultSSLContext)

At first glance, it sounds like your JVM is completely jacked and doesn't 
have any SSL support?

The code throwing that exception is literally...

  DEFAULT_SSL_CONTEXT = SSLContext.getDefault();

...ie: your JVM is saying the *default* SSL Algorithm, as chosen by the 
JVM config, doesn't exist ... but if we look farther down...

:[junit4]> Caused by: java.io.IOException: Keystore was tampered
: with, or password was incorrect
...
:[junit4]> Caused by: java.security.UnrecoverableKeyException:
: Password verification failed

...well that's interesting.  We do provide our own keystore when using 
SSLTestConfig, but I honestly don't know off the top of my head if that's 
even in use when this code is running? 

Can you tell us *ANYTHING* about the machine/jvm where you are running 
this, and/or what might have changed on your end since the last time you 
ran all tests w/o failure?  what OS? new laptop? new java install? if you 
"git co releases/lucene-solr/7.2.0" does this test pass? if so can you 
"git bisect" to track down when it starts failing? etc...



-Hoss
http://www.lucidworks.com/

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



Re: TestSSLRandomization is failing everytime

2018-04-04 Thread Kevin Risden
What Java version are you using? It's almost like it can't find the Java
cipher?

Kevin Risden

On Wed, Apr 4, 2018, 11:14 Joel Bernstein  wrote:

> I've looked through the recent commits but don't see anything that looks
> like it might have caused this. I don't see jenkins errors yet either.
>
> I'm running my tests on a fresh clone so I don't think this is due to an
> issue with my local repo.
>
> Anybody else seeing this problem locally?
>
> Joel Bernstein
> http://joelsolr.blogspot.com/
>
> On Wed, Apr 4, 2018 at 11:59 AM, Joel Bernstein 
> wrote:
>
>> When I run locally I get this stack trace:
>>
>> ERROR   0.02s | TestSSLRandomization.testRandomizedSslAndClientAuth <<<
>>
>>[junit4]> Throwable #1: java.lang.ExceptionInInitializerError
>>
>>[junit4]> at
>> __randomizedtesting.SeedInfo.seed([59B26B23CF90404E:D2E6DA446D0F0132]:0)
>>
>>[junit4]> at
>> org.apache.solr.cloud.TestSSLRandomization.testRandomizedSslAndClientAuth(TestSSLRandomization.java:50)
>>
>>[junit4]> at java.lang.Thread.run(Thread.java:745)
>>
>>[junit4]> Caused by: java.lang.RuntimeException: Unable to
>> initialize 'Default' SSLContext Algorithm, JVM is borked
>>
>>[junit4]> at
>> org.apache.solr.cloud.TestMiniSolrCloudClusterSSL.(TestMiniSolrCloudClusterSSL.java:67)
>>
>>[junit4]> ... 40 more
>>
>>[junit4]> Caused by: java.security.NoSuchAlgorithmException:
>> Error constructing implementation (algorithm: Default, provider: SunJSSE,
>> class: sun.security.ssl.SSLContextImpl$DefaultSSLContext)
>>
>>[junit4]> at
>> java.security.Provider$Service.newInstance(Provider.java:1617)
>>
>>[junit4]> at
>> sun.security.jca.GetInstance.getInstance(GetInstance.java:236)
>>
>>[junit4]> at
>> sun.security.jca.GetInstance.getInstance(GetInstance.java:164)
>>
>>[junit4]> at
>> javax.net.ssl.SSLContext.getInstance(SSLContext.java:156)
>>
>>[junit4]> at
>> javax.net.ssl.SSLContext.getDefault(SSLContext.java:96)
>>
>>[junit4]> at
>> org.apache.solr.cloud.TestMiniSolrCloudClusterSSL.(TestMiniSolrCloudClusterSSL.java:64)
>>
>>[junit4]> ... 40 more
>>
>>[junit4]> Caused by: java.io.IOException: Keystore was tampered
>> with, or password was incorrect
>>
>>[junit4]> at
>> sun.security.provider.JavaKeyStore.engineLoad(JavaKeyStore.java:772)
>>
>>[junit4]> at
>> sun.security.provider.JavaKeyStore$JKS.engineLoad(JavaKeyStore.java:55)
>>
>>[junit4]> at java.security.KeyStore.load(KeyStore.java:1445)
>>
>>[junit4]> at
>> sun.security.ssl.TrustManagerFactoryImpl.getCacertsKeyStore(TrustManagerFactoryImpl.java:226)
>>
>>[junit4]> at
>> sun.security.ssl.SSLContextImpl$DefaultSSLContext.getDefaultTrustManager(SSLContextImpl.java:767)
>>
>>[junit4]> at
>> sun.security.ssl.SSLContextImpl$DefaultSSLContext.(SSLContextImpl.java:733)
>>
>>[junit4]> at
>> java.lang.reflect.Constructor.newInstance(Constructor.java:422)
>>
>>[junit4]> at
>> java.security.Provider$Service.newInstance(Provider.java:1595)
>>
>>[junit4]> ... 45 more
>>
>>[junit4]> Caused by: java.security.UnrecoverableKeyException:
>> Password verification failed
>>
>>[junit4]> at
>> sun.security.provider.JavaKeyStore.engineLoad(JavaKeyStore.java:770)
>>
>>[junit4]> ... 55 more
>>
>>
>> Joel Bernstein
>> http://joelsolr.blogspot.com/
>>
>> On Wed, Apr 4, 2018 at 11:54 AM, Joel Bernstein 
>> wrote:
>>
>>> TestSSLRandomization is failing 100% of the time. Anybody make changes
>>> recently to this code?
>>>
>>
>>
>


Re: TestSSLRandomization is failing everytime

2018-04-04 Thread Joel Bernstein
I've looked through the recent commits but don't see anything that looks
like it might have caused this. I don't see jenkins errors yet either.

I'm running my tests on a fresh clone so I don't think this is due to an
issue with my local repo.

Anybody else seeing this problem locally?

Joel Bernstein
http://joelsolr.blogspot.com/

On Wed, Apr 4, 2018 at 11:59 AM, Joel Bernstein  wrote:

> When I run locally I get this stack trace:
>
> ERROR   0.02s | TestSSLRandomization.testRandomizedSslAndClientAuth <<<
>
>[junit4]> Throwable #1: java.lang.ExceptionInInitializerError
>
>[junit4]> at __randomizedtesting.SeedInfo.seed([59B26B23CF90404E:
> D2E6DA446D0F0132]:0)
>
>[junit4]> at org.apache.solr.cloud.TestSSLRandomization.
> testRandomizedSslAndClientAuth(TestSSLRandomization.java:50)
>
>[junit4]> at java.lang.Thread.run(Thread.java:745)
>
>[junit4]> Caused by: java.lang.RuntimeException: Unable to
> initialize 'Default' SSLContext Algorithm, JVM is borked
>
>[junit4]> at org.apache.solr.cloud.TestMiniSolrCloudClusterSSL.<
> clinit>(TestMiniSolrCloudClusterSSL.java:67)
>
>[junit4]> ... 40 more
>
>[junit4]> Caused by: java.security.NoSuchAlgorithmException: Error
> constructing implementation (algorithm: Default, provider: SunJSSE, class:
> sun.security.ssl.SSLContextImpl$DefaultSSLContext)
>
>[junit4]> at java.security.Provider$Service.newInstance(Provider.
> java:1617)
>
>[junit4]> at sun.security.jca.GetInstance.
> getInstance(GetInstance.java:236)
>
>[junit4]> at sun.security.jca.GetInstance.
> getInstance(GetInstance.java:164)
>
>[junit4]> at javax.net.ssl.SSLContext.getInstance(SSLContext.java:
> 156)
>
>[junit4]> at javax.net.ssl.SSLContext.
> getDefault(SSLContext.java:96)
>
>[junit4]> at org.apache.solr.cloud.TestMiniSolrCloudClusterSSL.<
> clinit>(TestMiniSolrCloudClusterSSL.java:64)
>
>[junit4]> ... 40 more
>
>[junit4]> Caused by: java.io.IOException: Keystore was tampered
> with, or password was incorrect
>
>[junit4]> at sun.security.provider.JavaKeyStore.engineLoad(
> JavaKeyStore.java:772)
>
>[junit4]> at sun.security.provider.JavaKeyStore$JKS.engineLoad(
> JavaKeyStore.java:55)
>
>[junit4]> at java.security.KeyStore.load(KeyStore.java:1445)
>
>[junit4]> at sun.security.ssl.TrustManagerFactoryImpl.
> getCacertsKeyStore(TrustManagerFactoryImpl.java:226)
>
>[junit4]> at sun.security.ssl.SSLContextImpl$DefaultSSLContext.
> getDefaultTrustManager(SSLContextImpl.java:767)
>
>[junit4]> at sun.security.ssl.SSLContextImpl$
> DefaultSSLContext.(SSLContextImpl.java:733)
>
>[junit4]> at java.lang.reflect.Constructor.
> newInstance(Constructor.java:422)
>
>[junit4]> at java.security.Provider$Service.newInstance(Provider.
> java:1595)
>
>[junit4]> ... 45 more
>
>[junit4]> Caused by: java.security.UnrecoverableKeyException:
> Password verification failed
>
>[junit4]> at sun.security.provider.JavaKeyStore.engineLoad(
> JavaKeyStore.java:770)
>
>[junit4]> ... 55 more
>
>
> Joel Bernstein
> http://joelsolr.blogspot.com/
>
> On Wed, Apr 4, 2018 at 11:54 AM, Joel Bernstein 
> wrote:
>
>> TestSSLRandomization is failing 100% of the time. Anybody make changes
>> recently to this code?
>>
>
>


Re: TestSSLRandomization is failing everytime

2018-04-04 Thread Joel Bernstein
When I run locally I get this stack trace:

ERROR   0.02s | TestSSLRandomization.testRandomizedSslAndClientAuth <<<

   [junit4]> Throwable #1: java.lang.ExceptionInInitializerError

   [junit4]> at
__randomizedtesting.SeedInfo.seed([59B26B23CF90404E:D2E6DA446D0F0132]:0)

   [junit4]> at
org.apache.solr.cloud.TestSSLRandomization.testRandomizedSslAndClientAuth(TestSSLRandomization.java:50)

   [junit4]> at java.lang.Thread.run(Thread.java:745)

   [junit4]> Caused by: java.lang.RuntimeException: Unable to
initialize 'Default' SSLContext Algorithm, JVM is borked

   [junit4]> at
org.apache.solr.cloud.TestMiniSolrCloudClusterSSL.(TestMiniSolrCloudClusterSSL.java:67)

   [junit4]> ... 40 more

   [junit4]> Caused by: java.security.NoSuchAlgorithmException: Error
constructing implementation (algorithm: Default, provider: SunJSSE, class:
sun.security.ssl.SSLContextImpl$DefaultSSLContext)

   [junit4]> at
java.security.Provider$Service.newInstance(Provider.java:1617)

   [junit4]> at
sun.security.jca.GetInstance.getInstance(GetInstance.java:236)

   [junit4]> at
sun.security.jca.GetInstance.getInstance(GetInstance.java:164)

   [junit4]> at
javax.net.ssl.SSLContext.getInstance(SSLContext.java:156)

   [junit4]> at javax.net.ssl.SSLContext.getDefault(SSLContext.java:96)

   [junit4]> at
org.apache.solr.cloud.TestMiniSolrCloudClusterSSL.(TestMiniSolrCloudClusterSSL.java:64)

   [junit4]> ... 40 more

   [junit4]> Caused by: java.io.IOException: Keystore was tampered
with, or password was incorrect

   [junit4]> at
sun.security.provider.JavaKeyStore.engineLoad(JavaKeyStore.java:772)

   [junit4]> at
sun.security.provider.JavaKeyStore$JKS.engineLoad(JavaKeyStore.java:55)

   [junit4]> at java.security.KeyStore.load(KeyStore.java:1445)

   [junit4]> at
sun.security.ssl.TrustManagerFactoryImpl.getCacertsKeyStore(TrustManagerFactoryImpl.java:226)

   [junit4]> at
sun.security.ssl.SSLContextImpl$DefaultSSLContext.getDefaultTrustManager(SSLContextImpl.java:767)

   [junit4]> at
sun.security.ssl.SSLContextImpl$DefaultSSLContext.(SSLContextImpl.java:733)

   [junit4]> at
java.lang.reflect.Constructor.newInstance(Constructor.java:422)

   [junit4]> at
java.security.Provider$Service.newInstance(Provider.java:1595)

   [junit4]> ... 45 more

   [junit4]> Caused by: java.security.UnrecoverableKeyException:
Password verification failed

   [junit4]> at
sun.security.provider.JavaKeyStore.engineLoad(JavaKeyStore.java:770)

   [junit4]> ... 55 more


Joel Bernstein
http://joelsolr.blogspot.com/

On Wed, Apr 4, 2018 at 11:54 AM, Joel Bernstein  wrote:

> TestSSLRandomization is failing 100% of the time. Anybody make changes
> recently to this code?
>


Re: [JENKINS] Lucene-Solr-SmokeRelease-master - Build # 995 - Still Failing

2018-04-04 Thread Chris Hostetter

This seems to be related to LUCENE-7935?

I've re-opened & commented there about this failed smoketester.


: Date: Wed, 4 Apr 2018 10:14:36 + (UTC)
: From: Apache Jenkins Server 
: Reply-To: dev@lucene.apache.org
: To: dev@lucene.apache.org
: Subject: [JENKINS] Lucene-Solr-SmokeRelease-master - Build # 995 - Still
: Failing
: 
: Build: https://builds.apache.org/job/Lucene-Solr-SmokeRelease-master/995/
: 
: No tests ran.
: 
: Build Log:
: [...truncated 23735 lines...]
: [asciidoctor:convert] asciidoctor: ERROR: about-this-guide.adoc: line 1: 
invalid part, must have at least one section (e.g., chapter, appendix, etc.)
: [asciidoctor:convert] asciidoctor: ERROR: solr-glossary.adoc: line 1: invalid 
part, must have at least one section (e.g., chapter, appendix, etc.)
:  [java] Processed 2188 links (1744 relative) to 3004 anchors in 243 files
:  [echo] Validated Links & Anchors via: 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/solr/build/solr-ref-guide/bare-bones-html/
: 
: -dist-changes:
:  [copy] Copying 4 files to 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/solr/package/changes
: 
: -dist-keys:
:   [get] Getting: http://home.apache.org/keys/group/lucene.asc
:   [get] To: 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/solr/package/KEYS
: 
: package:
: 
: -unpack-solr-tgz:
: 
: -ensure-solr-tgz-exists:
: [mkdir] Created dir: 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/solr/build/solr.tgz.unpacked
: [untar] Expanding: 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/solr/package/solr-8.0.0.tgz
 into 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/solr/build/solr.tgz.unpacked
: 
: generate-maven-artifacts:
: 
: resolve:
: 
: resolve:
: 
: ivy-availability-check:
: [loadresource] Do not set property disallowed.ivy.jars.list as its length is 
0.
: 
: -ivy-fail-disallowed-ivy-version:
: 
: ivy-fail:
: 
: ivy-configure:
: [ivy:configure] :: loading settings :: file = 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/lucene/top-level-ivy-settings.xml
: 
: resolve:
: 
: ivy-availability-check:
: [loadresource] Do not set property disallowed.ivy.jars.list as its length is 
0.
: 
: -ivy-fail-disallowed-ivy-version:
: 
: ivy-fail:
: 
: ivy-configure:
: [ivy:configure] :: loading settings :: file = 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/lucene/top-level-ivy-settings.xml
: 
: resolve:
: 
: resolve:
: 
: ivy-availability-check:
: [loadresource] Do not set property disallowed.ivy.jars.list as its length is 
0.
: 
: -ivy-fail-disallowed-ivy-version:
: 
: ivy-fail:
: 
: ivy-configure:
: [ivy:configure] :: loading settings :: file = 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/lucene/top-level-ivy-settings.xml
: 
: resolve:
: 
: ivy-availability-check:
: [loadresource] Do not set property disallowed.ivy.jars.list as its length is 
0.
: 
: -ivy-fail-disallowed-ivy-version:
: 
: ivy-fail:
: 
: ivy-configure:
: [ivy:configure] :: loading settings :: file = 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/lucene/top-level-ivy-settings.xml
: 
: resolve:
: 
: ivy-availability-check:
: [loadresource] Do not set property disallowed.ivy.jars.list as its length is 
0.
: 
: -ivy-fail-disallowed-ivy-version:
: 
: ivy-fail:
: 
: ivy-configure:
: [ivy:configure] :: loading settings :: file = 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/lucene/top-level-ivy-settings.xml
: 
: resolve:
: 
: ivy-availability-check:
: [loadresource] Do not set property disallowed.ivy.jars.list as its length is 
0.
: 
: -ivy-fail-disallowed-ivy-version:
: 
: ivy-fail:
: 
: ivy-configure:
: [ivy:configure] :: loading settings :: file = 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/lucene/top-level-ivy-settings.xml
: 
: resolve:
: 
: ivy-availability-check:
: [loadresource] Do not set property disallowed.ivy.jars.list as its length is 
0.
: 
: -ivy-fail-disallowed-ivy-version:
: 
: ivy-fail:
: 
: ivy-configure:
: [ivy:configure] :: loading settings :: file = 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/lucene/top-level-ivy-settings.xml
: 
: resolve:
: 
: ivy-availability-check:
: [loadresource] Do not set property disallowed.ivy.jars.list as its length is 
0.
: 
: -ivy-fail-disallowed-ivy-version:
: 
: ivy-fail:
: 
: ivy-configure:
: [ivy:configure] :: loading settings :: file = 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/lucene/top-level-ivy-settings.xml
: 
: resolve:
: 
: ivy-availability-check:
: [loadresource] Do not set property disallowed.ivy.jars.list as its length is 
0.
: 
: -ivy-fail-disallowed-ivy-version:
: 
: ivy-fail:
: 
: ivy-configure:
: [ivy:configure] :: loading settings :: file = 

[jira] [Reopened] (LUCENE-7935) Release .sha512 hash files with our artifacts

2018-04-04 Thread Hoss Man (JIRA)

 [ 
https://issues.apache.org/jira/browse/LUCENE-7935?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Hoss Man reopened LUCENE-7935:
--

Looks like the smoketester isn't happy with something about the new sha512 files?

Not really clear what the problem is, but it seemed worth re-opening here to 
raise attention to it...

https://builds.apache.org/job/Lucene-Solr-SmokeRelease-master/995/consoleText
{noformat}
   [smoker] verify that each binary artifact has a deployed POM...
   [smoker] verify that there is an artifact for each POM template...
   [smoker] verify Maven artifacts' sha1/sha512 digests...
   [smoker] Traceback (most recent call last):
   [smoker]   File 
"/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/dev-tools/scripts/smokeTestRelease.py",
 line 1524, in 
   [smoker] main()
   [smoker]   File 
"/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/dev-tools/scripts/smokeTestRelease.py",
 line 1465, in main
   [smoker] smokeTest(c.java, c.url, c.revision, c.version, c.tmp_dir, 
c.is_signed, ' '.join(c.test_args))
   [smoker]   File 
"/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/dev-tools/scripts/smokeTestRelease.py",
 line 1518, in smokeTest
   [smoker] checkMaven(solrSrcUnpackPath, baseURL, tmpDir, gitRevision, 
version, isSigned)
   [smoker]   File 
"/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/dev-tools/scripts/smokeTestRelease.py",
 line 994, in checkMaven
   [smoker] verifyMavenDigests(artifacts)
   [smoker]   File 
"/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/dev-tools/scripts/smokeTestRelease.py",
 line 1081, in verifyMavenDigests
   [smoker] raise RuntimeError('missing: SHA512 digest for %s' % 
artifactFile)
   [smoker] RuntimeError: missing: SHA512 digest for 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/lucene/build/smokeTestRelease/tmp/maven/org/apache/lucene/lucene-analyzers-common/8.0.0/lucene-analyzers-common-8.0.0-javadoc.jar

BUILD FAILED
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/build.xml:462:
 exec returned: 1

{noformat}

> Release .sha512 hash files with our artifacts
> -
>
> Key: LUCENE-7935
> URL: https://issues.apache.org/jira/browse/LUCENE-7935
> Project: Lucene - Core
>  Issue Type: Improvement
>  Components: general/build
>Reporter: Jan Høydahl
>Assignee: Jan Høydahl
>Priority: Major
> Fix For: 7.4, master (8.0)
>
> Attachments: LUCENE-7935.patch, LUCENE-7935.patch
>
>
> Currently we are only required to release {{.md5}} hashes with our artifacts, 
> and we also include {{.sha1}} files. It is expected that {{.sha512}} will be 
> required in the future (see 
> https://www.apache.org/dev/release-signing.html#sha1), so why not start 
> generating them right away?



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



TestSSLRandomization is failing everytime

2018-04-04 Thread Joel Bernstein
TestSSLRandomization is failing 100% of the time. Anybody make changes
recently to this code?


Re: [JENKINS-MAVEN] Lucene-Solr-Maven-7.x #172: POMs out of sync

2018-04-04 Thread Chris Hostetter

IIUC: This looks like a maven/yetus dependency problem?



  [mvn] [INFO] Error for project: Apache Solr Core (during install)
  [mvn] [INFO] 

  [mvn] [INFO] Compilation failure
  [mvn] cannot access org.apache.yetus.audience.InterfaceAudience
  [mvn]   class file for org.apache.yetus.audience.InterfaceAudience not 
found
  [mvn] 
  [mvn] [INFO] 

  [mvn] [INFO] For more information, run Maven with the -e switch
  [mvn] [INFO] 

  [mvn] [INFO] BUILD ERRORS
  [mvn] [INFO] 

  [mvn] [INFO] Total time: 53 seconds
  [mvn] [INFO] Finished at: Wed Apr 04 00:54:42 UTC 2018
  [mvn] [INFO] Final Memory: 297M/1626M
  [mvn] [INFO] 


BUILD FAILED
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-Maven-7.x/build.xml:679: The 
following error occurred while executing this line:
: Java returned: 1



: Date: Wed, 4 Apr 2018 00:55:11 + (UTC)
: From: Apache Jenkins Server 
: Reply-To: dev@lucene.apache.org
: To: dev@lucene.apache.org
: Subject: [JENKINS-MAVEN] Lucene-Solr-Maven-7.x #172: POMs out of sync
: 
: Build: https://builds.apache.org/job/Lucene-Solr-Maven-7.x/172/
: 
: No tests ran.
: 
: Build Log:
: [...truncated 31626 lines...]
:   [mvn] [INFO] 
-
:   [mvn] [INFO] 
-
:   [mvn] [ERROR] COMPILATION ERROR : 
:   [mvn] [INFO] 
-
: 
: [...truncated 204 lines...]
: BUILD FAILED
: /x1/jenkins/jenkins-slave/workspace/Lucene-Solr-Maven-7.x/build.xml:679: The 
following error occurred while executing this line:
: : Java returned: 1
: 
: Total time: 14 minutes 48 seconds
: Build step 'Invoke Ant' marked build as failure
: Email was triggered for: Failure - Any
: Sending email for trigger: Failure - Any
: 

-Hoss
http://www.lucidworks.com/

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[JENKINS] Lucene-Solr-master-Solaris (64bit/jdk1.8.0) - Build # 1782 - Unstable!

2018-04-04 Thread Policeman Jenkins Server
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-master-Solaris/1782/
Java: 64bit/jdk1.8.0 -XX:-UseCompressedOops -XX:+UseSerialGC

4 tests failed.
FAILED:  
junit.framework.TestSuite.org.apache.solr.response.transform.TestSubQueryTransformerDistrib

Error Message:
java.util.concurrent.TimeoutException: Could not connect to ZooKeeper 
127.0.0.1:55276 within 3 ms

Stack Trace:
org.apache.solr.common.SolrException: java.util.concurrent.TimeoutException: 
Could not connect to ZooKeeper 127.0.0.1:55276 within 3 ms
at __randomizedtesting.SeedInfo.seed([6EF1D29694E21B6B]:0)
at 
org.apache.solr.common.cloud.SolrZkClient.(SolrZkClient.java:183)
at 
org.apache.solr.common.cloud.SolrZkClient.(SolrZkClient.java:120)
at 
org.apache.solr.common.cloud.SolrZkClient.(SolrZkClient.java:115)
at 
org.apache.solr.common.cloud.SolrZkClient.(SolrZkClient.java:102)
at 
org.apache.solr.cloud.MiniSolrCloudCluster.waitForAllNodes(MiniSolrCloudCluster.java:268)
at 
org.apache.solr.cloud.MiniSolrCloudCluster.(MiniSolrCloudCluster.java:262)
at 
org.apache.solr.cloud.SolrCloudTestCase$Builder.configure(SolrCloudTestCase.java:190)
at 
org.apache.solr.response.transform.TestSubQueryTransformerDistrib.setupCluster(TestSubQueryTransformerDistrib.java:63)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1737)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:874)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:890)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.util.concurrent.TimeoutException: Could not connect to 
ZooKeeper 127.0.0.1:55276 within 3 ms
at 
org.apache.solr.common.cloud.ConnectionManager.waitForConnected(ConnectionManager.java:232)
at 
org.apache.solr.common.cloud.SolrZkClient.(SolrZkClient.java:175)
... 31 more


FAILED:  
junit.framework.TestSuite.org.apache.solr.response.transform.TestSubQueryTransformerDistrib

Error Message:
ObjectTracker found 2 object(s) that were not released!!! [InternalHttpClient, 
InternalHttpClient] 
org.apache.solr.common.util.ObjectReleaseTracker$ObjectTrackerException: 
org.apache.http.impl.client.InternalHttpClient  at 
org.apache.solr.common.util.ObjectReleaseTracker.track(ObjectReleaseTracker.java:42)
  at 
org.apache.solr.client.solrj.impl.HttpClientUtil.createClient(HttpClientUtil.java:289)
  at 
org.apache.solr.update.UpdateShardHandler.(UpdateShardHandler.java:94)  
at org.apache.solr.core.CoreContainer.load(CoreContainer.java:514)  at 
org.apache.solr.servlet.SolrDispatchFilter.createCoreContainer(SolrDispatchFilter.java:268)
  at 
org.apache.solr.servlet.SolrDispatchFilter.init(SolrDispatchFilter.java:188)  
at 

[jira] [Commented] (SOLR-7896) Add a login page for Solr Administrative Interface

2018-04-04 Thread JIRA

[ 
https://issues.apache.org/jira/browse/SOLR-7896?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16425721#comment-16425721
 ] 

Jan Høydahl commented on SOLR-7896:
---

{quote}If you enable authentication (and require it for everything), running 
the admin UI actually does prompt for authentication. But it's not the UI 
*itself* that needs it – when it asks for username/password, it is actually 
requests to Solr's API (being made by your browser – not the Solr server) that 
are being authenticated.
{quote}
Your statement may be true for Basic Authentication, since most browsers have 
out-of-the-box support for that scheme. But for an auth plugin X that may not 
even use username/password at all but some other scheme, the browser will simply 
display the 401 error message or some exception. And this will only happen once 
you click something in the UI that triggers a request to Solr, which is not a 
very good user experience. Since Solr allows e.g. wide-open search while admin 
or write requests require authentication, the UI should probably display the 
login box on demand whenever it gets a 401 from the server.

The HTTP 401 response when a user tries to access a protected path will also 
include a {{WWW-Authenticate}} header, which tells the client (Admin UI) what 
type of auth plugin is used. If we later add support for more than one auth 
scheme at the same time, Solr can output a list of supported ones:
{code:java}
WWW-Authenticate: Basic realm="solr"
WWW-Authenticate: Bearer realm="solr"
WWW-Authenticate: OAuth realm="solr"
{code}
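
On the server side, advertising several schemes is just a matter of adding one 
{{WWW-Authenticate}} header per scheme before the 401 is sent back; a hedged 
sketch using only the plain Servlet API (not the actual Solr auth plugin code) 
could look like:

{code:java}
// Hedged sketch using the plain Servlet API; class and realm names are placeholders.
import java.io.IOException;
import javax.servlet.http.HttpServletResponse;

public class AuthChallengeUtil {
  static void sendChallenge(HttpServletResponse response) throws IOException {
    // One WWW-Authenticate header per supported scheme.
    response.addHeader("WWW-Authenticate", "Basic realm=\"solr\"");
    response.addHeader("WWW-Authenticate", "Bearer realm=\"solr\"");
    response.sendError(HttpServletResponse.SC_UNAUTHORIZED, "Authentication required");
  }
}
{code}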
I think the first phase of Admin UI login/auth support will be
 # Add a widget to the top/bottom of Admin UI screen that shows auth state, 
e.g.: {{User: George}}
 # Add an interceptor for AJAX responses from Solr, identifying the 
{{WWW-Authenticate}} header. If no header, just continue as before
 # Add parsing of the WWW-Authenticate header: if header(s) exist, check whether 
the Admin UI supports one of the auth schemes; if not, display an error message 
that the Admin UI is not compatible with Auth XX, otherwise trigger the login 
screen for the given scheme
 # Implement login screen for Basic Auth (simple login form) along with an 
AngularJS request interceptor that adds the {{Authorization: Basic ...}} header 
on all requests
 # Implement caching of user credentials in the Webapp
 # Try to make it possible for Auth plugins to provide AdminUI login screens 
and request interceptor implementations, as some sort of HTML5 plugins living 
inside the jar file??

> Add a login page for Solr Administrative Interface
> --
>
> Key: SOLR-7896
> URL: https://issues.apache.org/jira/browse/SOLR-7896
> Project: Solr
>  Issue Type: New Feature
>  Components: Admin UI, security
>Affects Versions: 5.2.1
>Reporter: Aaron Greenspan
>Priority: Major
>  Labels: authentication, login, password
>
> Out of the box, the Solr Administrative interface should require a password 
> that the user is required to set.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-11929) UpdateLog metrics are not initialized on core reload

2018-04-04 Thread ASF subversion and git services (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-11929?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16425702#comment-16425702
 ] 

ASF subversion and git services commented on SOLR-11929:


Commit 8e276b90f520df771d8a1e60408fe112c40ceea4 in lucene-solr's branch 
refs/heads/master from [~steve_rowe]
[ https://git-wip-us.apache.org/repos/asf?p=lucene-solr.git;h=8e276b9 ]

SOLR-11929: UpdateLog metrics are not initialized on core reload


> UpdateLog metrics are not initialized on core reload
> 
>
> Key: SOLR-11929
> URL: https://issues.apache.org/jira/browse/SOLR-11929
> Project: Solr
>  Issue Type: Bug
>  Security Level: Public(Default Security Level. Issues are Public) 
>  Components: metrics
>Reporter: Steve Rowe
>Priority: Major
> Fix For: 7.4
>
> Attachments: SOLR-11929.patch, SOLR-11929.patch
>
>
> My Jenkins found a branch_7x seed for {{TestRecovery.testBuffering()}} and 
> {{TestRecovery.testCorruptLog()}} that reproduces for me 5/5 times (when I 
> exclude {{-Dtests.method=...}} from the cmdline):
> {noformat}
> Checking out Revision 1ef988a26378137b1e1f022985dacee1f557f4fc 
> (refs/remotes/origin/branch_7x)
> [...]
>[junit4]   2> NOTE: reproduce with: ant test  -Dtestcase=TestRecovery 
> -Dtests.method=testBuffering -Dtests.seed=FC96FD26F8A8CC6F -Dtests.slow=true 
> -Dtests.locale=de-GR -Dtests.timezone=Europe/London -Dtests.asserts=true 
> -Dtests.file.encoding=UTF-8
>[junit4] FAILURE 0.02s J3  | TestRecovery.testBuffering <<<
>[junit4]> Throwable #1: java.lang.AssertionError: expected:<1> but 
> was:<3>
>[junit4]>  at 
> __randomizedtesting.SeedInfo.seed([FC96FD26F8A8CC6F:E178530D59F16D44]:0)
>[junit4]>  at 
> org.apache.solr.search.TestRecovery.testBuffering(TestRecovery.java:494)
>[junit4]>  at java.lang.Thread.run(Thread.java:748)
> [...]
>[junit4]   2> NOTE: reproduce with: ant test  -Dtestcase=TestRecovery 
> -Dtests.method=testCorruptLog -Dtests.seed=FC96FD26F8A8CC6F -Dtests.slow=true 
> -Dtests.locale=de-GR -Dtests.timezone=Europe/London -Dtests.asserts=true 
> -Dtests.file.encoding=UTF-8
>[junit4] ERROR   0.35s J3  | TestRecovery.testCorruptLog <<<
>[junit4]> Throwable #1: java.lang.RuntimeException: mismatch: '3'!='0' 
> @ response/numFound
>[junit4]>  at 
> __randomizedtesting.SeedInfo.seed([FC96FD26F8A8CC6F:E4B49F502909DB3]:0)
>[junit4]>  at 
> org.apache.solr.SolrTestCaseJ4.assertJQ(SolrTestCaseJ4.java:990)
>[junit4]>  at 
> org.apache.solr.SolrTestCaseJ4.assertJQ(SolrTestCaseJ4.java:937)
>[junit4]>  at 
> org.apache.solr.search.TestRecovery.testCorruptLog(TestRecovery.java:1367)
>[junit4]>  at java.lang.Thread.run(Thread.java:748)
> [...]
>[junit4]   2> NOTE: test params are: codec=Asserting(Lucene70): 
> {_root_=PostingsFormat(name=LuceneVarGapFixedInterval), 
> id=PostingsFormat(name=Direct)}, 
> docValues:{_version_=DocValuesFormat(name=Lucene70), 
> val_i_dvo=DocValuesFormat(name=Memory), val_i=DocValuesFormat(name=Memory)}, 
> maxPointsInLeafNode=1937, maxMBSortInHeap=7.529691259992591, 
> sim=RandomSimilarity(queryNorm=false): {}, locale=de-GR, 
> timezone=Europe/London
>[junit4]   2> NOTE: Linux 4.1.0-custom2-amd64 amd64/Oracle Corporation 
> 1.8.0_151 (64-bit)/cpus=16,threads=1,free=217064096,total=530579456
> {noformat}



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Resolved] (SOLR-11929) UpdateLog metrics are not initialized on core reload

2018-04-04 Thread Steve Rowe (JIRA)

 [ 
https://issues.apache.org/jira/browse/SOLR-11929?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Steve Rowe resolved SOLR-11929.
---
   Resolution: Fixed
 Assignee: Steve Rowe
Fix Version/s: 7.4

> UpdateLog metrics are not initialized on core reload
> 
>
> Key: SOLR-11929
> URL: https://issues.apache.org/jira/browse/SOLR-11929
> Project: Solr
>  Issue Type: Bug
>  Security Level: Public(Default Security Level. Issues are Public) 
>  Components: metrics
>Reporter: Steve Rowe
>Assignee: Steve Rowe
>Priority: Major
> Fix For: 7.4
>
> Attachments: SOLR-11929.patch, SOLR-11929.patch
>
>
> My Jenkins found a branch_7x seed for {{TestRecovery.testBuffering()}} and 
> {{TestRecovery.testCorruptLog()}} that reproduces for me 5/5 times (when I 
> exclude {{-Dtests.method=...}} from the cmdline):
> {noformat}
> Checking out Revision 1ef988a26378137b1e1f022985dacee1f557f4fc 
> (refs/remotes/origin/branch_7x)
> [...]
>[junit4]   2> NOTE: reproduce with: ant test  -Dtestcase=TestRecovery 
> -Dtests.method=testBuffering -Dtests.seed=FC96FD26F8A8CC6F -Dtests.slow=true 
> -Dtests.locale=de-GR -Dtests.timezone=Europe/London -Dtests.asserts=true 
> -Dtests.file.encoding=UTF-8
>[junit4] FAILURE 0.02s J3  | TestRecovery.testBuffering <<<
>[junit4]> Throwable #1: java.lang.AssertionError: expected:<1> but 
> was:<3>
>[junit4]>  at 
> __randomizedtesting.SeedInfo.seed([FC96FD26F8A8CC6F:E178530D59F16D44]:0)
>[junit4]>  at 
> org.apache.solr.search.TestRecovery.testBuffering(TestRecovery.java:494)
>[junit4]>  at java.lang.Thread.run(Thread.java:748)
> [...]
>[junit4]   2> NOTE: reproduce with: ant test  -Dtestcase=TestRecovery 
> -Dtests.method=testCorruptLog -Dtests.seed=FC96FD26F8A8CC6F -Dtests.slow=true 
> -Dtests.locale=de-GR -Dtests.timezone=Europe/London -Dtests.asserts=true 
> -Dtests.file.encoding=UTF-8
>[junit4] ERROR   0.35s J3  | TestRecovery.testCorruptLog <<<
>[junit4]> Throwable #1: java.lang.RuntimeException: mismatch: '3'!='0' 
> @ response/numFound
>[junit4]>  at 
> __randomizedtesting.SeedInfo.seed([FC96FD26F8A8CC6F:E4B49F502909DB3]:0)
>[junit4]>  at 
> org.apache.solr.SolrTestCaseJ4.assertJQ(SolrTestCaseJ4.java:990)
>[junit4]>  at 
> org.apache.solr.SolrTestCaseJ4.assertJQ(SolrTestCaseJ4.java:937)
>[junit4]>  at 
> org.apache.solr.search.TestRecovery.testCorruptLog(TestRecovery.java:1367)
>[junit4]>  at java.lang.Thread.run(Thread.java:748)
> [...]
>[junit4]   2> NOTE: test params are: codec=Asserting(Lucene70): 
> {_root_=PostingsFormat(name=LuceneVarGapFixedInterval), 
> id=PostingsFormat(name=Direct)}, 
> docValues:{_version_=DocValuesFormat(name=Lucene70), 
> val_i_dvo=DocValuesFormat(name=Memory), val_i=DocValuesFormat(name=Memory)}, 
> maxPointsInLeafNode=1937, maxMBSortInHeap=7.529691259992591, 
> sim=RandomSimilarity(queryNorm=false): {}, locale=de-GR, 
> timezone=Europe/London
>[junit4]   2> NOTE: Linux 4.1.0-custom2-amd64 amd64/Oracle Corporation 
> 1.8.0_151 (64-bit)/cpus=16,threads=1,free=217064096,total=530579456
> {noformat}



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-11929) UpdateLog metrics are not initialized on core reload

2018-04-04 Thread ASF subversion and git services (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-11929?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16425701#comment-16425701
 ] 

ASF subversion and git services commented on SOLR-11929:


Commit 5a5802f5cb40811cd25232f3fe5127ecc80bdfc7 in lucene-solr's branch 
refs/heads/branch_7x from [~steve_rowe]
[ https://git-wip-us.apache.org/repos/asf?p=lucene-solr.git;h=5a5802f ]

SOLR-11929: UpdateLog metrics are not initialized on core reload


> UpdateLog metrics are not initialized on core reload
> 
>
> Key: SOLR-11929
> URL: https://issues.apache.org/jira/browse/SOLR-11929
> Project: Solr
>  Issue Type: Bug
>  Security Level: Public(Default Security Level. Issues are Public) 
>  Components: metrics
>Reporter: Steve Rowe
>Priority: Major
> Fix For: 7.4
>
> Attachments: SOLR-11929.patch, SOLR-11929.patch
>
>
> My Jenkins found a branch_7x seed for {{TestRecovery.testBuffering()}} and 
> {{TestRecovery.testCorruptLog()}} that reproduces for me 5/5 times (when I 
> exclude {{-Dtests.method=...}} from the cmdline):
> {noformat}
> Checking out Revision 1ef988a26378137b1e1f022985dacee1f557f4fc 
> (refs/remotes/origin/branch_7x)
> [...]
>[junit4]   2> NOTE: reproduce with: ant test  -Dtestcase=TestRecovery 
> -Dtests.method=testBuffering -Dtests.seed=FC96FD26F8A8CC6F -Dtests.slow=true 
> -Dtests.locale=de-GR -Dtests.timezone=Europe/London -Dtests.asserts=true 
> -Dtests.file.encoding=UTF-8
>[junit4] FAILURE 0.02s J3  | TestRecovery.testBuffering <<<
>[junit4]> Throwable #1: java.lang.AssertionError: expected:<1> but 
> was:<3>
>[junit4]>  at 
> __randomizedtesting.SeedInfo.seed([FC96FD26F8A8CC6F:E178530D59F16D44]:0)
>[junit4]>  at 
> org.apache.solr.search.TestRecovery.testBuffering(TestRecovery.java:494)
>[junit4]>  at java.lang.Thread.run(Thread.java:748)
> [...]
>[junit4]   2> NOTE: reproduce with: ant test  -Dtestcase=TestRecovery 
> -Dtests.method=testCorruptLog -Dtests.seed=FC96FD26F8A8CC6F -Dtests.slow=true 
> -Dtests.locale=de-GR -Dtests.timezone=Europe/London -Dtests.asserts=true 
> -Dtests.file.encoding=UTF-8
>[junit4] ERROR   0.35s J3  | TestRecovery.testCorruptLog <<<
>[junit4]> Throwable #1: java.lang.RuntimeException: mismatch: '3'!='0' 
> @ response/numFound
>[junit4]>  at 
> __randomizedtesting.SeedInfo.seed([FC96FD26F8A8CC6F:E4B49F502909DB3]:0)
>[junit4]>  at 
> org.apache.solr.SolrTestCaseJ4.assertJQ(SolrTestCaseJ4.java:990)
>[junit4]>  at 
> org.apache.solr.SolrTestCaseJ4.assertJQ(SolrTestCaseJ4.java:937)
>[junit4]>  at 
> org.apache.solr.search.TestRecovery.testCorruptLog(TestRecovery.java:1367)
>[junit4]>  at java.lang.Thread.run(Thread.java:748)
> [...]
>[junit4]   2> NOTE: test params are: codec=Asserting(Lucene70): 
> {_root_=PostingsFormat(name=LuceneVarGapFixedInterval), 
> id=PostingsFormat(name=Direct)}, 
> docValues:{_version_=DocValuesFormat(name=Lucene70), 
> val_i_dvo=DocValuesFormat(name=Memory), val_i=DocValuesFormat(name=Memory)}, 
> maxPointsInLeafNode=1937, maxMBSortInHeap=7.529691259992591, 
> sim=RandomSimilarity(queryNorm=false): {}, locale=de-GR, 
> timezone=Europe/London
>[junit4]   2> NOTE: Linux 4.1.0-custom2-amd64 amd64/Oracle Corporation 
> 1.8.0_151 (64-bit)/cpus=16,threads=1,free=217064096,total=530579456
> {noformat}



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (LUCENE-8237) Add a SoftDeletesDirectoryReaderWrapper

2018-04-04 Thread Michael McCandless (JIRA)

[ 
https://issues.apache.org/jira/browse/LUCENE-8237?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16425694#comment-16425694
 ] 

Michael McCandless commented on LUCENE-8237:


+1

> Add a SoftDeletesDirectoryReaderWrapper 
> 
>
> Key: LUCENE-8237
> URL: https://issues.apache.org/jira/browse/LUCENE-8237
> Project: Lucene - Core
>  Issue Type: Improvement
>Affects Versions: 7.4, master (8.0)
>Reporter: Simon Willnauer
>Priority: Major
> Fix For: 7.4, master (8.0)
>
> Attachments: LUCENE-8237.patch
>
>
> This adds support for soft deletes if the reader is opened from a directory.
> Today we only support soft deletes for NRT readers; this change allows wrapping
> an existing DirectoryReader with a SoftDeletesDirectoryReaderWrapper to also
> filter out soft deletes in the case of a non-NRT reader.
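
For reference, with the patch applied, usage would presumably look like the 
following (hedged sketch; the wrapper's constructor arguments are assumed from 
the attached patch, and the index path and soft-deletes field name are 
placeholders):

{code:java}
// Hedged sketch: assumes a (DirectoryReader, softDeletesField) constructor as in the patch.
import java.nio.file.Paths;

import org.apache.lucene.index.DirectoryReader;
import org.apache.lucene.index.SoftDeletesDirectoryReaderWrapper;
import org.apache.lucene.store.Directory;
import org.apache.lucene.store.FSDirectory;

public class SoftDeletesReaderExample {
  public static void main(String[] args) throws Exception {
    try (Directory dir = FSDirectory.open(Paths.get("/path/to/index"));
         DirectoryReader raw = DirectoryReader.open(dir);
         // Documents marked via the "__soft_deletes" field are hidden from this reader.
         DirectoryReader reader = new SoftDeletesDirectoryReaderWrapper(raw, "__soft_deletes")) {
      System.out.println("live docs: " + reader.numDocs());
    }
  }
}
{code}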



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-11929) UpdateLog metrics are not initialized on core reload

2018-04-04 Thread Steve Rowe (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-11929?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16425688#comment-16425688
 ] 

Steve Rowe commented on SOLR-11929:
---

Thanks [~ab].

Attached a slightly reworked patch, to de-emphasize the "nothing to do here on 
core reload" message in comments/log msg, and also including a CHANGES entry.

Committing shortly.

> UpdateLog metrics are not initialized on core reload
> 
>
> Key: SOLR-11929
> URL: https://issues.apache.org/jira/browse/SOLR-11929
> Project: Solr
>  Issue Type: Bug
>  Security Level: Public(Default Security Level. Issues are Public) 
>  Components: metrics
>Reporter: Steve Rowe
>Priority: Major
> Attachments: SOLR-11929.patch, SOLR-11929.patch
>
>
> My Jenkins found a branch_7x seed for {{TestRecovery.testBuffering()}} and 
> {{TestRecovery.testCorruptLog()}} that reproduces for me 5/5 times (when I 
> exclude {{-Dtests.method=...}} from the cmdline):
> {noformat}
> Checking out Revision 1ef988a26378137b1e1f022985dacee1f557f4fc 
> (refs/remotes/origin/branch_7x)
> [...]
>[junit4]   2> NOTE: reproduce with: ant test  -Dtestcase=TestRecovery 
> -Dtests.method=testBuffering -Dtests.seed=FC96FD26F8A8CC6F -Dtests.slow=true 
> -Dtests.locale=de-GR -Dtests.timezone=Europe/London -Dtests.asserts=true 
> -Dtests.file.encoding=UTF-8
>[junit4] FAILURE 0.02s J3  | TestRecovery.testBuffering <<<
>[junit4]> Throwable #1: java.lang.AssertionError: expected:<1> but 
> was:<3>
>[junit4]>  at 
> __randomizedtesting.SeedInfo.seed([FC96FD26F8A8CC6F:E178530D59F16D44]:0)
>[junit4]>  at 
> org.apache.solr.search.TestRecovery.testBuffering(TestRecovery.java:494)
>[junit4]>  at java.lang.Thread.run(Thread.java:748)
> [...]
>[junit4]   2> NOTE: reproduce with: ant test  -Dtestcase=TestRecovery 
> -Dtests.method=testCorruptLog -Dtests.seed=FC96FD26F8A8CC6F -Dtests.slow=true 
> -Dtests.locale=de-GR -Dtests.timezone=Europe/London -Dtests.asserts=true 
> -Dtests.file.encoding=UTF-8
>[junit4] ERROR   0.35s J3  | TestRecovery.testCorruptLog <<<
>[junit4]> Throwable #1: java.lang.RuntimeException: mismatch: '3'!='0' 
> @ response/numFound
>[junit4]>  at 
> __randomizedtesting.SeedInfo.seed([FC96FD26F8A8CC6F:E4B49F502909DB3]:0)
>[junit4]>  at 
> org.apache.solr.SolrTestCaseJ4.assertJQ(SolrTestCaseJ4.java:990)
>[junit4]>  at 
> org.apache.solr.SolrTestCaseJ4.assertJQ(SolrTestCaseJ4.java:937)
>[junit4]>  at 
> org.apache.solr.search.TestRecovery.testCorruptLog(TestRecovery.java:1367)
>[junit4]>  at java.lang.Thread.run(Thread.java:748)
> [...]
>[junit4]   2> NOTE: test params are: codec=Asserting(Lucene70): 
> {_root_=PostingsFormat(name=LuceneVarGapFixedInterval), 
> id=PostingsFormat(name=Direct)}, 
> docValues:{_version_=DocValuesFormat(name=Lucene70), 
> val_i_dvo=DocValuesFormat(name=Memory), val_i=DocValuesFormat(name=Memory)}, 
> maxPointsInLeafNode=1937, maxMBSortInHeap=7.529691259992591, 
> sim=RandomSimilarity(queryNorm=false): {}, locale=de-GR, 
> timezone=Europe/London
>[junit4]   2> NOTE: Linux 4.1.0-custom2-amd64 amd64/Oracle Corporation 
> 1.8.0_151 (64-bit)/cpus=16,threads=1,free=217064096,total=530579456
> {noformat}



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Updated] (SOLR-11929) UpdateLog metrics are not initialized on core reload

2018-04-04 Thread Steve Rowe (JIRA)

 [ 
https://issues.apache.org/jira/browse/SOLR-11929?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Steve Rowe updated SOLR-11929:
--
Attachment: SOLR-11929.patch

> UpdateLog metrics are not initialized on core reload
> 
>
> Key: SOLR-11929
> URL: https://issues.apache.org/jira/browse/SOLR-11929
> Project: Solr
>  Issue Type: Bug
>  Security Level: Public(Default Security Level. Issues are Public) 
>  Components: metrics
>Reporter: Steve Rowe
>Priority: Major
> Attachments: SOLR-11929.patch, SOLR-11929.patch
>
>
> My Jenkins found a branch_7x seed for {{TestRecovery.testBuffering()}} and 
> {{TestRecovery.testCorruptLog()}} that reproduces for me 5/5 times (when I 
> exclude {{-Dtests.method=...}} from the cmdline):
> {noformat}
> Checking out Revision 1ef988a26378137b1e1f022985dacee1f557f4fc 
> (refs/remotes/origin/branch_7x)
> [...]
>[junit4]   2> NOTE: reproduce with: ant test  -Dtestcase=TestRecovery 
> -Dtests.method=testBuffering -Dtests.seed=FC96FD26F8A8CC6F -Dtests.slow=true 
> -Dtests.locale=de-GR -Dtests.timezone=Europe/London -Dtests.asserts=true 
> -Dtests.file.encoding=UTF-8
>[junit4] FAILURE 0.02s J3  | TestRecovery.testBuffering <<<
>[junit4]> Throwable #1: java.lang.AssertionError: expected:<1> but 
> was:<3>
>[junit4]>  at 
> __randomizedtesting.SeedInfo.seed([FC96FD26F8A8CC6F:E178530D59F16D44]:0)
>[junit4]>  at 
> org.apache.solr.search.TestRecovery.testBuffering(TestRecovery.java:494)
>[junit4]>  at java.lang.Thread.run(Thread.java:748)
> [...]
>[junit4]   2> NOTE: reproduce with: ant test  -Dtestcase=TestRecovery 
> -Dtests.method=testCorruptLog -Dtests.seed=FC96FD26F8A8CC6F -Dtests.slow=true 
> -Dtests.locale=de-GR -Dtests.timezone=Europe/London -Dtests.asserts=true 
> -Dtests.file.encoding=UTF-8
>[junit4] ERROR   0.35s J3  | TestRecovery.testCorruptLog <<<
>[junit4]> Throwable #1: java.lang.RuntimeException: mismatch: '3'!='0' 
> @ response/numFound
>[junit4]>  at 
> __randomizedtesting.SeedInfo.seed([FC96FD26F8A8CC6F:E4B49F502909DB3]:0)
>[junit4]>  at 
> org.apache.solr.SolrTestCaseJ4.assertJQ(SolrTestCaseJ4.java:990)
>[junit4]>  at 
> org.apache.solr.SolrTestCaseJ4.assertJQ(SolrTestCaseJ4.java:937)
>[junit4]>  at 
> org.apache.solr.search.TestRecovery.testCorruptLog(TestRecovery.java:1367)
>[junit4]>  at java.lang.Thread.run(Thread.java:748)
> [...]
>[junit4]   2> NOTE: test params are: codec=Asserting(Lucene70): 
> {_root_=PostingsFormat(name=LuceneVarGapFixedInterval), 
> id=PostingsFormat(name=Direct)}, 
> docValues:{_version_=DocValuesFormat(name=Lucene70), 
> val_i_dvo=DocValuesFormat(name=Memory), val_i=DocValuesFormat(name=Memory)}, 
> maxPointsInLeafNode=1937, maxMBSortInHeap=7.529691259992591, 
> sim=RandomSimilarity(queryNorm=false): {}, locale=de-GR, 
> timezone=Europe/London
>[junit4]   2> NOTE: Linux 4.1.0-custom2-amd64 amd64/Oracle Corporation 
> 1.8.0_151 (64-bit)/cpus=16,threads=1,free=217064096,total=530579456
> {noformat}



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-12095) AutoScalingHandler should validate triggers before updating zookeeper

2018-04-04 Thread ASF subversion and git services (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-12095?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16425677#comment-16425677
 ] 

ASF subversion and git services commented on SOLR-12095:


Commit 12309659f603f2bbbe353a459eb916adde0c8a02 in lucene-solr's branch 
refs/heads/branch_7x from [~ab]
[ https://git-wip-us.apache.org/repos/asf?p=lucene-solr.git;h=1230965 ]

SOLR-12095: Missed a few calls to init().


> AutoScalingHandler should validate triggers before updating zookeeper
> -
>
> Key: SOLR-12095
> URL: https://issues.apache.org/jira/browse/SOLR-12095
> Project: Solr
>  Issue Type: Sub-task
>  Security Level: Public(Default Security Level. Issues are Public) 
>  Components: AutoScaling, SolrCloud
>Reporter: Shalin Shekhar Mangar
>Assignee: Andrzej Bialecki 
>Priority: Major
> Fix For: 7.4, master (8.0)
>
>
> We validate policy and preferences before updating the configuration in 
> Zookeeper but we don't do that today for triggers. So users can put wrong or 
> unknown parameters and there won't be any complaints from the API, but at 
> runtime exceptions will be thrown/logged.
> We should change the trigger API to have a validation step. The catch here is 
> that it may require us to instantiate the trigger class.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-12095) AutoScalingHandler should validate triggers before updating zookeeper

2018-04-04 Thread ASF subversion and git services (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-12095?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16425675#comment-16425675
 ] 

ASF subversion and git services commented on SOLR-12095:


Commit 2bbd19369137d2b31f44c94ce2de61f9047856f4 in lucene-solr's branch 
refs/heads/master from [~ab]
[ https://git-wip-us.apache.org/repos/asf?p=lucene-solr.git;h=2bbd193 ]

SOLR-12095: Missed a few calls to init().


> AutoScalingHandler should validate triggers before updating zookeeper
> -
>
> Key: SOLR-12095
> URL: https://issues.apache.org/jira/browse/SOLR-12095
> Project: Solr
>  Issue Type: Sub-task
>  Security Level: Public(Default Security Level. Issues are Public) 
>  Components: AutoScaling, SolrCloud
>Reporter: Shalin Shekhar Mangar
>Assignee: Andrzej Bialecki 
>Priority: Major
> Fix For: 7.4, master (8.0)
>
>
> We validate policy and preferences before updating the configuration in 
> Zookeeper but we don't do that today for triggers. So users can put wrong or 
> unknown parameters and there won't be any complaints from the API, but at 
> runtime exceptions will be thrown/logged.
> We should change the trigger API to have a validation step. The catch here is 
> that it may require us to instantiate the trigger class.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Updated] (SOLR-11929) UpdateLog metrics are not initialized on core reload

2018-04-04 Thread Steve Rowe (JIRA)

 [ 
https://issues.apache.org/jira/browse/SOLR-11929?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Steve Rowe updated SOLR-11929:
--
Component/s: metrics

> UpdateLog metrics are not initialized on core reload
> 
>
> Key: SOLR-11929
> URL: https://issues.apache.org/jira/browse/SOLR-11929
> Project: Solr
>  Issue Type: Bug
>  Security Level: Public(Default Security Level. Issues are Public) 
>  Components: metrics
>Reporter: Steve Rowe
>Priority: Major
> Attachments: SOLR-11929.patch
>
>
> My Jenkins found a branch_7x seed for {{TestRecovery.testBuffering()}} and 
> {{TestRecovery.testCorruptLog()}} that reproduces for me 5/5 times (when I 
> exclude {{-Dtests.method=...}} from the cmdline):
> {noformat}
> Checking out Revision 1ef988a26378137b1e1f022985dacee1f557f4fc 
> (refs/remotes/origin/branch_7x)
> [...]
>[junit4]   2> NOTE: reproduce with: ant test  -Dtestcase=TestRecovery 
> -Dtests.method=testBuffering -Dtests.seed=FC96FD26F8A8CC6F -Dtests.slow=true 
> -Dtests.locale=de-GR -Dtests.timezone=Europe/London -Dtests.asserts=true 
> -Dtests.file.encoding=UTF-8
>[junit4] FAILURE 0.02s J3  | TestRecovery.testBuffering <<<
>[junit4]> Throwable #1: java.lang.AssertionError: expected:<1> but 
> was:<3>
>[junit4]>  at 
> __randomizedtesting.SeedInfo.seed([FC96FD26F8A8CC6F:E178530D59F16D44]:0)
>[junit4]>  at 
> org.apache.solr.search.TestRecovery.testBuffering(TestRecovery.java:494)
>[junit4]>  at java.lang.Thread.run(Thread.java:748)
> [...]
>[junit4]   2> NOTE: reproduce with: ant test  -Dtestcase=TestRecovery 
> -Dtests.method=testCorruptLog -Dtests.seed=FC96FD26F8A8CC6F -Dtests.slow=true 
> -Dtests.locale=de-GR -Dtests.timezone=Europe/London -Dtests.asserts=true 
> -Dtests.file.encoding=UTF-8
>[junit4] ERROR   0.35s J3  | TestRecovery.testCorruptLog <<<
>[junit4]> Throwable #1: java.lang.RuntimeException: mismatch: '3'!='0' 
> @ response/numFound
>[junit4]>  at 
> __randomizedtesting.SeedInfo.seed([FC96FD26F8A8CC6F:E4B49F502909DB3]:0)
>[junit4]>  at 
> org.apache.solr.SolrTestCaseJ4.assertJQ(SolrTestCaseJ4.java:990)
>[junit4]>  at 
> org.apache.solr.SolrTestCaseJ4.assertJQ(SolrTestCaseJ4.java:937)
>[junit4]>  at 
> org.apache.solr.search.TestRecovery.testCorruptLog(TestRecovery.java:1367)
>[junit4]>  at java.lang.Thread.run(Thread.java:748)
> [...]
>[junit4]   2> NOTE: test params are: codec=Asserting(Lucene70): 
> {_root_=PostingsFormat(name=LuceneVarGapFixedInterval), 
> id=PostingsFormat(name=Direct)}, 
> docValues:{_version_=DocValuesFormat(name=Lucene70), 
> val_i_dvo=DocValuesFormat(name=Memory), val_i=DocValuesFormat(name=Memory)}, 
> maxPointsInLeafNode=1937, maxMBSortInHeap=7.529691259992591, 
> sim=RandomSimilarity(queryNorm=false): {}, locale=de-GR, 
> timezone=Europe/London
>[junit4]   2> NOTE: Linux 4.1.0-custom2-amd64 amd64/Oracle Corporation 
> 1.8.0_151 (64-bit)/cpus=16,threads=1,free=217064096,total=530579456
> {noformat}



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Updated] (SOLR-11929) UpdateLog metrics are not initialized on core reload

2018-04-04 Thread Steve Rowe (JIRA)

 [ 
https://issues.apache.org/jira/browse/SOLR-11929?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Steve Rowe updated SOLR-11929:
--
Summary: UpdateLog metrics are not initialized on core reload  (was: 
TestRecovery failures)

> UpdateLog metrics are not initialized on core reload
> 
>
> Key: SOLR-11929
> URL: https://issues.apache.org/jira/browse/SOLR-11929
> Project: Solr
>  Issue Type: Bug
>  Security Level: Public(Default Security Level. Issues are Public) 
>Reporter: Steve Rowe
>Priority: Major
> Attachments: SOLR-11929.patch
>
>
> My Jenkins found a branch_7x seed for {{TestRecovery.testBuffering()}} and 
> {{TestRecovery.testCorruptLog()}} that reproduces for me 5/5 times (when I 
> exclude {{-Dtests.method=...}} from the cmdline):
> {noformat}
> Checking out Revision 1ef988a26378137b1e1f022985dacee1f557f4fc 
> (refs/remotes/origin/branch_7x)
> [...]
>[junit4]   2> NOTE: reproduce with: ant test  -Dtestcase=TestRecovery 
> -Dtests.method=testBuffering -Dtests.seed=FC96FD26F8A8CC6F -Dtests.slow=true 
> -Dtests.locale=de-GR -Dtests.timezone=Europe/London -Dtests.asserts=true 
> -Dtests.file.encoding=UTF-8
>[junit4] FAILURE 0.02s J3  | TestRecovery.testBuffering <<<
>[junit4]> Throwable #1: java.lang.AssertionError: expected:<1> but 
> was:<3>
>[junit4]>  at 
> __randomizedtesting.SeedInfo.seed([FC96FD26F8A8CC6F:E178530D59F16D44]:0)
>[junit4]>  at 
> org.apache.solr.search.TestRecovery.testBuffering(TestRecovery.java:494)
>[junit4]>  at java.lang.Thread.run(Thread.java:748)
> [...]
>[junit4]   2> NOTE: reproduce with: ant test  -Dtestcase=TestRecovery 
> -Dtests.method=testCorruptLog -Dtests.seed=FC96FD26F8A8CC6F -Dtests.slow=true 
> -Dtests.locale=de-GR -Dtests.timezone=Europe/London -Dtests.asserts=true 
> -Dtests.file.encoding=UTF-8
>[junit4] ERROR   0.35s J3  | TestRecovery.testCorruptLog <<<
>[junit4]> Throwable #1: java.lang.RuntimeException: mismatch: '3'!='0' 
> @ response/numFound
>[junit4]>  at 
> __randomizedtesting.SeedInfo.seed([FC96FD26F8A8CC6F:E4B49F502909DB3]:0)
>[junit4]>  at 
> org.apache.solr.SolrTestCaseJ4.assertJQ(SolrTestCaseJ4.java:990)
>[junit4]>  at 
> org.apache.solr.SolrTestCaseJ4.assertJQ(SolrTestCaseJ4.java:937)
>[junit4]>  at 
> org.apache.solr.search.TestRecovery.testCorruptLog(TestRecovery.java:1367)
>[junit4]>  at java.lang.Thread.run(Thread.java:748)
> [...]
>[junit4]   2> NOTE: test params are: codec=Asserting(Lucene70): 
> {_root_=PostingsFormat(name=LuceneVarGapFixedInterval), 
> id=PostingsFormat(name=Direct)}, 
> docValues:{_version_=DocValuesFormat(name=Lucene70), 
> val_i_dvo=DocValuesFormat(name=Memory), val_i=DocValuesFormat(name=Memory)}, 
> maxPointsInLeafNode=1937, maxMBSortInHeap=7.529691259992591, 
> sim=RandomSimilarity(queryNorm=false): {}, locale=de-GR, 
> timezone=Europe/London
>[junit4]   2> NOTE: Linux 4.1.0-custom2-amd64 amd64/Oracle Corporation 
> 1.8.0_151 (64-bit)/cpus=16,threads=1,free=217064096,total=530579456
> {noformat}



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (LUCENE-2899) Add OpenNLP Analysis capabilities as a module

2018-04-04 Thread Alexey Ponomarenko (JIRA)

[ 
https://issues.apache.org/jira/browse/LUCENE-2899?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16425667#comment-16425667
 ] 

Alexey Ponomarenko commented on LUCENE-2899:


Thanks, I had seen this, but I will give it a try. The point is that I am using 
Windows and was trying to use the _solr zk_ command, which, as far as I can 
tell, is the wrong direction. 

Anyway, I will try _server/scripts/cloud-scripts/zkcli.bat_.

Thanks a lot for pointing me in the right direction. 

> Add OpenNLP Analysis capabilities as a module
> -
>
> Key: LUCENE-2899
> URL: https://issues.apache.org/jira/browse/LUCENE-2899
> Project: Lucene - Core
>  Issue Type: New Feature
>  Components: modules/analysis
>Reporter: Grant Ingersoll
>Assignee: Steve Rowe
>Priority: Minor
> Fix For: 7.3, master (8.0)
>
> Attachments: LUCENE-2899-6.1.0.patch, LUCENE-2899-RJN.patch, 
> LUCENE-2899.patch, LUCENE-2899.patch, LUCENE-2899.patch, LUCENE-2899.patch, 
> LUCENE-2899.patch, LUCENE-2899.patch, OpenNLPFilter.java, 
> OpenNLPTokenizer.java
>
>
> Now that OpenNLP is an ASF project and has a nice license, it would be nice 
> to have a submodule (under analysis) that exposed capabilities for it. Drew 
> Farris, Tom Morton and I have code that does:
> * Sentence Detection as a Tokenizer (could also be a TokenFilter, although it 
> would have to change slightly to buffer tokens)
> * NamedEntity recognition as a TokenFilter
> We are also planning a Tokenizer/TokenFilter that can put parts of speech as 
> either payloads (PartOfSpeechAttribute?) on a token or at the same position.
> I'd propose it go under:
> modules/analysis/opennlp



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



Re: VOTE: Apache Solr Reference Guide for Solr 7.3 RC1

2018-04-04 Thread Cassandra Targett
Thanks everyone, this vote has passed. I'll start the release process this
afternoon.

On Tue, Apr 3, 2018 at 7:00 PM, Anshum Gupta  wrote:

> +1
>
> On Tue, Apr 3, 2018 at 2:25 PM Tomas Fernandez Lobbe 
> wrote:
>
>> +1
>>
>>
>> On Apr 3, 2018, at 12:45 PM, Varun Thacker  wrote:
>>
>> +1
>>
>> On Tue, Apr 3, 2018 at 10:47 AM, Steve Rowe  wrote:
>>
>>> +1
>>>
>>> --
>>> Steve
>>> www.lucidworks.com
>>>
>>> > On Apr 3, 2018, at 10:06 AM, Mikhail Khludnev  wrote:
>>> >
>>> > I've looked through recent changes in PDF. It seems good.
>>> >
>>> > On Tue, Apr 3, 2018 at 4:32 PM, Cassandra Targett <
>>> casstarg...@gmail.com> wrote:
>>> > Reminder about this.
>>> >
>>> > It looks like the Lucene/Solr release vote is going to pass, so we
>>> could have both released at about the same time.
>>> >
>>> > Thanks,
>>> > Cassandra
>>> >
>>> > On Thu, Mar 29, 2018 at 10:49 AM, Cassandra Targett <
>>> casstarg...@gmail.com> wrote:
>>> > Please vote to release the Apache Solr Reference Guide for Solr 7.3.
>>> >
>>> > The artifacts can be downloaded from:
>>> > https://dist.apache.org/repos/dist/dev/lucene/solr/ref-
>>> guide/apache-solr-ref-guide-7.3-RC1/
>>> >
>>> > $ cat apache-solr-ref-guide-7.3.pdf.sha1
>>> > 151f06d920d1ac41564f3c0ddabae3c2c36b6892
>>> apache-solr-ref-guide-7.3.pdf
>>> >
>>> > The HTML version has also been uploaded to the website:
>>> > https://lucene.apache.org/solr/guide/7_3/
>>> >
>>> > Here's my +1.
>>> >
>>> > If it happens that this vote passes before the vote for the final
>>> Lucene/Solr RC is complete, I'll hold release/announcement of the Ref Guide
>>> until the vote is complete and the release steps are finished.
>>> >
>>> > Thanks,
>>> > Cassandra
>>> >
>>> >
>>> >
>>> >
>>> > --
>>> > Sincerely yours
>>> > Mikhail Khludnev
>>>
>>>
>>> -
>>> To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
>>> For additional commands, e-mail: dev-h...@lucene.apache.org
>>>
>>>
>>
>>


[jira] [Comment Edited] (LUCENE-2899) Add OpenNLP Analysis capabilities as a module

2018-04-04 Thread Steve Rowe (JIRA)

[ 
https://issues.apache.org/jira/browse/LUCENE-2899?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16425628#comment-16425628
 ] 

Steve Rowe edited comment on LUCENE-2899 at 4/4/18 2:45 PM:


bq. how can I upload model files to the Zookeeper?

Have you seen 
[https://lucene.apache.org/solr/guide/7_3/using-zookeeper-to-manage-configuration-files.html]
 and [https://lucene.apache.org/solr/guide/7_3/command-line-utilities.html] ?  
In particular, from 
[https://lucene.apache.org/solr/guide/7_3/command-line-utilities.html#put-a-local-file-into-a-new-zookeeper-file]:

{noformat}
./server/scripts/cloud-scripts/zkcli.sh -zkhost 127.0.0.1:9983 -cmd putfile 
/my_zk_file.txt /tmp/my_local_file.txt
{noformat}
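
For an OpenNLP model file, the same {{putfile}} command should work. A minimal 
sketch, assuming the model is kept inside the collection's configset under 
{{/configs/...}} (the configset name, ZooKeeper path and file names below are 
placeholders, not taken from this issue):

{noformat}
./server/scripts/cloud-scripts/zkcli.sh -zkhost 127.0.0.1:9983 -cmd putfile /configs/mycollection/opennlp/en-sent.bin /tmp/en-sent.bin
{noformat}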


was (Author: steve_rowe):
bq.

Have you seen 
[https://lucene.apache.org/solr/guide/7_3/using-zookeeper-to-manage-configuration-files.html]
 and [https://lucene.apache.org/solr/guide/7_3/command-line-utilities.html] ?  
In particular, from 
[https://lucene.apache.org/solr/guide/7_3/command-line-utilities.html#put-a-local-file-into-a-new-zookeeper-file]:

{noformat}
./server/scripts/cloud-scripts/zkcli.sh -zkhost 127.0.0.1:9983 -cmd putfile 
/my_zk_file.txt /tmp/my_local_file.txt
{noformat}

> Add OpenNLP Analysis capabilities as a module
> -
>
> Key: LUCENE-2899
> URL: https://issues.apache.org/jira/browse/LUCENE-2899
> Project: Lucene - Core
>  Issue Type: New Feature
>  Components: modules/analysis
>Reporter: Grant Ingersoll
>Assignee: Steve Rowe
>Priority: Minor
> Fix For: 7.3, master (8.0)
>
> Attachments: LUCENE-2899-6.1.0.patch, LUCENE-2899-RJN.patch, 
> LUCENE-2899.patch, LUCENE-2899.patch, LUCENE-2899.patch, LUCENE-2899.patch, 
> LUCENE-2899.patch, LUCENE-2899.patch, OpenNLPFilter.java, 
> OpenNLPTokenizer.java
>
>
> Now that OpenNLP is an ASF project and has a nice license, it would be nice 
> to have a submodule (under analysis) that exposed capabilities for it. Drew 
> Farris, Tom Morton and I have code that does:
> * Sentence Detection as a Tokenizer (could also be a TokenFilter, although it 
> would have to change slightly to buffer tokens)
> * NamedEntity recognition as a TokenFilter
> We are also planning a Tokenizer/TokenFilter that can put parts of speech as 
> either payloads (PartOfSpeechAttribute?) on a token or at the same position.
> I'd propose it go under:
> modules/analysis/opennlp



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (LUCENE-2899) Add OpenNLP Analysis capabilities as a module

2018-04-04 Thread Steve Rowe (JIRA)

[ 
https://issues.apache.org/jira/browse/LUCENE-2899?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16425628#comment-16425628
 ] 

Steve Rowe commented on LUCENE-2899:


bq.

Have you seen 
[https://lucene.apache.org/solr/guide/7_3/using-zookeeper-to-manage-configuration-files.html]
 and [https://lucene.apache.org/solr/guide/7_3/command-line-utilities.html] ?  
In particular, from 
[https://lucene.apache.org/solr/guide/7_3/command-line-utilities.html#put-a-local-file-into-a-new-zookeeper-file]:

{noformat}
./server/scripts/cloud-scripts/zkcli.sh -zkhost 127.0.0.1:9983 -cmd putfile 
/my_zk_file.txt /tmp/my_local_file.txt
{noformat}

> Add OpenNLP Analysis capabilities as a module
> -
>
> Key: LUCENE-2899
> URL: https://issues.apache.org/jira/browse/LUCENE-2899
> Project: Lucene - Core
>  Issue Type: New Feature
>  Components: modules/analysis
>Reporter: Grant Ingersoll
>Assignee: Steve Rowe
>Priority: Minor
> Fix For: 7.3, master (8.0)
>
> Attachments: LUCENE-2899-6.1.0.patch, LUCENE-2899-RJN.patch, 
> LUCENE-2899.patch, LUCENE-2899.patch, LUCENE-2899.patch, LUCENE-2899.patch, 
> LUCENE-2899.patch, LUCENE-2899.patch, OpenNLPFilter.java, 
> OpenNLPTokenizer.java
>
>
> Now that OpenNLP is an ASF project and has a nice license, it would be nice 
> to have a submodule (under analysis) that exposed capabilities for it. Drew 
> Farris, Tom Morton and I have code that does:
> * Sentence Detection as a Tokenizer (could also be a TokenFilter, although it 
> would have to change slightly to buffer tokens)
> * NamedEntity recognition as a TokenFilter
> We are also planning a Tokenizer/TokenFilter that can put parts of speech as 
> either payloads (PartOfSpeechAttribute?) on a token or at the same position.
> I'd propose it go under:
> modules/analysis/opennlp



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Comment Edited] (LUCENE-2899) Add OpenNLP Analysis capabilities as a module

2018-04-04 Thread Alexey Ponomarenko (JIRA)

[ 
https://issues.apache.org/jira/browse/LUCENE-2899?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16425619#comment-16425619
 ] 

Alexey Ponomarenko edited comment on LUCENE-2899 at 4/4/18 2:37 PM:


Thanks, *but how can I upload model files to ZooKeeper*?

I have asked a similar question on Stack Overflow, but nobody has been able to 
answer it: 
[https://stackoverflow.com/questions/49515397/upload-filebinary-into-zookeeper-solrcloud]

Without uploading the model files I cannot use OpenNLP, so this is a crucial 
point of the installation. 


was (Author: fatalityap):
Thanks, *but how can I unload model files to the Zookeeper*?

I have asked a similar question on Stack Overflow, but nobody has been able to 
answer it: 
[https://stackoverflow.com/questions/49515397/upload-filebinary-into-zookeeper-solrcloud]

Without uploading the model files I cannot use OpenNLP, so this is a crucial 
point of the installation. 

> Add OpenNLP Analysis capabilities as a module
> -
>
> Key: LUCENE-2899
> URL: https://issues.apache.org/jira/browse/LUCENE-2899
> Project: Lucene - Core
>  Issue Type: New Feature
>  Components: modules/analysis
>Reporter: Grant Ingersoll
>Assignee: Steve Rowe
>Priority: Minor
> Fix For: 7.3, master (8.0)
>
> Attachments: LUCENE-2899-6.1.0.patch, LUCENE-2899-RJN.patch, 
> LUCENE-2899.patch, LUCENE-2899.patch, LUCENE-2899.patch, LUCENE-2899.patch, 
> LUCENE-2899.patch, LUCENE-2899.patch, OpenNLPFilter.java, 
> OpenNLPTokenizer.java
>
>
> Now that OpenNLP is an ASF project and has a nice license, it would be nice 
> to have a submodule (under analysis) that exposed capabilities for it. Drew 
> Farris, Tom Morton and I have code that does:
> * Sentence Detection as a Tokenizer (could also be a TokenFilter, although it 
> would have to change slightly to buffer tokens)
> * NamedEntity recognition as a TokenFilter
> We are also planning a Tokenizer/TokenFilter that can put parts of speech as 
> either payloads (PartOfSpeechAttribute?) on a token or at the same position.
> I'd propose it go under:
> modules/analysis/opennlp



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (LUCENE-8237) Add a SoftDeletesDirectoryReaderWrapper

2018-04-04 Thread Simon Willnauer (JIRA)

[ 
https://issues.apache.org/jira/browse/LUCENE-8237?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16425625#comment-16425625
 ] 

Simon Willnauer commented on LUCENE-8237:
-

Here is a review PR: https://github.com/s1monw/lucene-solr/pull/9

> Add a SoftDeletesDirectoryReaderWrapper 
> 
>
> Key: LUCENE-8237
> URL: https://issues.apache.org/jira/browse/LUCENE-8237
> Project: Lucene - Core
>  Issue Type: Improvement
>Affects Versions: 7.4, master (8.0)
>Reporter: Simon Willnauer
>Priority: Major
> Fix For: 7.4, master (8.0)
>
> Attachments: LUCENE-8237.patch
>
>
> This adds support for soft deletes if the reader is opened from a directory.
> Today we only support soft deletes for NRT readers; this change allows wrapping
> an existing DirectoryReader with a SoftDeletesDirectoryReaderWrapper to also
> filter out soft deletes in the case of a non-NRT reader.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Updated] (LUCENE-8237) Add a SoftDeletesDirectoryReaderWrapper

2018-04-04 Thread Simon Willnauer (JIRA)

 [ 
https://issues.apache.org/jira/browse/LUCENE-8237?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Simon Willnauer updated LUCENE-8237:

Attachment: LUCENE-8237.patch

> Add a SoftDeletesDirectoryReaderWrapper 
> 
>
> Key: LUCENE-8237
> URL: https://issues.apache.org/jira/browse/LUCENE-8237
> Project: Lucene - Core
>  Issue Type: Improvement
>Affects Versions: 7.4, master (8.0)
>Reporter: Simon Willnauer
>Priority: Major
> Fix For: 7.4, master (8.0)
>
> Attachments: LUCENE-8237.patch
>
>
> This adds support for soft deletes if the reader is opened from a directory.
> Today we only support soft deletes for NRT readers; this change allows wrapping
> an existing DirectoryReader with a SoftDeletesDirectoryReaderWrapper to also
> filter out soft deletes in the case of a non-NRT reader.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Created] (LUCENE-8237) Add a SoftDeletesDirectoryReaderWrapper

2018-04-04 Thread Simon Willnauer (JIRA)
Simon Willnauer created LUCENE-8237:
---

 Summary: Add a SoftDeletesDirectoryReaderWrapper 
 Key: LUCENE-8237
 URL: https://issues.apache.org/jira/browse/LUCENE-8237
 Project: Lucene - Core
  Issue Type: Improvement
Affects Versions: 7.4, master (8.0)
Reporter: Simon Willnauer
 Fix For: 7.4, master (8.0)


This adds support for soft deletes if the reader is opened from a directory.
Today we only support soft deletes for NRT readers; this change allows wrapping
an existing DirectoryReader with a SoftDeletesDirectoryReaderWrapper to also
filter out soft deletes in the case of a non-NRT reader.
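
A minimal usage sketch of what this enables for a non-NRT reader. This is 
illustrative only: the wrapper is not committed yet, the index path and the 
soft-deletes field name ("__soft_deletes") below are placeholders, and the field 
must match the one configured via IndexWriterConfig#setSoftDeletesField:

{noformat}
import java.nio.file.Paths;
import org.apache.lucene.index.DirectoryReader;
import org.apache.lucene.index.SoftDeletesDirectoryReaderWrapper;
import org.apache.lucene.search.IndexSearcher;
import org.apache.lucene.store.Directory;
import org.apache.lucene.store.FSDirectory;

public class SoftDeletesReaderExample {
  public static void main(String[] args) throws Exception {
    try (Directory dir = FSDirectory.open(Paths.get("/path/to/index"));
         // Open a plain (non-NRT) reader from the directory and wrap it so that
         // documents soft-deleted via the "__soft_deletes" field are filtered out.
         DirectoryReader reader =
             new SoftDeletesDirectoryReaderWrapper(DirectoryReader.open(dir), "__soft_deletes")) {
      IndexSearcher searcher = new IndexSearcher(reader); // search as usual; soft-deleted docs are invisible
      System.out.println("live docs: " + reader.numDocs());
    }
  }
}
{noformat}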



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (LUCENE-2899) Add OpenNLP Analysis capabilities as a module

2018-04-04 Thread Alexey Ponomarenko (JIRA)

[ 
https://issues.apache.org/jira/browse/LUCENE-2899?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16425619#comment-16425619
 ] 

Alexey Ponomarenko commented on LUCENE-2899:


Thanks, *but how can I unload model files to the Zookeeper*?

I have asked a similar question on Stack Overflow, but nobody has been able to 
answer it: 
[https://stackoverflow.com/questions/49515397/upload-filebinary-into-zookeeper-solrcloud]

Without uploading the model files I cannot use OpenNLP, so this is a crucial 
point of the installation. 

> Add OpenNLP Analysis capabilities as a module
> -
>
> Key: LUCENE-2899
> URL: https://issues.apache.org/jira/browse/LUCENE-2899
> Project: Lucene - Core
>  Issue Type: New Feature
>  Components: modules/analysis
>Reporter: Grant Ingersoll
>Assignee: Steve Rowe
>Priority: Minor
> Fix For: 7.3, master (8.0)
>
> Attachments: LUCENE-2899-6.1.0.patch, LUCENE-2899-RJN.patch, 
> LUCENE-2899.patch, LUCENE-2899.patch, LUCENE-2899.patch, LUCENE-2899.patch, 
> LUCENE-2899.patch, LUCENE-2899.patch, OpenNLPFilter.java, 
> OpenNLPTokenizer.java
>
>
> Now that OpenNLP is an ASF project and has a nice license, it would be nice 
> to have a submodule (under analysis) that exposed capabilities for it. Drew 
> Farris, Tom Morton and I have code that does:
> * Sentence Detection as a Tokenizer (could also be a TokenFilter, although it 
> would have to change slightly to buffer tokens)
> * NamedEntity recognition as a TokenFilter
> We are also planning a Tokenizer/TokenFilter that can put parts of speech as 
> either payloads (PartOfSpeechAttribute?) on a token or at the same position.
> I'd propose it go under:
> modules/analysis/opennlp



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (LUCENE-8189) Build fails with ant version 1.10.x

2018-04-04 Thread Michael Braun (JIRA)

[ 
https://issues.apache.org/jira/browse/LUCENE-8189?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16425588#comment-16425588
 ] 

Michael Braun commented on LUCENE-8189:
---

Ant 1.10.3, which includes the fix, has been released.

> Build fails with ant version 1.10.x
> ---
>
> Key: LUCENE-8189
> URL: https://issues.apache.org/jira/browse/LUCENE-8189
> Project: Lucene - Core
>  Issue Type: Bug
>  Components: general/build
>Affects Versions: master (8.0)
>Reporter: Shawn Heisey
>Priority: Minor
>
> Any action I try to take with ANT_HOME set to the 1.10.2 version fails with a 
> NullPointerException.  If I revert back to ANT_HOME pointing at 1.9, 
> everything's fine.
> {noformat}
> C:\Users\sheisey\git\lucene-solr>ant clean
> Buildfile: C:\Users\sheisey\git\lucene-solr\build.xml
> BUILD FAILED
> C:\Users\sheisey\git\lucene-solr\build.xml:21: The following error occurred 
> while executing this line:
> C:\Users\sheisey\git\lucene-solr\lucene\common-build.xml:623: 
> java.lang.NullPointerException
> at java.util.Arrays.stream(Arrays.java:5004)
> at java.util.stream.Stream.of(Stream.java:1000)
> at 
> java.util.stream.ReferencePipeline$7$1.accept(ReferencePipeline.java:267)
> at 
> java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
> at 
> java.util.Spliterators$ArraySpliterator.forEachRemaining(Spliterators.java:948)
> at 
> java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481)
> at 
> java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471)
> at 
> java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:545)
> at 
> java.util.stream.AbstractPipeline.evaluateToArrayNode(AbstractPipeline.java:260)
> at 
> java.util.stream.ReferencePipeline.toArray(ReferencePipeline.java:438)
> at 
> org.apache.tools.ant.util.ChainedMapper.lambda$mapFileName$1(ChainedMapper.java:36)
> at java.util.stream.ReduceOps$1ReducingSink.accept(ReduceOps.java:80)
> at 
> java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1374)
> at 
> java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481)
> at 
> java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471)
> at 
> java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708)
> at 
> java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
> at 
> java.util.stream.ReferencePipeline.reduce(ReferencePipeline.java:484)
> at 
> org.apache.tools.ant.util.ChainedMapper.mapFileName(ChainedMapper.java:35)
> at 
> org.apache.tools.ant.util.CompositeMapper.lambda$mapFileName$0(CompositeMapper.java:32)
> at 
> java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
> at 
> java.util.stream.ReferencePipeline$2$1.accept(ReferencePipeline.java:175)
> at 
> java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1374)
> at 
> java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481)
> at 
> java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471)
> at 
> java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:545)
> at 
> java.util.stream.AbstractPipeline.evaluateToArrayNode(AbstractPipeline.java:260)
> at 
> java.util.stream.ReferencePipeline.toArray(ReferencePipeline.java:438)
> at 
> org.apache.tools.ant.util.CompositeMapper.mapFileName(CompositeMapper.java:33)
> at 
> org.apache.tools.ant.taskdefs.PathConvert.execute(PathConvert.java:363)
> at 
> org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:292)
> at sun.reflect.GeneratedMethodAccessor4.invoke(Unknown Source)
> at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at 
> org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106)
> at org.apache.tools.ant.Task.perform(Task.java:346)
> at org.apache.tools.ant.Target.execute(Target.java:448)
> at 
> org.apache.tools.ant.helper.ProjectHelper2.parse(ProjectHelper2.java:172)
> at 
> org.apache.tools.ant.taskdefs.ImportTask.importResource(ImportTask.java:221)
> at 
> org.apache.tools.ant.taskdefs.ImportTask.execute(ImportTask.java:165)
> at 
> org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:292)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at 
> 

[jira] [Commented] (SOLR-12155) Solr 7.2.1 deadlock in UnInvertedField.getUnInvertedField()

2018-04-04 Thread Mikhail Khludnev (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-12155?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16425566#comment-16425566
 ] 

Mikhail Khludnev commented on SOLR-12155:
-

[^SOLR-12155.patch] seems like a reproducer to me. 

> Solr 7.2.1 deadlock in UnInvertedField.getUnInvertedField() 
> 
>
> Key: SOLR-12155
> URL: https://issues.apache.org/jira/browse/SOLR-12155
> Project: Solr
>  Issue Type: Bug
>  Security Level: Public(Default Security Level. Issues are Public) 
>Affects Versions: 7.2.1
>Reporter: Kishor gandham
>Priority: Major
> Attachments: SOLR-12155.patch, stack.txt
>
>
> I am attaching a stack trace from our production Solr (7.2.1). Occasionally, 
> we are seeing Solr becoming unresponsive. We are then forced to kill the JVM 
> and start Solr again.
> We have a lot of facet queries, and our index has approximately 15 million 
> documents. We have recently started using json.facet queries, and some of the 
> facet fields use DocValues.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Updated] (SOLR-12155) Solr 7.2.1 deadlock in UnInvertedField.getUnInvertedField()

2018-04-04 Thread Mikhail Khludnev (JIRA)

 [ 
https://issues.apache.org/jira/browse/SOLR-12155?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Mikhail Khludnev updated SOLR-12155:

Attachment: SOLR-12155.patch

> Solr 7.2.1 deadlock in UnInvertedField.getUnInvertedField() 
> 
>
> Key: SOLR-12155
> URL: https://issues.apache.org/jira/browse/SOLR-12155
> Project: Solr
>  Issue Type: Bug
>  Security Level: Public(Default Security Level. Issues are Public) 
>Affects Versions: 7.2.1
>Reporter: Kishor gandham
>Priority: Major
> Attachments: SOLR-12155.patch, stack.txt
>
>
> I am attaching a stack trace from our production Solr (7.2.1). Occasionally, 
> we are seeing Solr becoming unresponsive. We are then forced to kill the JVM 
> and start Solr again.
> We have a lot of facet queries, and our index has approximately 15 million 
> documents. We have recently started using json.facet queries, and some of the 
> facet fields use DocValues.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[JENKINS] Lucene-Solr-BadApples-Tests-master - Build # 28 - Unstable

2018-04-04 Thread Apache Jenkins Server
Build: https://builds.apache.org/job/Lucene-Solr-BadApples-Tests-master/28/

8 tests failed.
FAILED:  
junit.framework.TestSuite.org.apache.solr.cloud.TestLeaderElectionZkExpiry

Error Message:
ObjectTracker found 1 object(s) that were not released!!! [Overseer] 
org.apache.solr.common.util.ObjectReleaseTracker$ObjectTrackerException: 
org.apache.solr.cloud.Overseer  at 
org.apache.solr.common.util.ObjectReleaseTracker.track(ObjectReleaseTracker.java:42)
  at org.apache.solr.cloud.Overseer.start(Overseer.java:545)  at 
org.apache.solr.cloud.OverseerElectionContext.runLeaderProcess(ElectionContext.java:850)
  at 
org.apache.solr.cloud.LeaderElector.runIamLeaderProcess(LeaderElector.java:170) 
 at 
org.apache.solr.cloud.LeaderElector.checkIfIamLeader(LeaderElector.java:135)  
at org.apache.solr.cloud.LeaderElector.joinElection(LeaderElector.java:307)  at 
org.apache.solr.cloud.LeaderElector.joinElection(LeaderElector.java:216)  at 
org.apache.solr.cloud.ZkController$1.command(ZkController.java:355)  at 
org.apache.solr.common.cloud.ConnectionManager$1.update(ConnectionManager.java:167)
  at 
org.apache.solr.common.cloud.DefaultConnectionStrategy.reconnect(DefaultConnectionStrategy.java:57)
  at 
org.apache.solr.common.cloud.ConnectionManager.process(ConnectionManager.java:141)
  at 
org.apache.zookeeper.ClientCnxn$EventThread.processEvent(ClientCnxn.java:531)  
at org.apache.zookeeper.ClientCnxn$EventThread.run(ClientCnxn.java:506)  

Stack Trace:
java.lang.AssertionError: ObjectTracker found 1 object(s) that were not 
released!!! [Overseer]
org.apache.solr.common.util.ObjectReleaseTracker$ObjectTrackerException: 
org.apache.solr.cloud.Overseer
at 
org.apache.solr.common.util.ObjectReleaseTracker.track(ObjectReleaseTracker.java:42)
at org.apache.solr.cloud.Overseer.start(Overseer.java:545)
at 
org.apache.solr.cloud.OverseerElectionContext.runLeaderProcess(ElectionContext.java:850)
at 
org.apache.solr.cloud.LeaderElector.runIamLeaderProcess(LeaderElector.java:170)
at 
org.apache.solr.cloud.LeaderElector.checkIfIamLeader(LeaderElector.java:135)
at 
org.apache.solr.cloud.LeaderElector.joinElection(LeaderElector.java:307)
at 
org.apache.solr.cloud.LeaderElector.joinElection(LeaderElector.java:216)
at org.apache.solr.cloud.ZkController$1.command(ZkController.java:355)
at 
org.apache.solr.common.cloud.ConnectionManager$1.update(ConnectionManager.java:167)
at 
org.apache.solr.common.cloud.DefaultConnectionStrategy.reconnect(DefaultConnectionStrategy.java:57)
at 
org.apache.solr.common.cloud.ConnectionManager.process(ConnectionManager.java:141)
at 
org.apache.zookeeper.ClientCnxn$EventThread.processEvent(ClientCnxn.java:531)
at org.apache.zookeeper.ClientCnxn$EventThread.run(ClientCnxn.java:506)


at __randomizedtesting.SeedInfo.seed([C975AA7B01816844]:0)
at org.junit.Assert.fail(Assert.java:93)
at org.junit.Assert.assertTrue(Assert.java:43)
at org.junit.Assert.assertNull(Assert.java:551)
at 
org.apache.solr.SolrTestCaseJ4.teardownTestCases(SolrTestCaseJ4.java:303)
at sun.reflect.GeneratedMethodAccessor47.invoke(Unknown Source)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1737)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:897)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 

[jira] [Commented] (LUCENE-2899) Add OpenNLP Analysis capabilities as a module

2018-04-04 Thread Steve Rowe (JIRA)

[ 
https://issues.apache.org/jira/browse/LUCENE-2899?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16425492#comment-16425492
 ] 

Steve Rowe commented on LUCENE-2899:


bq. Hi, do you plan to write documentation how to work with this feature? I 
have tried to install this and get it work on SolrCloud but I have no luck.

This feature has not yet been released - Lucene/Solr 7.3 will include it when 
it's released, which will (very likely) happen today.

The 7.3 Solr reference guide, which is already online, includes some docs for 
the language analysis features added under this issue: 
[http://lucene.apache.org/solr/guide/7_3/language-analysis.html#opennlp-integration].
Here is the Solr 7.3 javadoc, also already online, for the NER update request 
processor: 
[https://lucene.apache.org/solr/7_3_0/solr-analysis-extras/org/apache/solr/update/processor/OpenNLPExtractNamedEntitiesUpdateProcessorFactory.html]

> Add OpenNLP Analysis capabilities as a module
> -
>
> Key: LUCENE-2899
> URL: https://issues.apache.org/jira/browse/LUCENE-2899
> Project: Lucene - Core
>  Issue Type: New Feature
>  Components: modules/analysis
>Reporter: Grant Ingersoll
>Assignee: Steve Rowe
>Priority: Minor
> Fix For: 7.3, master (8.0)
>
> Attachments: LUCENE-2899-6.1.0.patch, LUCENE-2899-RJN.patch, 
> LUCENE-2899.patch, LUCENE-2899.patch, LUCENE-2899.patch, LUCENE-2899.patch, 
> LUCENE-2899.patch, LUCENE-2899.patch, OpenNLPFilter.java, 
> OpenNLPTokenizer.java
>
>
> Now that OpenNLP is an ASF project and has a nice license, it would be nice 
> to have a submodule (under analysis) that exposed capabilities for it. Drew 
> Farris, Tom Morton and I have code that does:
> * Sentence Detection as a Tokenizer (could also be a TokenFilter, although it 
> would have to change slightly to buffer tokens)
> * NamedEntity recognition as a TokenFilter
> We are also planning a Tokenizer/TokenFilter that can put parts of speech as 
> either payloads (PartOfSpeechAttribute?) on a token or at the same position.
> I'd propose it go under:
> modules/analysis/opennlp



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



WordDelimiterFilter javadocs are off base

2018-04-04 Thread Michael Sokolov
The javadocs for both WDF and WDGF include a pretty detailed discussion
about the proper use of the "combinations" parameter, but no such parameter
exists. I don't know the history here, but it sounds as if the docs might
be referring to some previous incarnation of this filter, perhaps in the
context of some (now-defunct) Solr configuration.

I think there is some sound wisdom underlying the advice in the docs that is
worth preserving, but it needs to be updated to match the current state of the
code. I can take a stab at rewriting, but I want to make sure I understand the
intent of the comment there.

Essentially, what it is saying is that a typical usage of WD(G)F is an
asymmetric setup where splitting and subsequent token generation are done
when indexing, but something less aggressive (at least no generation, maybe
also no splitting) is done when querying. I would probably recommend simply
omitting this filter from query-side analysis. Is there a consensus on the
best way to use this filter today?
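
To make the asymmetric setup concrete, here is a rough sketch using Lucene's
CustomAnalyzer (factory names and parameters written from memory, so treat it
as illustrative rather than authoritative):

import org.apache.lucene.analysis.Analyzer;
import org.apache.lucene.analysis.custom.CustomAnalyzer;

public class WdgfAnalyzers {
  // Index-time: split aggressively and generate sub-tokens, then flatten
  // the token graph so it can be indexed.
  static Analyzer indexAnalyzer() throws Exception {
    return CustomAnalyzer.builder()
        .withTokenizer("whitespace")
        .addTokenFilter("wordDelimiterGraph",
            "generateWordParts", "1",
            "generateNumberParts", "1",
            "preserveOriginal", "1")
        .addTokenFilter("flattenGraph")
        .addTokenFilter("lowercase")
        .build();
  }

  // Query-time: no word-delimiter processing at all, i.e. the filter is
  // simply omitted on the query side.
  static Analyzer queryAnalyzer() throws Exception {
    return CustomAnalyzer.builder()
        .withTokenizer("whitespace")
        .addTokenFilter("lowercase")
        .build();
  }
}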

-Mike


[JENKINS] Lucene-Solr-repro - Build # 426 - Unstable

2018-04-04 Thread Apache Jenkins Server
Build: https://builds.apache.org/job/Lucene-Solr-repro/426/

[...truncated 28 lines...]
[repro] Jenkins log URL: 
https://builds.apache.org/job/Lucene-Solr-NightlyTests-master/1520/consoleText

[repro] Revision: a14980c479608306aebb7255f7bb7eb64c476085

[repro] Ant options: -Dtests.multiplier=2 
-Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-master/test-data/enwiki.random.lines.txt
[repro] Repro line:  ant test  -Dtestcase=StressHdfsTest -Dtests.method=test 
-Dtests.seed=378D2E79D9079884 -Dtests.multiplier=2 -Dtests.nightly=true 
-Dtests.slow=true 
-Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-master/test-data/enwiki.random.lines.txt
 -Dtests.locale=ro-RO -Dtests.timezone=Asia/Calcutta -Dtests.asserts=true 
-Dtests.file.encoding=UTF-8

[repro] git rev-parse --abbrev-ref HEAD
[repro] git rev-parse HEAD
[repro] Initial local git branch/revision: 
ecc17f9023309ca2c46eaf65fd031e4af0ef5a25
[repro] git fetch

[...truncated 2 lines...]
[repro] git checkout a14980c479608306aebb7255f7bb7eb64c476085

[...truncated 2 lines...]
[repro] git merge --ff-only

[...truncated 1 lines...]
[repro] ant clean

[...truncated 6 lines...]
[repro] Test suites by module:
[repro]solr/core
[repro]   StressHdfsTest
[repro] ant compile-test

[...truncated 3297 lines...]
[repro] ant test-nocompile -Dtests.dups=5 -Dtests.maxfailures=5 
-Dtests.class="*.StressHdfsTest" -Dtests.showOutput=onerror 
-Dtests.multiplier=2 
-Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-master/test-data/enwiki.random.lines.txt
 -Dtests.seed=378D2E79D9079884 -Dtests.multiplier=2 -Dtests.nightly=true 
-Dtests.slow=true 
-Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-master/test-data/enwiki.random.lines.txt
 -Dtests.locale=ro-RO -Dtests.timezone=Asia/Calcutta -Dtests.asserts=true 
-Dtests.file.encoding=UTF-8

[...truncated 89539 lines...]
[repro] Setting last failure code to 256

[repro] Failures:
[repro]   3/5 failed: org.apache.solr.cloud.hdfs.StressHdfsTest
[repro] git checkout ecc17f9023309ca2c46eaf65fd031e4af0ef5a25

[...truncated 2 lines...]
[repro] Exiting with code 256

[...truncated 6 lines...]

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org

[JENKINS] Lucene-Solr-master-MacOSX (64bit/jdk1.8.0) - Build # 4548 - Unstable!

2018-04-04 Thread Policeman Jenkins Server
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-master-MacOSX/4548/
Java: 64bit/jdk1.8.0 -XX:-UseCompressedOops -XX:+UseG1GC

9 tests failed.
FAILED:  org.apache.solr.search.TestRecovery.testBuffering

Error Message:


Stack Trace:
java.lang.NullPointerException
at 
__randomizedtesting.SeedInfo.seed([A70A844170C0BA22:BAE42A6AD1991B09]:0)
at 
org.apache.solr.search.TestRecovery.testBuffering(TestRecovery.java:495)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1737)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:934)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:970)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:984)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:943)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:829)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:879)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:890)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at java.lang.Thread.run(Thread.java:748)


FAILED:  org.apache.solr.search.TestRecovery.testVersionsOnRestart

Error Message:


Stack Trace:
java.lang.NullPointerException
at 
__randomizedtesting.SeedInfo.seed([A70A844170C0BA22:C7A4A43F0F9E6AAD]:0)
at 
org.apache.solr.search.TestRecovery.testVersionsOnRestart(TestRecovery.java:1054)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
