[JENKINS] Lucene-Solr-Tests-master - Build # 3491 - Failure
Build: https://builds.apache.org/job/Lucene-Solr-Tests-master/3491/
All tests passed
Build Log:
[...truncated 1332 lines...]
[junit4] JVM J0: stdout was not empty, see: /home/jenkins/jenkins-slave/workspace/Lucene-Solr-Tests-master/lucene/build/core/test/temp/junit4-J0-20190804_044131_87113482318205438848343.sysout
[junit4] >>> JVM J0 emitted unexpected output (verbatim)
[junit4] #
[junit4] # A fatal error has been detected by the Java Runtime Environment:
[junit4] #
[junit4] # SIGSEGV (0xb) at pc=0x7fca9dfa726c, pid=12852, tid=12901
[junit4] #
[junit4] # JRE version: Java(TM) SE Runtime Environment (11.0.1+13) (build 11.0.1+13-LTS)
[junit4] # Java VM: Java HotSpot(TM) 64-Bit Server VM (11.0.1+13-LTS, mixed mode, tiered, compressed oops, g1 gc, linux-amd64)
[junit4] # Problematic frame:
[junit4] # V [libjvm.so+0xd4026c] PhaseIdealLoop::split_up(Node*, Node*, Node*) [clone .part.39]+0x47c
[junit4] #
[junit4] # Core dump will be written. Default location: Core dumps may be processed with "/usr/share/apport/apport %p %s %c %d %P" (or dumping to /home/jenkins/jenkins-slave/workspace/Lucene-Solr-Tests-master/lucene/build/core/test/J0/core.12852)
[junit4] #
[junit4] # An error report file with more information is saved as:
[junit4] # /home/jenkins/jenkins-slave/workspace/Lucene-Solr-Tests-master/lucene/build/core/test/J0/hs_err_pid12852.log
[junit4] #
[junit4] # Compiler replay data is saved as:
[junit4] # /home/jenkins/jenkins-slave/workspace/Lucene-Solr-Tests-master/lucene/build/core/test/J0/replay_pid12852.log
[junit4] #
[junit4] # If you would like to submit a bug report, please visit:
[junit4] # http://bugreport.java.com/bugreport/crash.jsp
[junit4] #
[junit4] <<< JVM J0: EOF
[...truncated 735 lines...]
[junit4] ERROR: JVM J0 ended with an exception, command line: /usr/local/asfpackages/java/jdk-11.0.1/bin/java -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-Tests-master/heapdumps -ea -esa --illegal-access=deny -Dtests.prefix=tests -Dtests.seed=7D7F4D219D7461F8 -Xmx512M -Dtests.iters= -Dtests.verbose=false -Dtests.infostream=false -Dtests.codec=random -Dtests.postingsformat=random -Dtests.docvaluesformat=random -Dtests.locale=random -Dtests.timezone=random -Dtests.directory=random -Dtests.linedocsfile=europarl.lines.txt.gz -Dtests.luceneMatchVersion=9.0.0 -Dtests.cleanthreads=perMethod -Djava.util.logging.config.file=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-Tests-master/lucene/tools/junit4/logging.properties -Dtests.nightly=false -Dtests.weekly=false -Dtests.monster=false -Dtests.slow=true -Dtests.asserts=true -Dtests.multiplier=2 -DtempDir=./temp -Djava.io.tmpdir=./temp -Dcommon.dir=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-Tests-master/lucene -Dclover.db.dir=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-Tests-master/lucene/build/clover/db -Djava.security.policy=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-Tests-master/lucene/tools/junit4/tests.policy -Dtests.LUCENE_VERSION=9.0.0 -Djetty.testMode=1 -Djetty.insecurerandom=1 -Dsolr.directoryFactory=org.apache.solr.core.MockDirectoryFactory -Djava.awt.headless=true -Djdk.map.althashing.threshold=0 -Dtests.src.home=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-Tests-master -Djava.security.egd=file:/dev/./urandom -Djunit4.childvm.cwd=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-Tests-master/lucene/build/core/test/J0 -Djunit4.tempDir=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-Tests-master/lucene/build/core/test/temp -Djunit4.childvm.id=0 -Djunit4.childvm.count=3 -Djava.security.manager=org.apache.lucene.util.TestSecurityManager -Dtests.filterstacks=true -Dtests.leaveTemporary=false -Dtests.badapples=false -classpath
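The crash above is inside HotSpot's C2 JIT compiler (`PhaseIdealLoop::split_up` in libjvm.so), not in Lucene or test code, so the usual triage targets the JVM itself. A minimal sketch of the diagnostic steps, assuming the `replay_pid12852.log` file named in the report is still available; the excluded method name below is a placeholder that would be read from the hs_err log:

```shell
# Try to replay the exact C2 compilation that crashed; ReplayCompiles is a
# diagnostic VM option and must be unlocked first.
java -XX:+UnlockDiagnosticVMOptions -XX:+ReplayCompiles \
     -XX:ReplayDataFile=replay_pid12852.log

# Possible workaround while the JDK bug is open: keep C2 away from the
# method that was being compiled (placeholder name; take it from hs_err).
java -XX:CompileCommand=exclude,com.example.Hot::method ...
```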
[JENKINS] Lucene-Solr-NightlyTests-8.x - Build # 170 - Unstable
Build: https://builds.apache.org/job/Lucene-Solr-NightlyTests-8.x/170/ 1 tests failed. FAILED: org.apache.solr.cloud.autoscaling.HdfsAutoAddReplicasIntegrationTest.testSimple Error Message: Waiting for collection testSimple1 Timeout waiting to see state for collection=testSimple1 :DocCollection(testSimple1//collections/testSimple1/state.json/31)={ "pullReplicas":"0", "replicationFactor":"2", "shards":{ "shard1":{ "range":"8000-", "state":"active", "replicas":{ "core_node3":{ "dataDir":"hdfs://lucene2-us-west.apache.org:37940/solr_hdfs_home/testSimple1/core_node3/data/", "base_url":"https://127.0.0.1:32854/solr", "node_name":"127.0.0.1:32854_solr", "type":"NRT", "force_set_state":"false", "ulogDir":"hdfs://lucene2-us-west.apache.org:37940/solr_hdfs_home/testSimple1/core_node3/data/tlog", "core":"testSimple1_shard1_replica_n1", "shared_storage":"true", "state":"active", "leader":"true"}, "core_node5":{ "dataDir":"hdfs://lucene2-us-west.apache.org:37940/solr_hdfs_home/testSimple1/core_node5/data/", "base_url":"https://127.0.0.1:32854/solr", "node_name":"127.0.0.1:32854_solr", "type":"NRT", "force_set_state":"false", "ulogDir":"hdfs://lucene2-us-west.apache.org:37940/solr_hdfs_home/testSimple1/core_node5/data/tlog", "core":"testSimple1_shard1_replica_n2", "shared_storage":"true", "state":"recovering"}}}, "shard2":{ "range":"0-7fff", "state":"active", "replicas":{ "core_node7":{ "dataDir":"hdfs://lucene2-us-west.apache.org:37940/solr_hdfs_home/testSimple1/core_node7/data/", "base_url":"https://127.0.0.1:32854/solr", "node_name":"127.0.0.1:32854_solr", "type":"NRT", "force_set_state":"false", "ulogDir":"hdfs://lucene2-us-west.apache.org:37940/solr_hdfs_home/testSimple1/core_node7/data/tlog", "core":"testSimple1_shard2_replica_n4", "shared_storage":"true", "state":"active", "leader":"true"}, "core_node8":{ "dataDir":"hdfs://lucene2-us-west.apache.org:37940/solr_hdfs_home/testSimple1/core_node8/data/", "base_url":"https://127.0.0.1:32854/solr", 
"node_name":"127.0.0.1:32854_solr", "type":"NRT", "force_set_state":"false", "ulogDir":"hdfs://lucene2-us-west.apache.org:37940/solr_hdfs_home/testSimple1/core_node8/data/tlog", "core":"testSimple1_shard2_replica_n6", "shared_storage":"true", "state":"active", "router":{"name":"compositeId"}, "maxShardsPerNode":"2", "autoAddReplicas":"true", "nrtReplicas":"2", "tlogReplicas":"0"} Live Nodes: [127.0.0.1:32854_solr, 127.0.0.1:37506_solr] Last available state: DocCollection(testSimple1//collections/testSimple1/state.json/31)={ "pullReplicas":"0", "replicationFactor":"2", "shards":{ "shard1":{ "range":"8000-", "state":"active", "replicas":{ "core_node3":{ "dataDir":"hdfs://lucene2-us-west.apache.org:37940/solr_hdfs_home/testSimple1/core_node3/data/", "base_url":"https://127.0.0.1:32854/solr;, "node_name":"127.0.0.1:32854_solr", "type":"NRT", "force_set_state":"false", "ulogDir":"hdfs://lucene2-us-west.apache.org:37940/solr_hdfs_home/testSimple1/core_node3/data/tlog", "core":"testSimple1_shard1_replica_n1", "shared_storage":"true", "state":"active", "leader":"true"}, "core_node5":{ "dataDir":"hdfs://lucene2-us-west.apache.org:37940/solr_hdfs_home/testSimple1/core_node5/data/", "base_url":"https://127.0.0.1:32854/solr;, "node_name":"127.0.0.1:32854_solr", "type":"NRT", "force_set_state":"false", "ulogDir":"hdfs://lucene2-us-west.apache.org:37940/solr_hdfs_home/testSimple1/core_node5/data/tlog", "core":"testSimple1_shard1_replica_n2", "shared_storage":"true", "state":"recovering"}}}, "shard2":{ "range":"0-7fff", "state":"active", "replicas":{ "core_node7":{ "dataDir":"hdfs://lucene2-us-west.apache.org:37940/solr_hdfs_home/testSimple1/core_node7/data/", "base_url":"https://127.0.0.1:32854/solr;, "node_name":"127.0.0.1:32854_solr", "type":"NRT", "force_set_state":"false", "ulogDir":"hdfs://lucene2-us-west.apache.org:37940/solr_hdfs_home/testSimple1/core_node7/data/tlog", "core":"testSimple1_shard2_replica_n4", "shared_storage":"true", "state":"active", "leader":"true"}, 
"core_node8":{
[JENKINS] Lucene-Solr-8.x-Windows (64bit/jdk1.8.0_201) - Build # 382 - Still Unstable!
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-8.x-Windows/382/ Java: 64bit/jdk1.8.0_201 -XX:+UseCompressedOops -XX:+UseParallelGC 11 tests failed. FAILED: org.apache.solr.search.facet.TestCloudJSONFacetSKG.testRandom Error Message: Error from server at https://127.0.0.1:52445/solr/org.apache.solr.search.facet.TestCloudJSONFacetSKG_collection: Error from server at null: Expected mime type application/octet-stream but got text/html. Error 500 Server Error HTTP ERROR 500 Problem accessing /solr/org.apache.solr.search.facet.TestCloudJSONFacetSKG_collection_shard1_replica_n1/select. Reason: Server Error Caused by: java.lang.AssertionError at java.util.HashMap$TreeNode.moveRootToFront(HashMap.java:1849) at java.util.HashMap$TreeNode.putTreeVal(HashMap.java:2014) at java.util.HashMap.putVal(HashMap.java:638) at java.util.HashMap.put(HashMap.java:612) at org.apache.solr.search.LRUCache.put(LRUCache.java:201) at org.apache.solr.search.SolrCacheHolder.put(SolrCacheHolder.java:46) at org.apache.solr.search.SolrIndexSearcher.getDocListC(SolrIndexSearcher.java:1449) at org.apache.solr.search.SolrIndexSearcher.search(SolrIndexSearcher.java:568) at org.apache.solr.handler.component.QueryComponent.doProcessUngroupedSearch(QueryComponent.java:1484) at org.apache.solr.handler.component.QueryComponent.process(QueryComponent.java:398) at org.apache.solr.handler.component.SearchHandler.handleRequestBody(SearchHandler.java:305) at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:199) at org.apache.solr.core.SolrCore.execute(SolrCore.java:2592) at org.apache.solr.servlet.HttpSolrCall.execute(HttpSolrCall.java:780) at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:566) at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:423) at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:350) at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1610) at 
org.apache.solr.client.solrj.embedded.JettySolrRunner$DebugFilter.doFilter(JettySolrRunner.java:165) at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1610) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:540) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:255) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1711) at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:255) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1347) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:203) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:480) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1678) at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:201) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1249) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:144) at org.eclipse.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:703) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:132) at org.eclipse.jetty.server.Server.handle(Server.java:505) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:370) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:267) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:305) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:103) at org.eclipse.jetty.io.ssl.SslConnection$DecryptedEndPoint.onFillable(SslConnection.java:427) at org.eclipse.jetty.io.ssl.SslConnection.onFillable(SslConnection.java:321) at org.eclipse.jetty.io.ssl.SslConnection$2.succeeded(SslConnection.java:159) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:103) at 
org.eclipse.jetty.io.ChannelEndPoint$2.run(ChannelEndPoint.java:117) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:781) at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:917) at java.lang.Thread.run(Thread.java:748) Powered by Jetty:// 9.4.19.v20190610 Stack Trace: org.apache.solr.client.solrj.impl.HttpSolrClient$RemoteSolrException: Error from server at https://127.0.0.1:52445/solr/org.apache.solr.search.facet.TestCloudJSONFacetSKG_collection: Error from server at null: Expected mime type application/octet-stream but got text/html. Error 500 Server Error HTTP ERROR 500 Problem accessing /solr/org.apache.solr.search.facet.TestCloudJSONFacetSKG_collection_shard1_replica_n1/select. Reason: Server Error Caused by: java.lang.AssertionError at
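The `AssertionError` thrown from `HashMap$TreeNode.moveRootToFront` during `LRUCache.put` is the classic signature of a plain `java.util.HashMap` being mutated by several threads without adequate locking, which can corrupt its internal tree bins. A minimal sketch of the safe pattern (this is NOT Solr's actual cache code, just an illustration): use a `ConcurrentHashMap`, or synchronize externally around every read and write of the map.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class SafeCachePut {

    /**
     * Four writer threads insert disjoint key ranges concurrently.
     * With a plain HashMap this could corrupt the tree bins and trip
     * the kind of AssertionError seen in the trace above; with a
     * ConcurrentHashMap every key survives. Returns the final map size.
     */
    static int fillConcurrently() {
        Map<Integer, Integer> cache = new ConcurrentHashMap<>();
        Thread[] writers = new Thread[4];
        for (int t = 0; t < writers.length; t++) {
            final int base = t * 10_000;
            writers[t] = new Thread(() -> {
                for (int i = 0; i < 10_000; i++) {
                    cache.put(base + i, i); // thread-safe put
                }
            });
            writers[t].start();
        }
        for (Thread w : writers) {
            try {
                w.join();
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        }
        return cache.size();
    }

    public static void main(String[] args) {
        System.out.println(fillConcurrently()); // prints 40000
    }
}
```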
[JENKINS] Lucene-Solr-master-Linux (64bit/jdk-11.0.3) - Build # 24494 - Unstable!
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-master-Linux/24494/ Java: 64bit/jdk-11.0.3 -XX:-UseCompressedOops -XX:+UseG1GC 3 tests failed. FAILED: org.apache.solr.search.facet.TestCloudJSONFacetSKG.testRandom Error Message: No live SolrServers available to handle this request:[https://127.0.0.1:40473/solr/org.apache.solr.search.facet.TestCloudJSONFacetSKG_collection] Stack Trace: org.apache.solr.client.solrj.SolrServerException: No live SolrServers available to handle this request:[https://127.0.0.1:40473/solr/org.apache.solr.search.facet.TestCloudJSONFacetSKG_collection] at __randomizedtesting.SeedInfo.seed([CE8BEDD326CA02D3:BCC7C8DC97AAB4A0]:0) at org.apache.solr.client.solrj.impl.LBSolrClient.request(LBSolrClient.java:345) at org.apache.solr.client.solrj.impl.BaseCloudSolrClient.sendRequest(BaseCloudSolrClient.java:1128) at org.apache.solr.client.solrj.impl.BaseCloudSolrClient.requestWithRetryOnStaleState(BaseCloudSolrClient.java:897) at org.apache.solr.client.solrj.impl.BaseCloudSolrClient.request(BaseCloudSolrClient.java:829) at org.apache.solr.client.solrj.SolrRequest.process(SolrRequest.java:211) at org.apache.solr.client.solrj.SolrClient.query(SolrClient.java:987) at org.apache.solr.client.solrj.SolrClient.query(SolrClient.java:1002) at org.apache.solr.search.facet.TestCloudJSONFacetSKG.getNumFound(TestCloudJSONFacetSKG.java:669) at org.apache.solr.search.facet.TestCloudJSONFacetSKG.verifySKGResults(TestCloudJSONFacetSKG.java:446) at org.apache.solr.search.facet.TestCloudJSONFacetSKG.assertFacetSKGsAreCorrect(TestCloudJSONFacetSKG.java:392) at org.apache.solr.search.facet.TestCloudJSONFacetSKG.assertFacetSKGsAreCorrect(TestCloudJSONFacetSKG.java:402) at org.apache.solr.search.facet.TestCloudJSONFacetSKG.assertFacetSKGsAreCorrect(TestCloudJSONFacetSKG.java:402) at org.apache.solr.search.facet.TestCloudJSONFacetSKG.assertFacetSKGsAreCorrect(TestCloudJSONFacetSKG.java:349) at 
org.apache.solr.search.facet.TestCloudJSONFacetSKG.testRandom(TestCloudJSONFacetSKG.java:274) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1750) at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:938) at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:974) at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:988) at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57) at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45) at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48) at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64) at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368) at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817) at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468) at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:947) at 
com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:832) at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:883) at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:894) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at
[JENKINS] Lucene-Solr-NightlyTests-master - Build # 1919 - Still Failing
Build: https://builds.apache.org/job/Lucene-Solr-NightlyTests-master/1919/ No tests ran. Build Log: [...truncated 25 lines...] ERROR: Failed to check out http://svn.apache.org/repos/asf/lucene/test-data org.tmatesoft.svn.core.SVNException: svn: E175002: connection refused by the server svn: E175002: OPTIONS request failed on '/repos/asf/lucene/test-data' at org.tmatesoft.svn.core.internal.wc.SVNErrorManager.error(SVNErrorManager.java:112) at org.tmatesoft.svn.core.internal.wc.SVNErrorManager.error(SVNErrorManager.java:96) at org.tmatesoft.svn.core.internal.io.dav.http.HTTPConnection.request(HTTPConnection.java:765) at org.tmatesoft.svn.core.internal.io.dav.http.HTTPConnection.request(HTTPConnection.java:352) at org.tmatesoft.svn.core.internal.io.dav.http.HTTPConnection.request(HTTPConnection.java:340) at org.tmatesoft.svn.core.internal.io.dav.DAVConnection.performHttpRequest(DAVConnection.java:910) at org.tmatesoft.svn.core.internal.io.dav.DAVConnection.exchangeCapabilities(DAVConnection.java:702) at org.tmatesoft.svn.core.internal.io.dav.DAVConnection.open(DAVConnection.java:113) at org.tmatesoft.svn.core.internal.io.dav.DAVRepository.openConnection(DAVRepository.java:1035) at org.tmatesoft.svn.core.internal.io.dav.DAVRepository.getLatestRevision(DAVRepository.java:164) at org.tmatesoft.svn.core.internal.wc2.ng.SvnNgRepositoryAccess.getRevisionNumber(SvnNgRepositoryAccess.java:119) at org.tmatesoft.svn.core.internal.wc2.SvnRepositoryAccess.getLocations(SvnRepositoryAccess.java:178) at org.tmatesoft.svn.core.internal.wc2.ng.SvnNgRepositoryAccess.createRepositoryFor(SvnNgRepositoryAccess.java:43) at org.tmatesoft.svn.core.internal.wc2.ng.SvnNgAbstractUpdate.checkout(SvnNgAbstractUpdate.java:831) at org.tmatesoft.svn.core.internal.wc2.ng.SvnNgCheckout.run(SvnNgCheckout.java:26) at org.tmatesoft.svn.core.internal.wc2.ng.SvnNgCheckout.run(SvnNgCheckout.java:11) at org.tmatesoft.svn.core.internal.wc2.ng.SvnNgOperationRunner.run(SvnNgOperationRunner.java:20) at 
org.tmatesoft.svn.core.internal.wc2.SvnOperationRunner.run(SvnOperationRunner.java:21) at org.tmatesoft.svn.core.wc2.SvnOperationFactory.run(SvnOperationFactory.java:1239) at org.tmatesoft.svn.core.wc2.SvnOperation.run(SvnOperation.java:294) at hudson.scm.subversion.CheckoutUpdater$SubversionUpdateTask.perform(CheckoutUpdater.java:133) at hudson.scm.subversion.WorkspaceUpdater$UpdateTask.delegateTo(WorkspaceUpdater.java:168) at hudson.scm.subversion.WorkspaceUpdater$UpdateTask.delegateTo(WorkspaceUpdater.java:176) at hudson.scm.subversion.UpdateUpdater$TaskImpl.perform(UpdateUpdater.java:134) at hudson.scm.subversion.WorkspaceUpdater$UpdateTask.delegateTo(WorkspaceUpdater.java:168) at hudson.scm.SubversionSCM$CheckOutTask.perform(SubversionSCM.java:1041) at hudson.scm.SubversionSCM$CheckOutTask.invoke(SubversionSCM.java:1017) at hudson.scm.SubversionSCM$CheckOutTask.invoke(SubversionSCM.java:990) at hudson.FilePath$FileCallableWrapper.call(FilePath.java:3086) at hudson.remoting.UserRequest.perform(UserRequest.java:212) at hudson.remoting.UserRequest.perform(UserRequest.java:54) at hudson.remoting.Request$2.run(Request.java:369) at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:72) at java.util.concurrent.FutureTask.run(FutureTask.java:266) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) at java.lang.Thread.run(Thread.java:744) Caused by: java.net.ConnectException: Connection refused at java.net.PlainSocketImpl.socketConnect(Native Method) at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:345) at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206) at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188) at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392) at java.net.Socket.connect(Socket.java:589) at 
org.tmatesoft.svn.core.internal.util.SVNSocketConnection.run(SVNSocketConnection.java:57) at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ... 4 more java.net.ConnectException: Connection refused at java.net.PlainSocketImpl.socketConnect(Native Method) at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:345) at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206) at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188) at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
[JENKINS] Lucene-Solr-8.x-Linux (32bit/jdk1.8.0_201) - Build # 966 - Unstable!
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-8.x-Linux/966/ Java: 32bit/jdk1.8.0_201 -client -XX:+UseSerialGC 1 tests failed. FAILED: org.apache.solr.search.facet.TestCloudJSONFacetJoinDomain.testRandom Error Message: No live SolrServers available to handle this request:[https://127.0.0.1:36423/solr/org.apache.solr.search.facet.TestCloudJSONFacetJoinDomain_collection] Stack Trace: org.apache.solr.client.solrj.SolrServerException: No live SolrServers available to handle this request:[https://127.0.0.1:36423/solr/org.apache.solr.search.facet.TestCloudJSONFacetJoinDomain_collection] at __randomizedtesting.SeedInfo.seed([195114124241EF7D:6B1D311DF321590E]:0) at org.apache.solr.client.solrj.impl.LBSolrClient.request(LBSolrClient.java:345) at org.apache.solr.client.solrj.impl.BaseCloudSolrClient.sendRequest(BaseCloudSolrClient.java:1128) at org.apache.solr.client.solrj.impl.BaseCloudSolrClient.requestWithRetryOnStaleState(BaseCloudSolrClient.java:897) at org.apache.solr.client.solrj.impl.BaseCloudSolrClient.request(BaseCloudSolrClient.java:829) at org.apache.solr.client.solrj.SolrRequest.process(SolrRequest.java:211) at org.apache.solr.client.solrj.SolrClient.query(SolrClient.java:987) at org.apache.solr.client.solrj.SolrClient.query(SolrClient.java:1002) at org.apache.solr.search.facet.TestCloudJSONFacetJoinDomain.assertFacetCountsAreCorrect(TestCloudJSONFacetJoinDomain.java:504) at org.apache.solr.search.facet.TestCloudJSONFacetJoinDomain.assertFacetCountsAreCorrect(TestCloudJSONFacetJoinDomain.java:512) at org.apache.solr.search.facet.TestCloudJSONFacetJoinDomain.assertFacetCountsAreCorrect(TestCloudJSONFacetJoinDomain.java:462) at org.apache.solr.search.facet.TestCloudJSONFacetJoinDomain.testRandom(TestCloudJSONFacetJoinDomain.java:401) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1750) at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:938) at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:974) at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:988) at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57) at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45) at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48) at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64) at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368) at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817) at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468) at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:947) at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:832) at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:883) at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:894) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) at
[JENKINS] Lucene-Solr-master-Windows (64bit/jdk-12.0.1) - Build # 8069 - Still Unstable!
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-master-Windows/8069/ Java: 64bit/jdk-12.0.1 -XX:+UseCompressedOops -XX:+UseSerialGC 10 tests failed. FAILED: org.apache.solr.security.BasicAuthIntegrationTest.testBasicAuth Error Message: Software caused connection abort: recv failed Stack Trace: javax.net.ssl.SSLException: Software caused connection abort: recv failed at __randomizedtesting.SeedInfo.seed([ABCF99184454E94C:17A1EF0AE0076A36]:0) at java.base/sun.security.ssl.Alert.createSSLException(Alert.java:127) at java.base/sun.security.ssl.TransportContext.fatal(TransportContext.java:320) at java.base/sun.security.ssl.TransportContext.fatal(TransportContext.java:263) at java.base/sun.security.ssl.TransportContext.fatal(TransportContext.java:258) at java.base/sun.security.ssl.SSLSocketImpl.handleException(SSLSocketImpl.java:1342) at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:844) at org.apache.http.impl.io.SessionInputBufferImpl.streamRead(SessionInputBufferImpl.java:137) at org.apache.http.impl.io.SessionInputBufferImpl.fillBuffer(SessionInputBufferImpl.java:153) at org.apache.http.impl.io.SessionInputBufferImpl.readLine(SessionInputBufferImpl.java:282) at org.apache.http.impl.conn.DefaultHttpResponseParser.parseHead(DefaultHttpResponseParser.java:138) at org.apache.http.impl.conn.DefaultHttpResponseParser.parseHead(DefaultHttpResponseParser.java:56) at org.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:259) at org.apache.http.impl.DefaultBHttpClientConnection.receiveResponseHeader(DefaultBHttpClientConnection.java:163) at org.apache.http.impl.conn.CPoolProxy.receiveResponseHeader(CPoolProxy.java:165) at org.apache.http.protocol.HttpRequestExecutor.doReceiveResponse(HttpRequestExecutor.java:273) at org.apache.http.protocol.HttpRequestExecutor.execute(HttpRequestExecutor.java:125) at org.apache.http.impl.execchain.MainClientExec.execute(MainClientExec.java:272) at 
org.apache.http.impl.execchain.ProtocolExec.execute(ProtocolExec.java:185) at org.apache.http.impl.execchain.RetryExec.execute(RetryExec.java:89) at org.apache.http.impl.execchain.RedirectExec.execute(RedirectExec.java:110) at org.apache.http.impl.client.InternalHttpClient.doExecute(InternalHttpClient.java:185) at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:83) at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:108) at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:56) at org.apache.solr.cloud.SolrCloudAuthTestCase.verifySecurityStatus(SolrCloudAuthTestCase.java:200) at org.apache.solr.cloud.SolrCloudAuthTestCase.verifySecurityStatus(SolrCloudAuthTestCase.java:176) at org.apache.solr.security.BasicAuthIntegrationTest.testBasicAuth(BasicAuthIntegrationTest.java:127) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:567) at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1750) at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:938) at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:974) at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:988) at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57) at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45) at 
org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48) at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64) at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368) at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817) at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
[JENKINS] Lucene-Solr-Tests-master - Build # 3489 - Failure
Build: https://builds.apache.org/job/Lucene-Solr-Tests-master/3489/
All tests passed
Build Log:
[...truncated 64552 lines...]
-ecj-javadoc-lint-tests:
[mkdir] Created dir: /tmp/ecj1165784136
[ecj-lint] Compiling 48 source files to /tmp/ecj1165784136
[ecj-lint] invalid Class-Path header in manifest of jar file: /home/jenkins/.ivy2/cache/org.restlet.jee/org.restlet/jars/org.restlet-2.3.0.jar
[ecj-lint] invalid Class-Path header in manifest of jar file: /home/jenkins/.ivy2/cache/org.restlet.jee/org.restlet.ext.servlet/jars/org.restlet.ext.servlet-2.3.0.jar
[ecj-lint] --
[ecj-lint] 1. ERROR in /home/jenkins/jenkins-slave/workspace/Lucene-Solr-Tests-master/solr/contrib/dataimporthandler/src/test/org/apache/solr/handler/dataimport/MockInitialContextFactory.java (at line 23)
[ecj-lint] import javax.naming.NamingException;
[ecj-lint]
[ecj-lint] The type javax.naming.NamingException is not accessible
[ecj-lint] --
[ecj-lint] 2. ERROR in /home/jenkins/jenkins-slave/workspace/Lucene-Solr-Tests-master/solr/contrib/dataimporthandler/src/test/org/apache/solr/handler/dataimport/MockInitialContextFactory.java (at line 28)
[ecj-lint] public class MockInitialContextFactory implements InitialContextFactory {
[ecj-lint] ^
[ecj-lint] The type MockInitialContextFactory must implement the inherited abstract method InitialContextFactory.getInitialContext(Hashtable)
[ecj-lint] --
[ecj-lint] 3. ERROR in /home/jenkins/jenkins-slave/workspace/Lucene-Solr-Tests-master/solr/contrib/dataimporthandler/src/test/org/apache/solr/handler/dataimport/MockInitialContextFactory.java (at line 30)
[ecj-lint] private final javax.naming.Context context;
[ecj-lint]
[ecj-lint] The type javax.naming.Context is not accessible
[ecj-lint] --
[ecj-lint] 4. ERROR in /home/jenkins/jenkins-slave/workspace/Lucene-Solr-Tests-master/solr/contrib/dataimporthandler/src/test/org/apache/solr/handler/dataimport/MockInitialContextFactory.java (at line 33)
[ecj-lint] context = mock(javax.naming.Context.class);
[ecj-lint] ^^^
[ecj-lint] context cannot be resolved to a variable
[ecj-lint] --
[ecj-lint] 5. ERROR in /home/jenkins/jenkins-slave/workspace/Lucene-Solr-Tests-master/solr/contrib/dataimporthandler/src/test/org/apache/solr/handler/dataimport/MockInitialContextFactory.java (at line 33)
[ecj-lint] context = mock(javax.naming.Context.class);
[ecj-lint]
[ecj-lint] The type javax.naming.Context is not accessible
[ecj-lint] --
[ecj-lint] 6. ERROR in /home/jenkins/jenkins-slave/workspace/Lucene-Solr-Tests-master/solr/contrib/dataimporthandler/src/test/org/apache/solr/handler/dataimport/MockInitialContextFactory.java (at line 36)
[ecj-lint] when(context.lookup(anyString())).thenAnswer(invocation -> objects.get(invocation.getArgument(0)));
[ecj-lint] ^^^
[ecj-lint] context cannot be resolved
[ecj-lint] --
[ecj-lint] 7. ERROR in /home/jenkins/jenkins-slave/workspace/Lucene-Solr-Tests-master/solr/contrib/dataimporthandler/src/test/org/apache/solr/handler/dataimport/MockInitialContextFactory.java (at line 38)
[ecj-lint] } catch (NamingException e) {
[ecj-lint] ^^^
[ecj-lint] NamingException cannot be resolved to a type
[ecj-lint] --
[ecj-lint] 8. ERROR in /home/jenkins/jenkins-slave/workspace/Lucene-Solr-Tests-master/solr/contrib/dataimporthandler/src/test/org/apache/solr/handler/dataimport/MockInitialContextFactory.java (at line 45)
[ecj-lint] public javax.naming.Context getInitialContext(Hashtable env) {
[ecj-lint]
[ecj-lint] The type javax.naming.Context is not accessible
[ecj-lint] --
[ecj-lint] 9. ERROR in /home/jenkins/jenkins-slave/workspace/Lucene-Solr-Tests-master/solr/contrib/dataimporthandler/src/test/org/apache/solr/handler/dataimport/MockInitialContextFactory.java (at line 46)
[ecj-lint] return context;
[ecj-lint] ^^^
[ecj-lint] context cannot be resolved to a variable
[ecj-lint] --
[ecj-lint] 9 problems (9 errors)

BUILD FAILED
/home/jenkins/jenkins-slave/workspace/Lucene-Solr-Tests-master/build.xml:634: The following error occurred while executing this line:
/home/jenkins/jenkins-slave/workspace/Lucene-Solr-Tests-master/build.xml:101: The following error occurred while executing this line:
/home/jenkins/jenkins-slave/workspace/Lucene-Solr-Tests-master/solr/build.xml:651: The following error occurred while executing this line:
/home/jenkins/jenkins-slave/workspace/Lucene-Solr-Tests-master/solr/common-build.xml:479: The following error occurred while executing this line:
/home/jenkins/jenkins-slave/workspace/Lucene-Solr-Tests-master/lucene/common-build.xml:2015: The following error occurred while executing this line:
[jira] [Updated] (SOLR-13399) compositeId support for shard splitting
[ https://issues.apache.org/jira/browse/SOLR-13399?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Yonik Seeley updated SOLR-13399: Attachment: SOLR-13399_useId.patch Status: Reopened (was: Reopened) Here's an enhancement that uses the "id" field for histogram generation if there is nothing found in the "id_prefix" field. > compositeId support for shard splitting > --- > > Key: SOLR-13399 > URL: https://issues.apache.org/jira/browse/SOLR-13399 > Project: Solr > Issue Type: New Feature >Reporter: Yonik Seeley >Assignee: Yonik Seeley >Priority: Major > Fix For: 8.3 > > Attachments: SOLR-13399.patch, SOLR-13399.patch, > SOLR-13399_testfix.patch, SOLR-13399_useId.patch > > > Shard splitting does not currently have a way to automatically take into > account the actual distribution (number of documents) in each hash bucket > created by using compositeId hashing. > We should probably add a parameter *splitByPrefix* to the *SPLITSHARD* > command that would look at the number of docs sharing each compositeId prefix > and use that to create roughly equal-sized buckets by document count rather > than just assuming an equal distribution across the entire hash range. > Like normal shard splitting, we should bias against splitting within hash > buckets unless necessary (since that leads to larger query fanout). Perhaps > this warrants a parameter that would control how much of a size mismatch is > tolerable before resorting to splitting within a bucket. > *allowedSizeDifference*? > To more quickly calculate the number of docs in each bucket, we could index > the prefix in a different field. Iterating over the terms for this field > would quickly give us the number of docs in each (i.e., Lucene keeps track of > the doc count for each term already). Perhaps the implementation could be a > flag on the *id* field... 
something like *indexPrefixes* and poly-fields that > would cause the indexing to be automatically done and alleviate having to > pass in an additional field during indexing and during the call to > *SPLITSHARD*. This whole part is an optimization though and could be split > off into its own issue if desired. > > -- This message was sent by Atlassian JIRA (v7.6.14#76016) - To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org For additional commands, e-mail: dev-h...@lucene.apache.org
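The bucketing scheme proposed above can be sketched in a few lines. This is a hypothetical illustration, not the actual SOLR-13399 patch: it assumes the per-prefix document counts have already been read off the term dictionary (Lucene tracks the doc count per term), in hash-range order, and it picks the bucket boundary that leaves the two halves closest to equal by document count without ever splitting inside a prefix bucket.

```java
import java.util.LinkedHashMap;

public class SplitByPrefixSketch {

    /**
     * prefixCounts maps each compositeId prefix to its document count,
     * iterated in hash-range order. Returns the index i such that prefixes
     * [0, i) go to the first sub-shard and [i, n) to the second, minimizing
     * the difference in document counts between the two halves.
     */
    static int splitPoint(LinkedHashMap<String, Integer> prefixCounts) {
        long total = 0;
        for (int c : prefixCounts.values()) total += c;

        long left = 0;                 // docs assigned to the first sub-shard so far
        long bestDiff = Long.MAX_VALUE;
        int best = 1, i = 0;
        for (int c : prefixCounts.values()) {
            if (++i == prefixCounts.size()) break; // a split after the last bucket is no split
            left += c;
            long diff = Math.abs(left - (total - left));
            if (diff < bestDiff) { bestDiff = diff; best = i; }
        }
        return best;
    }
}
```

A real implementation would additionally honor something like the proposed *allowedSizeDifference* before resorting to splitting within a single prefix bucket.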
[JENKINS] Lucene-Solr-master-Linux (64bit/jdk-11.0.3) - Build # 24493 - Failure!
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-master-Linux/24493/ Java: 64bit/jdk-11.0.3 -XX:+UseCompressedOops -XX:+UseConcMarkSweepGC All tests passed Build Log: [...truncated 697 lines...] [junit4] JVM J2: stdout was not empty, see: /home/jenkins/workspace/Lucene-Solr-master-Linux/lucene/build/core/test/temp/junit4-J2-20190803_203122_3071683110053458567.sysout [junit4] >>> JVM J2 emitted unexpected output (verbatim) [junit4] # [junit4] # A fatal error has been detected by the Java Runtime Environment: [junit4] # [junit4] # SIGSEGV (0xb) at pc=0x7fca53e7bc7c, pid=12921, tid=14856 [junit4] # [junit4] # JRE version: OpenJDK Runtime Environment (11.0.3+7) (build 11.0.3+7) [junit4] # Java VM: OpenJDK 64-Bit Server VM (11.0.3+7, mixed mode, tiered, compressed oops, concurrent mark sweep gc, linux-amd64) [junit4] # Problematic frame: [junit4] # V [libjvm.so+0xd05c7c] PhaseIdealLoop::split_up(Node*, Node*, Node*) [clone .part.39]+0x47c [junit4] # [junit4] # No core dump will be written. Core dumps have been disabled. 
To enable core dumping, try "ulimit -c unlimited" before starting Java again [junit4] # [junit4] # An error report file with more information is saved as: [junit4] # /home/jenkins/workspace/Lucene-Solr-master-Linux/lucene/build/core/test/J2/hs_err_pid12921.log [junit4] # [junit4] # Compiler replay data is saved as: [junit4] # /home/jenkins/workspace/Lucene-Solr-master-Linux/lucene/build/core/test/J2/replay_pid12921.log [junit4] # [junit4] # If you would like to submit a bug report, please visit: [junit4] # https://github.com/AdoptOpenJDK/openjdk-build/issues [junit4] # [junit4] <<< JVM J2: EOF [junit4] JVM J2: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-master-Linux/lucene/build/core/test/temp/junit4-J2-20190803_203122_3072763273816315985830.syserr [junit4] >>> JVM J2 emitted unexpected output (verbatim) [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release. [junit4] <<< JVM J2: EOF [...truncated 1206 lines...] [junit4] JVM J0: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-master-Linux/lucene/build/core/test/temp/junit4-J0-20190803_203122_30716284346697956885092.syserr [junit4] >>> JVM J0 emitted unexpected output (verbatim) [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release. [junit4] <<< JVM J0: EOF [...truncated 5 lines...] [junit4] JVM J1: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-master-Linux/lucene/build/core/test/temp/junit4-J1-20190803_203122_30714744437072321401193.syserr [junit4] >>> JVM J1 emitted unexpected output (verbatim) [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release. [junit4] <<< JVM J1: EOF [...truncated 3 lines...] 
[junit4] ERROR: JVM J2 ended with an exception, command line: /home/jenkins/tools/java/64bit/jdk-11.0.3/bin/java -XX:+UseCompressedOops -XX:+UseConcMarkSweepGC -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/home/jenkins/workspace/Lucene-Solr-master-Linux/heapdumps -ea -esa --illegal-access=deny -Dtests.prefix=tests -Dtests.seed=FE621E82775C25E -Xmx512M -Dtests.iters= -Dtests.verbose=false -Dtests.infostream=false -Dtests.codec=random -Dtests.postingsformat=random -Dtests.docvaluesformat=random -Dtests.locale=random -Dtests.timezone=random -Dtests.directory=random -Dtests.linedocsfile=europarl.lines.txt.gz -Dtests.luceneMatchVersion=9.0.0 -Dtests.cleanthreads=perMethod -Djava.util.logging.config.file=/home/jenkins/workspace/Lucene-Solr-master-Linux/lucene/tools/junit4/logging.properties -Dtests.nightly=false -Dtests.weekly=false -Dtests.monster=false -Dtests.slow=true -Dtests.asserts=true -Dtests.multiplier=3 -DtempDir=./temp -Djava.io.tmpdir=./temp -Dcommon.dir=/home/jenkins/workspace/Lucene-Solr-master-Linux/lucene -Dclover.db.dir=/home/jenkins/workspace/Lucene-Solr-master-Linux/lucene/build/clover/db -Djava.security.policy=/home/jenkins/workspace/Lucene-Solr-master-Linux/lucene/tools/junit4/tests.policy -Dtests.LUCENE_VERSION=9.0.0 -Djetty.testMode=1 -Djetty.insecurerandom=1 -Dsolr.directoryFactory=org.apache.solr.core.MockDirectoryFactory -Djava.awt.headless=true -Djdk.map.althashing.threshold=0 -Dtests.src.home=/home/jenkins/workspace/Lucene-Solr-master-Linux -Djava.security.egd=file:/dev/./urandom -Djunit4.childvm.cwd=/home/jenkins/workspace/Lucene-Solr-master-Linux/lucene/build/core/test/J2 -Djunit4.tempDir=/home/jenkins/workspace/Lucene-Solr-master-Linux/lucene/build/core/test/temp -Djunit4.childvm.id=2 -Djunit4.childvm.count=3 -Dfile.encoding=ISO-8859-1
[jira] [Commented] (SOLR-13399) compositeId support for shard splitting
[ https://issues.apache.org/jira/browse/SOLR-13399?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16899487#comment-16899487 ] ASF subversion and git services commented on SOLR-13399: Commit 5b76555dace0a78ba219813be1740d5e14c9c0c7 in lucene-solr's branch refs/heads/branch_8x from Yonik Seeley [ https://gitbox.apache.org/repos/asf?p=lucene-solr.git;h=5b76555 ] SOLR-13399: fix splitByPrefix test
[jira] [Commented] (SOLR-13399) compositeId support for shard splitting
[ https://issues.apache.org/jira/browse/SOLR-13399?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16899486#comment-16899486 ] ASF subversion and git services commented on SOLR-13399: Commit b6c26f6c16130fd7ec9216b4f8798dc22aacb534 in lucene-solr's branch refs/heads/master from Yonik Seeley [ https://gitbox.apache.org/repos/asf?p=lucene-solr.git;h=b6c26f6 ] SOLR-13399: fix splitByPrefix test
[jira] [Commented] (SOLR-13105) A visual guide to Solr Math Expressions and Streaming Expressions
[ https://issues.apache.org/jira/browse/SOLR-13105?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16899477#comment-16899477 ] ASF subversion and git services commented on SOLR-13105: Commit b9566c9deb77d70f5340dafbfcd994eddf75f695 in lucene-solr's branch refs/heads/SOLR-13105-visual from Joel Bernstein [ https://gitbox.apache.org/repos/asf?p=lucene-solr.git;h=b9566c9 ] SOLR-13105: Add loading page > A visual guide to Solr Math Expressions and Streaming Expressions > - > > Key: SOLR-13105 > URL: https://issues.apache.org/jira/browse/SOLR-13105 > Project: Solr > Issue Type: New Feature >Reporter: Joel Bernstein >Assignee: Joel Bernstein >Priority: Major > Attachments: Screen Shot 2019-01-14 at 10.56.32 AM.png, Screen Shot > 2019-02-21 at 2.14.43 PM.png, Screen Shot 2019-03-03 at 2.28.35 PM.png, > Screen Shot 2019-03-04 at 7.47.57 PM.png, Screen Shot 2019-03-13 at 10.47.47 > AM.png, Screen Shot 2019-03-30 at 6.17.04 PM.png > > > Visualization is now a fundamental element of Solr Streaming Expressions and > Math Expressions. This ticket will create a visual guide to Solr Math > Expressions and Solr Streaming Expressions that includes *Apache Zeppelin* > visualization examples. > It will also cover using the JDBC expression to *analyze* and *visualize* > results from any JDBC compliant data source. > Intro from the guide: > {code:java} > Streaming Expressions exposes the capabilities of Solr Cloud as composable > functions. These functions provide a system for searching, transforming, > analyzing and visualizing data stored in Solr Cloud collections. > At a high level there are four main capabilities that will be explored in the > documentation: > * Searching, sampling and aggregating results from Solr. > * Transforming result sets after they are retrieved from Solr. > * Analyzing and modeling result sets using probability and statistics and > machine learning libraries. > * Visualizing result sets, aggregations and statistical models of the data. 
> {code} > > A few sample visualizations are attached to the ticket.
[JENKINS] Lucene-Solr-Tests-8.x - Build # 344 - Still Unstable
Build: https://builds.apache.org/job/Lucene-Solr-Tests-8.x/344/ 1 tests failed. FAILED: org.apache.solr.handler.admin.StatsReloadRaceTest.testParallelReloadAndStats Error Message: Key SEARCHER.searcher.indexVersion not found in registry solr.core.collection1 Stack Trace: java.lang.AssertionError: Key SEARCHER.searcher.indexVersion not found in registry solr.core.collection1 at __randomizedtesting.SeedInfo.seed([49C8C2382EBA0661:8656A701A14B6E3E]:0) at org.junit.Assert.fail(Assert.java:88) at org.junit.Assert.assertTrue(Assert.java:41) at org.apache.solr.handler.admin.StatsReloadRaceTest.requestMetrics(StatsReloadRaceTest.java:143) at org.apache.solr.handler.admin.StatsReloadRaceTest.testParallelReloadAndStats(StatsReloadRaceTest.java:77) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1750) at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:938) at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:974) at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:988) at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57) at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45) at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48) at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64) at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368) at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817) at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468) at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:947) at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:832) at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:883) at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:894) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at 
org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53) at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47) at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64) at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368) at java.lang.Thread.run(Thread.java:748) Build Log: [...truncated 14013 lines...] [junit4] Suite: org.apache.solr.handler.admin.StatsReloadRaceTest [junit4]
[jira] [Commented] (SOLR-13679) Different default Style for [explain] doctransformer registered in solrconfig.xml
[ https://issues.apache.org/jira/browse/SOLR-13679?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16899472#comment-16899472 ] Lucene/Solr QA commented on SOLR-13679: ---
| (/) *{color:green}+1 overall{color}* |

|| Vote || Subsystem || Runtime || Comment ||
|| || || || {color:brown} Prechecks {color} ||
| {color:green}+1{color} | {color:green} test4tests {color} | {color:green} 0m 0s{color} | {color:green} The patch appears to include 1 new or modified test files. {color} |
|| || || || {color:brown} master Compile Tests {color} ||
| {color:green}+1{color} | {color:green} compile {color} | {color:green} 2m 4s{color} | {color:green} master passed {color} |
|| || || || {color:brown} Patch Compile Tests {color} ||
| {color:green}+1{color} | {color:green} compile {color} | {color:green} 2m 8s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} javac {color} | {color:green} 2m 8s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} Release audit (RAT) {color} | {color:green} 2m 8s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} Check forbidden APIs {color} | {color:green} 2m 8s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} Validate source patterns {color} | {color:green} 2m 8s{color} | {color:green} the patch passed {color} |
|| || || || {color:brown} Other Tests {color} ||
| {color:green}+1{color} | {color:green} unit {color} | {color:green} 31m 55s{color} | {color:green} core in the patch passed. {color} |
| {color:black}{color} | {color:black} {color} | {color:black} 39m 13s{color} | {color:black} {color} |

|| Subsystem || Report/Notes ||
| JIRA Issue | SOLR-13679 |
| JIRA Patch URL | https://issues.apache.org/jira/secure/attachment/12976616/SOLR-13679.patch |
| Optional Tests | compile javac unit ratsources checkforbiddenapis validatesourcepatterns |
| uname | Linux lucene1-us-west 4.15.0-54-generic #58-Ubuntu SMP Mon Jun 24 10:55:24 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | ant |
| Personality | /home/jenkins/jenkins-slave/workspace/PreCommit-SOLR-Build/sourcedir/dev-tools/test-patch/lucene-solr-yetus-personality.sh |
| git revision | master / ff7b0c9de5 |
| ant | version: Apache Ant(TM) version 1.10.5 compiled on March 28 2019 |
| Default Java | LTS |
| Test Results | https://builds.apache.org/job/PreCommit-SOLR-Build/521/testReport/ |
| modules | C: solr/core U: solr/core |
| Console output | https://builds.apache.org/job/PreCommit-SOLR-Build/521/console |
| Powered by | Apache Yetus 0.7.0 http://yetus.apache.org |
This message was automatically generated. > Different default Style for [explain] doctransformer registered in > solrconfig.xml > -- > > Key: SOLR-13679 > URL: https://issues.apache.org/jira/browse/SOLR-13679 > Project: Solr > Issue Type: Bug > Security Level: Public (Default Security Level. Issues are Public) >Reporter: Munendra S N >Assignee: Munendra S N >Priority: Minor > Attachments: SOLR-13679.patch, SOLR-13679.patch > > > Adding the explain docTransformer via solrconfig.xml: > {code:java} > <transformer name="explain" class="org.apache.solr.response.transform.ExplainAugmenterFactory" /> > {code} > Here, no style is specified, so the default style is used, which is {{nl}}. > ExplainDocTransformer is part of defaultFactories, so a user can use it > without adding it in solrconfig.xml, but when used this way, the default style > used is {{text}}. > The default style should be the same in both cases. 
This behavior is the same as > registering ResponseWriters in solrconfig: if the content-type is not overridden, > then the default content-type is used
[jira] [Commented] (SOLR-13293) org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient - Error consuming and closing http response stream.
[ https://issues.apache.org/jira/browse/SOLR-13293?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16899459#comment-16899459 ] Alexander S. commented on SOLR-13293: - I just upgraded from Solr 5 to 8 and also seeing these errors. > org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient - Error > consuming and closing http response stream. > - > > Key: SOLR-13293 > URL: https://issues.apache.org/jira/browse/SOLR-13293 > Project: Solr > Issue Type: Bug > Components: SolrJ >Affects Versions: 8.0 >Reporter: Karl Stoney >Priority: Minor > > Hi, > Testing out branch_8x, we're randomly seeing the following errors on a simple > 3 node cluster. It doesn't appear to affect replication (the cluster remains > green). > They come in (mass, literally 1000s at a time) bulk. > There we no network issues at the time. > {code:java} > 16:53:01.492 [updateExecutor-4-thread-34-processing-x:at-uk_shard1_replica_n1 > r:core_node3 null n:solr-2.search-solr.preprod.k8.atcloud.io:80_solr c:at-uk > s:shard1] ERROR > org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient - Error > consuming and closing http response stream. 
> java.nio.channels.AsynchronousCloseException: null > at > org.eclipse.jetty.client.util.InputStreamResponseListener$Input.read(InputStreamResponseListener.java:316) > ~[jetty-client-9.4.14.v20181114.jar:9.4.14.v20181114] > at java.io.InputStream.read(InputStream.java:101) ~[?:1.8.0_191] > at > org.eclipse.jetty.client.util.InputStreamResponseListener$Input.read(InputStreamResponseListener.java:287) > ~[jetty-client-9.4.14.v20181114.jar:9.4.14.v20181114] > at > org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.sendUpdateStream(ConcurrentUpdateHttp2SolrClient.java:283) > ~[solr-solrj-8.1.0-SNAPSHOT.jar:8.1.0-SNAPSHOT > b14748e61fd147ea572f6545265b883fa69ed27f - root > - 2019-03-04 16:30:04] > at > org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.run(ConcurrentUpdateHttp2SolrClient.java:176) > ~[solr-solrj-8.1.0-SNAPSHOT.jar:8.1.0-SNAPSHOT > b14748e61fd147ea572f6545265b883fa69ed27f - root - 2019-03-04 > 16:30:04] > at > com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:176) > ~[metrics-core-3.2.6.jar:3.2.6] > at > org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:209) > ~[solr-solrj-8.1.0-SNAPSHOT.jar:8.1.0-SNAPSHOT > b14748e61fd147ea572f6545265b883fa69ed27f - root - 2019-03-04 16:30:04] > at > java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) > [?:1.8.0_191] > at > java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) > [?:1.8.0_191] > at java.lang.Thread.run(Thread.java:748) [?:1.8.0_191] > {code} -- This message was sent by Atlassian JIRA (v7.6.14#76016) - To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org For additional commands, e-mail: dev-h...@lucene.apache.org
[jira] [Commented] (SOLR-13679) Different default Style for [explain] doctransformer registered in solrconfig.xml
[ https://issues.apache.org/jira/browse/SOLR-13679?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16899458#comment-16899458 ] Munendra S N commented on SOLR-13679: - [^SOLR-13679.patch] Change the default style to {{text}}, the value used when the docTransformer is registered via {{defaultFactories}}.
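The fix described in this comment (change the default style to {{text}}) comes down to resolving the style in one place so that both registration paths share the same fallback. Here is a minimal sketch of that idea, assuming a {{Style}} enum like the one in {{ExplainAugmenterFactory}}; the method name and the shape of the fallback are illustrative, not the actual patch:

```java
import java.util.Locale;

public class ExplainStyleSketch {
    enum Style { NL, TEXT, HTML }

    /**
     * Resolve the explain style from the (possibly absent) "style" init
     * parameter. Whether the transformer was registered explicitly in
     * solrconfig.xml or picked up from the default factories, a missing
     * style falls back to the same shared default.
     */
    static Style resolveStyle(String configured) {
        if (configured == null || configured.isEmpty()) {
            return Style.TEXT; // one default for both registration paths
        }
        return Style.valueOf(configured.toUpperCase(Locale.ROOT));
    }
}
```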
[jira] [Updated] (SOLR-13679) Different default Style for [explain] doctransformer registered in solrconfig.xml
[ https://issues.apache.org/jira/browse/SOLR-13679?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Munendra S N updated SOLR-13679: Attachment: SOLR-13679.patch
[JENKINS] Lucene-Solr-master-Linux (64bit/jdk-11.0.3) - Build # 24491 - Unstable!
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-master-Linux/24491/ Java: 64bit/jdk-11.0.3 -XX:-UseCompressedOops -XX:+UseG1GC 1 tests failed. FAILED: org.apache.solr.cloud.AliasIntegrationTest.testClusterStateProviderAPI Error Message: {} expected:<2> but was:<0> Stack Trace: java.lang.AssertionError: {} expected:<2> but was:<0> at __randomizedtesting.SeedInfo.seed([E160E4E2E451DD25:FEB778CE975A246E]:0) at org.junit.Assert.fail(Assert.java:88) at org.junit.Assert.failNotEquals(Assert.java:834) at org.junit.Assert.assertEquals(Assert.java:645) at org.apache.solr.cloud.AliasIntegrationTest.testClusterStateProviderAPI(AliasIntegrationTest.java:303) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1750) at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:938) at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:974) at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:988) at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57) at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45) at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48) at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64) at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47) at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368) at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817) at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468) at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:947) at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:832) at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:883) at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:894) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53) at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47) at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64) at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368) at java.base/java.lang.Thread.run(Thread.java:834) Build Log: [...truncated 14225 lines...] [junit4] Suite: org.apache.solr.cloud.AliasIntegrationTest [junit4] 2> 1154708 INFO (SUITE-AliasIntegrationTest-seed#[E160E4E2E451DD25]-worker) [ ] o.a.s.SolrTestCaseJ4 Created dataDir:
[JENKINS] Lucene-Solr-Tests-8.x - Build # 343 - Still Unstable
Build: https://builds.apache.org/job/Lucene-Solr-Tests-8.x/343/ 1 tests failed. FAILED: org.apache.solr.security.BasicAuthIntegrationTest.testBasicAuth Error Message: Expected metric minimums for prefix SECURITY./authentication.: {failMissingCredentials=2, authenticated=20, passThrough=9, failWrongCredentials=1, requests=32, errors=0}, but got: {failMissingCredentials=2, authenticated=18, passThrough=11, totalTime=151208858, failWrongCredentials=1, requestTimes=248, requests=32, errors=0} Stack Trace: java.lang.AssertionError: Expected metric minimums for prefix SECURITY./authentication.: {failMissingCredentials=2, authenticated=20, passThrough=9, failWrongCredentials=1, requests=32, errors=0}, but got: {failMissingCredentials=2, authenticated=18, passThrough=11, totalTime=151208858, failWrongCredentials=1, requestTimes=248, requests=32, errors=0} at __randomizedtesting.SeedInfo.seed([A9403E1A7CA165E3:152E4808D8F2E699]:0) at org.junit.Assert.fail(Assert.java:88) at org.junit.Assert.assertTrue(Assert.java:41) at org.apache.solr.cloud.SolrCloudAuthTestCase.assertAuthMetricsMinimums(SolrCloudAuthTestCase.java:129) at org.apache.solr.cloud.SolrCloudAuthTestCase.assertAuthMetricsMinimums(SolrCloudAuthTestCase.java:83) at org.apache.solr.security.BasicAuthIntegrationTest.testBasicAuth(BasicAuthIntegrationTest.java:302) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1750) at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:938) at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:974) at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:988) at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57) at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45) at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48) at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64) at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368) at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817) at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468) at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:947) at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:832) at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:883) at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:894) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41) at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53) at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47) at
[jira] [Commented] (SOLR-13679) Different default Style for [explain] doctransformer registered in solrconfig.xml
[ https://issues.apache.org/jira/browse/SOLR-13679?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16899448#comment-16899448 ] Lucene/Solr QA commented on SOLR-13679: --- | (x) *{color:red}-1 overall{color}* | \\ \\ || Vote || Subsystem || Runtime || Comment || || || || || {color:brown} Prechecks {color} || | {color:green}+1{color} | {color:green} test4tests {color} | {color:green} 0m 0s{color} | {color:green} The patch appears to include 1 new or modified test files. {color} | || || || || {color:brown} master Compile Tests {color} || | {color:green}+1{color} | {color:green} compile {color} | {color:green} 2m 6s{color} | {color:green} master passed {color} | || || || || {color:brown} Patch Compile Tests {color} || | {color:green}+1{color} | {color:green} compile {color} | {color:green} 2m 6s{color} | {color:green} the patch passed {color} | | {color:green}+1{color} | {color:green} javac {color} | {color:green} 2m 6s{color} | {color:green} the patch passed {color} | | {color:green}+1{color} | {color:green} Release audit (RAT) {color} | {color:green} 2m 6s{color} | {color:green} the patch passed {color} | | {color:green}+1{color} | {color:green} Check forbidden APIs {color} | {color:green} 2m 6s{color} | {color:green} the patch passed {color} | | {color:green}+1{color} | {color:green} Validate source patterns {color} | {color:green} 2m 6s{color} | {color:green} the patch passed {color} | || || || || {color:brown} Other Tests {color} || | {color:red}-1{color} | {color:red} unit {color} | {color:red} 33m 46s{color} | {color:red} core in the patch failed. 
{color} | | {color:black}{color} | {color:black} {color} | {color:black} 41m 1s{color} | {color:black} {color} | \\ \\ || Reason || Tests || | Failed junit tests | solr.search.TestPseudoReturnFields | | | solr.cloud.TestCloudPseudoReturnFields | \\ \\ || Subsystem || Report/Notes || | JIRA Issue | SOLR-13679 | | JIRA Patch URL | https://issues.apache.org/jira/secure/attachment/12976611/SOLR-13679.patch | | Optional Tests | compile javac unit ratsources checkforbiddenapis validatesourcepatterns | | uname | Linux lucene1-us-west 4.15.0-54-generic #58-Ubuntu SMP Mon Jun 24 10:55:24 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | ant | | Personality | /home/jenkins/jenkins-slave/workspace/PreCommit-SOLR-Build/sourcedir/dev-tools/test-patch/lucene-solr-yetus-personality.sh | | git revision | master / ff7b0c9de5 | | ant | version: Apache Ant(TM) version 1.10.5 compiled on March 28 2019 | | Default Java | LTS | | unit | https://builds.apache.org/job/PreCommit-SOLR-Build/520/artifact/out/patch-unit-solr_core.txt | | Test Results | https://builds.apache.org/job/PreCommit-SOLR-Build/520/testReport/ | | modules | C: solr/core U: solr/core | | Console output | https://builds.apache.org/job/PreCommit-SOLR-Build/520/console | | Powered by | Apache Yetus 0.7.0 http://yetus.apache.org | This message was automatically generated.
[jira] [Commented] (SOLR-13105) A visual guide to Solr Math Expressions and Streaming Expressions
[ https://issues.apache.org/jira/browse/SOLR-13105?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16899441#comment-16899441 ] ASF subversion and git services commented on SOLR-13105: Commit 67784c37c460354f589873688ea6561c05edeb8d in lucene-solr's branch refs/heads/SOLR-13105-visual from Joel Bernstein [ https://gitbox.apache.org/repos/asf?p=lucene-solr.git;h=67784c3 ] SOLR-13105: Add csv viz > A visual guide to Solr Math Expressions and Streaming Expressions > - > > Key: SOLR-13105 > URL: https://issues.apache.org/jira/browse/SOLR-13105 > Project: Solr > Issue Type: New Feature >Reporter: Joel Bernstein >Assignee: Joel Bernstein >Priority: Major > Attachments: Screen Shot 2019-01-14 at 10.56.32 AM.png, Screen Shot > 2019-02-21 at 2.14.43 PM.png, Screen Shot 2019-03-03 at 2.28.35 PM.png, > Screen Shot 2019-03-04 at 7.47.57 PM.png, Screen Shot 2019-03-13 at 10.47.47 > AM.png, Screen Shot 2019-03-30 at 6.17.04 PM.png > > > Visualization is now a fundamental element of Solr Streaming Expressions and > Math Expressions. This ticket will create a visual guide to Solr Math > Expressions and Solr Streaming Expressions that includes *Apache Zeppelin* > visualization examples. > It will also cover using the JDBC expression to *analyze* and *visualize* > results from any JDBC compliant data source. > Intro from the guide: > {code:java} > Streaming Expressions exposes the capabilities of Solr Cloud as composable > functions. These functions provide a system for searching, transforming, > analyzing and visualizing data stored in Solr Cloud collections. > At a high level there are four main capabilities that will be explored in the > documentation: > * Searching, sampling and aggregating results from Solr. > * Transforming result sets after they are retrieved from Solr. > * Analyzing and modeling result sets using probability and statistics and > machine learning libraries. > * Visualizing result sets, aggregations and statistical models of the data. 
> {code} > > A few sample visualizations are attached to the ticket.
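As a concrete illustration of the "composable functions" the intro describes, a basic streaming expression has this shape (the collection and field names here are hypothetical, chosen only to show the syntax):

```text
search(logs,
       q="*:*",
       fl="id, response_time",
       sort="response_time desc")
```

Such a search source can then be wrapped by transforming and analyzing functions, which is what the visual guide walks through.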
[jira] [Commented] (SOLR-13105) A visual guide to Solr Math Expressions and Streaming Expressions
[ https://issues.apache.org/jira/browse/SOLR-13105?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16899439#comment-16899439 ] ASF subversion and git services commented on SOLR-13105: Commit ec221085a831d60355042ab6b01f50bf072e0fcd in lucene-solr's branch refs/heads/SOLR-13105-visual from Joel Bernstein [ https://gitbox.apache.org/repos/asf?p=lucene-solr.git;h=ec22108 ] SOLR-13105: Change TOC
[jira] [Commented] (SOLR-13105) A visual guide to Solr Math Expressions and Streaming Expressions
[ https://issues.apache.org/jira/browse/SOLR-13105?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16899438#comment-16899438 ] ASF subversion and git services commented on SOLR-13105: Commit fd2e8400402cbd2d86f4ce05b84fbb49612a0f17 in lucene-solr's branch refs/heads/SOLR-13105-visual from Joel Bernstein [ https://gitbox.apache.org/repos/asf?p=lucene-solr.git;h=fd2e840 ] SOLR-13105: Add Data Loading
[jira] [Commented] (LUCENE-8764) Add "export all terms" feature to Luke
[ https://issues.apache.org/jira/browse/LUCENE-8764?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16899432#comment-16899432 ] Tomoko Uchida commented on LUCENE-8764: --- FYI, I opened a follow-up issue, LUCENE-8945. > Add "export all terms" feature to Luke > -- > > Key: LUCENE-8764 > URL: https://issues.apache.org/jira/browse/LUCENE-8764 > Project: Lucene - Core > Issue Type: Improvement > Components: modules/luke > Reporter: Tomoko Uchida > Assignee: Tomoko Uchida > Priority: Major > Labels: beginner > Fix For: master (9.0), 8.3 > > Attachments: LUCENE-8764.patch, LUCENE-8764.patch, LUCENE-8764.patch, > LUCENE-8764.patch, Screenshot 2019-07-23 12.29.06.png, Screenshot 2019-07-24 > 12.35.48.png, Screenshot 2019-07-24 12.36.00.png, Screenshot 2019-07-24 > 12.36.27.png, Screenshot 2019-07-25 13.20.40.png, Screenshot 2019-07-25 > 13.20.48.png, Screenshot 2019-07-25 13.21.03.png, Screenshot 2019-07-25 > 13.25.23.png > > > This is a migrated issue from the previous Luke project on GitHub: > [https://github.com/DmitryKey/luke/issues/3] (there were user requests, so I > moved this from GitHub to Jira). > You can browse terms in an arbitrary field via the Luke GUI, but in some cases > an "export all terms (and optionally docids) to a file" feature would be > useful for further inspection. It might be similar to Solr's terms component. > As for the user interface, an "Export terms" button should be located in the > Overview tab and/or Documents tab.
[jira] [Created] (LUCENE-8945) Allow to change the output file delimiter on Luke "export terms" feature
Tomoko Uchida created LUCENE-8945: - Summary: Allow to change the output file delimiter on Luke "export terms" feature Key: LUCENE-8945 URL: https://issues.apache.org/jira/browse/LUCENE-8945 Project: Lucene - Core Issue Type: Improvement Components: modules/luke Reporter: Tomoko Uchida This is a follow-up issue for LUCENE-8764. The current delimiter is fixed to "," (comma), but terms can also include commas, and they are not escaped. It would be better if the delimiter could be changed to a tab or whitespace when exporting.
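The escaping concern behind this issue can be sketched with RFC 4180-style field quoting. This is an illustrative sketch of the general technique, not Luke's actual export code, and the class/method names are hypothetical; the alternative the issue proposes is simply choosing a delimiter (tab, whitespace) that cannot occur in the exported terms.

```java
// Illustrative sketch (hypothetical helper, not part of Luke): quote a CSV
// field so an embedded delimiter survives a round trip through the file.
public class CsvField {

    /** Quote the field if it contains the delimiter, a double quote, or a newline. */
    static String escape(String field, char delimiter) {
        boolean needsQuoting = field.indexOf(delimiter) >= 0
                || field.indexOf('"') >= 0
                || field.indexOf('\n') >= 0;
        if (!needsQuoting) {
            return field;
        }
        // Per RFC 4180: double any embedded quotes, then wrap the field in quotes.
        return "\"" + field.replace("\"", "\"\"") + "\"";
    }

    public static void main(String[] args) {
        System.out.println(escape("plain", ','));    // plain
        System.out.println(escape("foo,bar", ','));  // "foo,bar"
    }
}
```

Without quoting like this, a term `foo,bar` in a comma-delimited export is indistinguishable from two fields, which is exactly the ambiguity the issue describes.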
[jira] [Resolved] (LUCENE-8764) Add "export all terms" feature to Luke
[ https://issues.apache.org/jira/browse/LUCENE-8764?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Tomoko Uchida resolved LUCENE-8764. --- Resolution: Fixed Assignee: Tomoko Uchida Fix Version/s: 8.3, master (9.0)
[jira] [Updated] (LUCENE-8764) Add "export all terms" feature to Luke
[ https://issues.apache.org/jira/browse/LUCENE-8764?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Tomoko Uchida updated LUCENE-8764: -- Attachment: LUCENE-8764.patch
[jira] [Commented] (LUCENE-8764) Add "export all terms" feature to Luke
[ https://issues.apache.org/jira/browse/LUCENE-8764?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16899429#comment-16899429 ] Tomoko Uchida commented on LUCENE-8764: --- Here is the final patch: [^LUCENE-8764.patch]
[jira] [Commented] (LUCENE-8764) Add "export all terms" feature to Luke
[ https://issues.apache.org/jira/browse/LUCENE-8764?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16899427#comment-16899427 ] Tomoko Uchida commented on LUCENE-8764: --- I made small changes to the patch: - Fix precommit failures - Improve error handling - Improve messages, adjust component layouts in the GUI The revised patch was committed on the master and 8x branches. Thanks [~lmenezes]!
[jira] [Commented] (LUCENE-8764) Add "export all terms" feature to Luke
[ https://issues.apache.org/jira/browse/LUCENE-8764?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16899423#comment-16899423 ] ASF subversion and git services commented on LUCENE-8764: - Commit b4ef1b279c1f831294aab8a24e6bcb0279f9402f in lucene-solr's branch refs/heads/branch_8x from Leonardo Menezes [ https://gitbox.apache.org/repos/asf?p=lucene-solr.git;h=b4ef1b2 ] LUCENE-8764: Add "export all terms" feature to Luke Co-authored-by: Tomoko Uchida
[jira] [Commented] (LUCENE-8764) Add "export all terms" feature to Luke
[ https://issues.apache.org/jira/browse/LUCENE-8764?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16899422#comment-16899422 ] ASF subversion and git services commented on LUCENE-8764: - Commit ff7b0c9de5ad41b18fe6c57c699be63a472eeb92 in lucene-solr's branch refs/heads/master from Leonardo Menezes [ https://gitbox.apache.org/repos/asf?p=lucene-solr.git;h=ff7b0c9 ] LUCENE-8764: Add "export all terms" feature to Luke Co-authored-by: Tomoko Uchida
[jira] [Assigned] (SOLR-13679) Different default Style for [explain] doctransformer registered in solrconfig.xml
[ https://issues.apache.org/jira/browse/SOLR-13679?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Munendra S N reassigned SOLR-13679: --- Assignee: Munendra S N > Different default Style for [explain] doctransformer registered in > solrconfig.xml > -- > > Key: SOLR-13679 > URL: https://issues.apache.org/jira/browse/SOLR-13679 > Project: Solr > Issue Type: Bug > Security Level: Public (Default Security Level. Issues are Public) >Reporter: Munendra S N >Assignee: Munendra S N >Priority: Minor > Attachments: SOLR-13679.patch > > > Adding the explain docTransformer via solrconfig.xml: > {code:java} > <transformer name="explain" class="org.apache.solr.response.transform.ExplainAugmenterFactory" /> > {code} > Here, no style is specified, so the default style, {{nl}}, is used. > ExplainDocTransformer is also part of the defaultFactories, so users can use it > without registering it in solrconfig.xml; but when used that way, the default > style is {{text}}. > The default style should be the same in both cases. This matches the behavior of > registering ResponseWriters in solrconfig: if the content-type is not overridden, > the default content-type is used.
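Until the defaults are unified, one hedged workaround is to pin the style explicitly at registration time so both code paths agree. This is a sketch in the ref-guide's registration style; the `explain` name and the `args` init parameter are assumptions for illustration, not taken from the patch:

```xml
<!-- Register the transformer with an explicit style instead of relying on a default -->
<transformer name="explain"
             class="org.apache.solr.response.transform.ExplainAugmenterFactory">
  <str name="args">nl</str>
</transformer>
```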
[jira] [Commented] (SOLR-13679) Different default Style for [explain] doctransformer registered in solrconfig.xml
[ https://issues.apache.org/jira/browse/SOLR-13679?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16899420#comment-16899420 ] Munendra S N commented on SOLR-13679: - [^SOLR-13679.patch] Default style for ExplainAugmenter is set to {{nl}}. This could have backward compatibility issues, so it will be pushed only to master.
[jira] [Updated] (SOLR-13679) Different default Style for [explain] doctransformer registered in solrconfig.xml
[ https://issues.apache.org/jira/browse/SOLR-13679?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Munendra S N updated SOLR-13679: Status: Patch Available (was: Open)
[jira] [Updated] (SOLR-13679) Different default Style for [explain] doctransformer registered in solrconfig.xml
[ https://issues.apache.org/jira/browse/SOLR-13679?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Munendra S N updated SOLR-13679: Attachment: SOLR-13679.patch
[GitHub] [lucene-solr] munendrasn commented on a change in pull request #819: SOLR-13676: Reduce log verbosity in TestDistributedGrouping
munendrasn commented on a change in pull request #819: SOLR-13676: Reduce log verbosity in TestDistributedGrouping URL: https://github.com/apache/lucene-solr/pull/819#discussion_r310343627 ## File path: solr/core/src/test/org/apache/solr/TestDistributedGrouping.java ## @@ -29,8 +29,11 @@ import org.apache.solr.common.params.ModifiableSolrParams; import org.apache.solr.common.util.NamedList; import org.apache.solr.SolrTestCaseJ4.SuppressPointFields; +import org.junit.Assert; Review comment: I think we don't need this import. All test cases extend `Assert` This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services
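The reviewer's point, that the `org.junit.Assert` import is redundant, follows from JUnit's class hierarchy: `Assert`'s methods are `public static`, and the test base classes extend it, so subclasses can call them unqualified. A minimal stdlib-only mirror of that hierarchy (the class names below are illustrative stand-ins, not JUnit's real classes):

```java
public class Main {
    // Stand-in for org.junit.Assert: public static assertion methods.
    static class Assert {
        public static void assertTrue(boolean condition) {
            if (!condition) throw new AssertionError("expected true");
        }
    }

    // Stand-in for a test base class like SolrTestCaseJ4, which
    // (transitively) extends Assert.
    static class BaseTestCase extends Assert { }

    static class MyTest extends BaseTestCase {
        boolean run() {
            // Resolves through inheritance; no import of Assert needed.
            assertTrue(2 + 2 == 4);
            return true;
        }
    }

    public static void main(String[] args) {
        System.out.println(new MyTest().run() ? "ok" : "fail"); // prints "ok"
    }
}
```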
[GitHub] [lucene-solr] munendrasn commented on a change in pull request #819: SOLR-13676: Reduce log verbosity in TestDistributedGrouping
munendrasn commented on a change in pull request #819: SOLR-13676: Reduce log verbosity in TestDistributedGrouping URL: https://github.com/apache/lucene-solr/pull/819#discussion_r310343648 ## File path: solr/core/src/test/org/apache/solr/TestDistributedGrouping.java ## @@ -221,12 +224,13 @@ public void test() throws Exception { "fl", "id", "group.format", "grouped", "group.limit", "-12", "sort", i1 + " asc, id asc"); +ignoreException("'group.offset' parameter cannot be negative"); Review comment: If we are doing this, shouldn't we call `resetExceptionIgnores()` after exception verification?
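The reset concern above boils down to a try/finally discipline: whatever a test adds to the ignore list should be cleared before the next test runs. This is a stdlib-only stand-in for SolrTestCaseJ4's `ignoreException`/`resetExceptionIgnores` pair; the real helpers manage a log-filtering pattern, so the set-based version below is an assumption for illustration only:

```java
import java.util.HashSet;
import java.util.Set;

public class Main {
    // Hypothetical stand-in for SolrTestCaseJ4's ignore list: error messages
    // matching a registered pattern are suppressed from the test logs.
    static final Set<String> IGNORED = new HashSet<>();

    static void ignoreException(String pattern) { IGNORED.add(pattern); }
    static void resetExceptionIgnores() { IGNORED.clear(); }

    static void testNegativeGroupOffset() {
        ignoreException("'group.offset' parameter cannot be negative");
        try {
            // ... issue the request expected to fail and assert on the error ...
        } finally {
            // Reset in finally so later tests don't silently inherit the filter.
            resetExceptionIgnores();
        }
    }

    public static void main(String[] args) {
        testNegativeGroupOffset();
        System.out.println("ignored after test: " + IGNORED.size()); // prints "ignored after test: 0"
    }
}
```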
[jira] [Assigned] (SOLR-13676) Reduce log verbosity in TestDistributedGrouping using ignoreException
[ https://issues.apache.org/jira/browse/SOLR-13676?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Munendra S N reassigned SOLR-13676: --- Assignee: Munendra S N > Reduce log verbosity in TestDistributedGrouping using ignoreException > - > > Key: SOLR-13676 > URL: https://issues.apache.org/jira/browse/SOLR-13676 > Project: Solr > Issue Type: Improvement > Security Level: Public (Default Security Level. Issues are Public) > Components: SolrCloud >Reporter: Diego Ceccarelli >Assignee: Munendra S N >Priority: Minor > Time Spent: 0.5h > Remaining Estimate: 0h > > SOLR-13404 added a test that expects Solr to fail if grouping is called with > {{group.offset < 0}}. When the test runs it succeeds, but the whole stack > trace is printed out in the logs. This small patch avoids the stack trace by > using {{ignoreException}}.
[jira] [Commented] (SOLR-12555) Replace try-fail-catch test patterns
[ https://issues.apache.org/jira/browse/SOLR-12555?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16899407#comment-16899407 ] ASF subversion and git services commented on SOLR-12555: Commit 488c75fb555cdf704cf93b66cc0fdafe2896d159 in lucene-solr's branch refs/heads/branch_8x from Munendra S N [ https://gitbox.apache.org/repos/asf?p=lucene-solr.git;h=488c75f ] SOLR-12555: use expectThrows() to verify the ex thrown in tests > Replace try-fail-catch test patterns > > > Key: SOLR-12555 > URL: https://issues.apache.org/jira/browse/SOLR-12555 > Project: Solr > Issue Type: Test > Components: Tests >Affects Versions: 8.0 >Reporter: Jason Gerlowski >Assignee: Jason Gerlowski >Priority: Trivial > Attachments: SOLR-12555-sorted-by-package.txt, SOLR-12555.patch, > SOLR-12555.patch, SOLR-12555.patch, SOLR-12555.patch, SOLR-12555.patch, > SOLR-12555.txt > > Time Spent: 4h 20m > Remaining Estimate: 0h > > I recently added some test code through SOLR-12427 which used the following > test anti-pattern: > {code} > try { > actionExpectedToThrowException(); > fail("I expected this to throw an exception, but it didn't"); > } catch (Exception e) { > assertOnThrownException(e); > } > {code} > Hoss (rightfully) objected that this should instead be written using the > formulation below, which is clearer and more concise. > {code} > SolrException e = expectThrows(SolrException.class, () -> {...}); > {code} > We should remove many of these older formulations where it makes sense. Many > of them were written before {{expectThrows}} was introduced, and having the > old style assertions around makes it easier for them to continue creeping in.
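The refactor the issue asks for can be shown end-to-end with a self-contained stand-in for the test framework's `expectThrows` helper. This mirrors the helper's shape (an expected exception type plus a throwing lambda) but is a sketch, not the project's actual implementation:

```java
public class Main {
    interface ThrowingRunnable { void run() throws Throwable; }

    // Sketch of expectThrows: fails if nothing is thrown or the wrong type is
    // thrown; otherwise hands back the typed exception for further assertions.
    static <T extends Throwable> T expectThrows(Class<T> expectedType, ThrowingRunnable runnable) {
        try {
            runnable.run();
        } catch (Throwable t) {
            if (expectedType.isInstance(t)) return expectedType.cast(t);
            throw new AssertionError("unexpected exception type: " + t.getClass(), t);
        }
        throw new AssertionError("expected " + expectedType.getSimpleName() + " to be thrown");
    }

    public static void main(String[] args) {
        // One call replaces the whole try/fail/catch block.
        IllegalArgumentException e = expectThrows(IllegalArgumentException.class,
            () -> { throw new IllegalArgumentException("'group.offset' parameter cannot be negative"); });
        System.out.println(e.getMessage()); // prints "'group.offset' parameter cannot be negative"
    }
}
```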
[jira] [Commented] (SOLR-12555) Replace try-fail-catch test patterns
[ https://issues.apache.org/jira/browse/SOLR-12555?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16899404#comment-16899404 ] ASF subversion and git services commented on SOLR-12555: Commit 8c4fde94fe93c70b95ed3563fb65972bb303e0af in lucene-solr's branch refs/heads/master from Munendra S N [ https://gitbox.apache.org/repos/asf?p=lucene-solr.git;h=8c4fde9 ] SOLR-12555: use expectThrows() to verify the ex thrown in tests
[jira] [Created] (SOLR-13679) Different default Style for [explain] doctransformer registered in solrconfig.xml
Munendra S N created SOLR-13679: --- Summary: Different default Style for [explain] doctransformer registered in solrconfig.xml Key: SOLR-13679 URL: https://issues.apache.org/jira/browse/SOLR-13679 Project: Solr Issue Type: Bug Security Level: Public (Default Security Level. Issues are Public) Reporter: Munendra S N
[GitHub] [lucene-solr] MarcusSorealheis commented on a change in pull request #805: SOLR-13649 change the default behavior of the basic authentication plugin.
MarcusSorealheis commented on a change in pull request #805: SOLR-13649 change the default behavior of the basic authentication plugin. URL: https://github.com/apache/lucene-solr/pull/805#discussion_r310342305 ## File path: solr/CHANGES.txt ## @@ -57,6 +57,8 @@ Upgrade Notes * SOLR-13596: Deprecated GroupingSpecification methods are removed. (Munendra S N) +* SOLR-13649: When Basic Authentication is enabled, users will be required to enter credentials to access the Admin UI and associated operations by default. The blockUnknown parameter can still be set to false to disable the need to authenticate. (marcussorealheis) Review comment: ```{ "authentication": { "class":"solr.JWTAuthPlugin", "blockUnknown":"false" } } ```
[GitHub] [lucene-solr] MarcusSorealheis commented on a change in pull request #805: SOLR-13649 change the default behavior of the basic authentication plugin.
MarcusSorealheis commented on a change in pull request #805: SOLR-13649 change the default behavior of the basic authentication plugin. URL: https://github.com/apache/lucene-solr/pull/805#discussion_r310342269 ## File path: solr/CHANGES.txt ## @@ -57,6 +57,8 @@ Upgrade Notes * SOLR-13596: Deprecated GroupingSpecification methods are removed. (Munendra S N) +* SOLR-13649: When Basic Authentication is enabled, users will be required to enter credentials to access the Admin UI and associated operations by default. The blockUnknown parameter can still be set to false to disable the need to authenticate. (marcussorealheis) Review comment: > Gave some concrete comments. But there are many many more mentions of `blockUnknown` in the codebase. You should consider each and every one in light of the change. There should also be a unit test that asserts that the default is now true. > > One example of a place that also needs change is https://github.com/apache/lucene-solr/blob/master/solr/core/src/java/org/apache/solr/util/SolrCLI.java#L4413 but there are probably many more. > > Related, I think we also should change the default and docs for `JWTAuthPlugin` to align with the new expectations: > > We could also consider whether this special case security.json should still default to false or alternatively generate an ERROR instead of blocking everything, since it has no users at all: > > ``` > "authentication": {"class":"solr.BasicAuthPlugin"} > ``` My strategy today was to simply add the parameter to the docs for JWT rather than changing its functionality.
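For readers following the default change: once `blockUnknown` defaults to true, keeping the old open-Admin-UI behavior requires setting it explicitly in security.json. A minimal sketch of that fragment, assuming only the parameters already discussed in this thread (the credentials section is omitted here for brevity but is required for BasicAuthPlugin in practice):

```
{
  "authentication": {
    "class": "solr.BasicAuthPlugin",
    "blockUnknown": false
  }
}
```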
[JENKINS] Lucene-Solr-Tests-8.x - Build # 342 - Unstable
Build: https://builds.apache.org/job/Lucene-Solr-Tests-8.x/342/ 3 tests failed. FAILED: org.apache.solr.cloud.CollectionPropsTest.testWatcher Error Message: Test abandoned because suite timeout was reached. Stack Trace: java.lang.Exception: Test abandoned because suite timeout was reached. at __randomizedtesting.SeedInfo.seed([579F069E9DB0AD06]:0) FAILED: junit.framework.TestSuite.org.apache.solr.cloud.CollectionPropsTest Error Message: Suite timeout exceeded (>= 720 msec). Stack Trace: java.lang.Exception: Suite timeout exceeded (>= 720 msec). at __randomizedtesting.SeedInfo.seed([579F069E9DB0AD06]:0) FAILED: org.apache.solr.cloud.rule.RulesTest.doIntegrationTest Error Message: Should have found shard1 w/2active replicas + shard2 w/1active replica Timeout waiting to see state for collection=rulesColl :DocCollection(rulesColl//collections/rulesColl/state.json/17)={ "pullReplicas":"0", "replicationFactor":"2", "shards":{ "shard1":{ "range":null, "state":"active", "replicas":{ "core_node2":{ "core":"rulesColl_shard1_replica_n1", "base_url":"http://127.0.0.1:41471/solr", "node_name":"127.0.0.1:41471_solr", "state":"active", "type":"NRT", "force_set_state":"false"}, "core_node4":{ "core":"rulesColl_shard1_replica_n3", "base_url":"http://127.0.0.1:39984/solr", "node_name":"127.0.0.1:39984_solr", "state":"active", "type":"NRT", "force_set_state":"false", "leader":"true"}}}, "shard2":{ "range":null, "state":"active", "replicas":{ "core_node7":{ "core":"rulesColl_shard2_replica_n5", "base_url":"http://127.0.0.1:34487/solr", "node_name":"127.0.0.1:34487_solr", "state":"active", "type":"NRT", "force_set_state":"false", "leader":"true"}, "core_node8":{ "core":"rulesColl_shard2_replica_n6", "base_url":"http://127.0.0.1:33318/solr", "node_name":"127.0.0.1:33318_solr", "state":"active", "type":"NRT", "force_set_state":"false"}, "core_node10":{ "core":"rulesColl_shard2_replica_n9", "base_url":"http://127.0.0.1:39587/solr", "node_name":"127.0.0.1:39587_solr", "state":"active",
"type":"NRT", "force_set_state":"false"}}}}, "router":{"name":"implicit"}, "maxShardsPerNode":"1", "autoAddReplicas":"false", "snitch":[{"class":"ImplicitSnitch"}], "nrtReplicas":"2", "tlogReplicas":"0", "rule":[ {"cores":"<4"}, { "node":"*", "replica":"<2"}, {"freedisk":">0"}]} Live Nodes: [127.0.0.1:33318_solr, 127.0.0.1:34487_solr, 127.0.0.1:39587_solr, 127.0.0.1:39984_solr, 127.0.0.1:41471_solr] Last available state: DocCollection(rulesColl//collections/rulesColl/state.json/17)={ "pullReplicas":"0", "replicationFactor":"2", "shards":{ "shard1":{ "range":null, "state":"active", "replicas":{ "core_node2":{ "core":"rulesColl_shard1_replica_n1", "base_url":"http://127.0.0.1:41471/solr", "node_name":"127.0.0.1:41471_solr", "state":"active", "type":"NRT", "force_set_state":"false"}, "core_node4":{ "core":"rulesColl_shard1_replica_n3", "base_url":"http://127.0.0.1:39984/solr", "node_name":"127.0.0.1:39984_solr", "state":"active", "type":"NRT", "force_set_state":"false", "leader":"true"}}}, "shard2":{ "range":null, "state":"active", "replicas":{ "core_node7":{ "core":"rulesColl_shard2_replica_n5", "base_url":"http://127.0.0.1:34487/solr", "node_name":"127.0.0.1:34487_solr", "state":"active", "type":"NRT", "force_set_state":"false", "leader":"true"}, "core_node8":{ "core":"rulesColl_shard2_replica_n6", "base_url":"http://127.0.0.1:33318/solr", "node_name":"127.0.0.1:33318_solr", "state":"active", "type":"NRT", "force_set_state":"false"}, "core_node10":{ "core":"rulesColl_shard2_replica_n9", "base_url":"http://127.0.0.1:39587/solr", "node_name":"127.0.0.1:39587_solr", "state":"active", "type":"NRT", "force_set_state":"false"}}}}, "router":{"name":"implicit"}, "maxShardsPerNode":"1", "autoAddReplicas":"false", "snitch":[{"class":"ImplicitSnitch"}], "nrtReplicas":"2", "tlogReplicas":"0", "rule":[ {"cores":"<4"}, { "node":"*", "replica":"<2"}, {"freedisk":">0"}]} Stack Trace: java.lang.AssertionError: Should have found shard1 w/2active replicas + shard2 w/1active replica Timeout