[jira] [Commented] (LUCENE-5098) Broadword bit selection
[ https://issues.apache.org/jira/browse/LUCENE-5098?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13706697#comment-13706697 ] Dawid Weiss commented on LUCENE-5098: -

bq. I don't really like using LuceneTestCase here because the tests here pass when extending TestCase and they do not use anything that LuceneTestCase provides

If you prefer not to have that discussion here I will shut up. I just wanted to clarify that LTC does checks on your test that you bypass when you extend from TestCase. Randomization of method order (catches order dependencies that shouldn't be there), ensuring your tests run with assertions enabled, checking that your tests don't spawn extra threads -- all these (and more) may not matter to you because you're a seasoned engineer, but they do matter in general when contributions come from various sources. Randomization isn't the only goal of LTC. In short: it's not a remark to you personally, it's a remark to everyone in general that extending LTC should be a priority because it will catch faulty tests faster.

Broadword bit selection
---
Key: LUCENE-5098
URL: https://issues.apache.org/jira/browse/LUCENE-5098
Project: Lucene - Core
Issue Type: Improvement
Components: core/other
Reporter: Paul Elschot
Assignee: Adrien Grand
Priority: Minor
Attachments: LUCENE-5098.patch
--
This message is automatically generated by JIRA. If you think it was sent incorrectly, please contact your JIRA administrators. For more information on JIRA, see: http://www.atlassian.com/software/jira
-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org
[jira] [Comment Edited] (LUCENE-5098) Broadword bit selection
[ https://issues.apache.org/jira/browse/LUCENE-5098?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13706697#comment-13706697 ] Dawid Weiss edited comment on LUCENE-5098 at 7/12/13 6:35 AM: --

bq. I don't really like using LuceneTestCase here because the tests here pass when extending TestCase and they do not use anything that LuceneTestCase provides

If you prefer not to have that discussion here I will shut up. I just wanted to clarify that LTC does checks on your test that you bypass when you extend from TestCase. Randomization of method order (catches order dependencies that shouldn't be there), ensuring your tests run with assertions enabled, checking that your tests don't spawn extra threads -- all these (and more) may not matter to you because you're a seasoned engineer, but they do matter in general when contributions come from various sources (or are refactored later by people other than the original author). Randomization isn't the only goal of LTC. In short: it's not a remark to you personally, it's a remark to everyone in general that extending LTC should be a priority because it will catch faulty tests faster.
[jira] [Updated] (LUCENE-3069) Lucene should have an entirely memory resident term dictionary
[ https://issues.apache.org/jira/browse/LUCENE-3069?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Han Jiang updated LUCENE-3069: --
Attachment: example.png
LUCENE-3069.patch

Uploaded patch, it is the main part of the changes I committed to branch3069. The picture shows the current impl of outputs (fetched from one field in wikimedium5k):
* long[] (sortable metadata)
* byte[] (unsortable, generic metadata)
* df, ttf (term stats)

A single byte flag is used to indicate whether/which fields the current outputs maintain; for PBF with a short byte[], this should be enough. Also, for long-tail terms, the totalTermFreq can safely be inlined into docFreq (for the body field in wikimedium1m, 85.8% of terms have df == ttf). Since TermsEnum is totally based on FSTEnum, the performance of the term dict should be similar to MemoryPF. However, for PK tasks we have to pull docsEnum from MMap, so this hurts. Following is the performance comparison:

{noformat}
pure TempFST vs. Lucene41 + Memory(on idField), on wikimediumall
            Task    QPS base  StdDev    QPS comp  StdDev    Pct diff
         Respell       48.13  (4.4%)      15.38  (1.0%)  -68.0% ( -70% - -65%)
          Fuzzy2       51.30  (5.3%)      17.47  (1.3%)  -65.9% ( -68% - -62%)
          Fuzzy1       52.24  (4.0%)      18.50  (1.2%)  -64.6% ( -67% - -61%)
        Wildcard        9.31  (1.7%)       6.16  (2.2%)  -33.8% ( -37% - -30%)
         Prefix3       23.25  (1.8%)      19.00  (2.2%)  -18.3% ( -21% - -14%)
        PKLookup      244.92  (3.6%)     225.42  (2.3%)   -8.0% ( -13% -  -2%)
         LowTerm      295.88  (5.5%)     293.27  (4.8%)   -0.9% ( -10% -   9%)
      HighPhrase       13.62  (6.5%)      13.54  (7.4%)   -0.6% ( -13% -  14%)
         MedTerm       99.51  (7.8%)      99.19  (7.7%)   -0.3% ( -14% -  16%)
       MedPhrase      154.63  (9.4%)     154.38 (10.1%)   -0.2% ( -17% -  21%)
        HighTerm       28.25 (10.7%)      28.25 (10.0%)   -0.0% ( -18% -  23%)
      OrHighHigh       16.83 (13.3%)      16.86 (13.1%)    0.2% ( -23% -  30%)
HighSloppyPhrase        9.02  (4.4%)       9.03  (4.5%)    0.2% (  -8% -   9%)
       LowPhrase        6.26  (3.4%)       6.27  (4.1%)    0.2% (  -7% -   8%)
       OrHighMed       13.73 (13.2%)      13.77 (12.8%)    0.3% ( -22% -  30%)
       OrHighLow       25.65 (13.2%)      25.73 (13.0%)    0.3% ( -22% -  30%)
 MedSloppyPhrase        6.63  (2.7%)       6.66  (2.7%)    0.5% (  -4% -   6%)
      AndHighMed       42.77  (1.8%)      43.13  (1.5%)    0.8% (  -2% -   4%)
 LowSloppyPhrase       32.68  (3.0%)      32.96  (2.8%)    0.8% (  -4% -   6%)
     AndHighHigh       22.90  (1.2%)      23.18  (0.7%)    1.2% (   0% -   3%)
     LowSpanNear       29.30  (2.0%)      29.83  (2.2%)    1.8% (  -2% -   6%)
     MedSpanNear        8.39  (2.7%)       8.56  (2.9%)    2.0% (  -3% -   7%)
          IntNRQ        3.12  (1.9%)       3.18  (6.7%)    2.1% (  -6% -  10%)
      AndHighLow      507.01  (2.4%)     522.10  (2.8%)    3.0% (  -2% -   8%)
    HighSpanNear        5.43  (1.8%)       5.60  (2.6%)    3.1% (  -1% -   7%)
{noformat}

{noformat}
pure TempFST vs. pure Lucene41, on wikimediumall
            Task    QPS base  StdDev    QPS comp  StdDev    Pct diff
         Respell       49.24  (2.7%)      15.51  (1.0%)  -68.5% ( -70% - -66%)
          Fuzzy2       52.01  (4.8%)      17.61  (1.4%)  -66.1% ( -68% - -63%)
          Fuzzy1       53.00  (4.0%)      18.62  (1.3%)  -64.9% ( -67% - -62%)
        Wildcard        9.37  (1.3%)       6.15  (2.1%)  -34.4% ( -37% - -31%)
         Prefix3       23.36  (0.8%)      18.96  (2.1%)  -18.8% ( -21% - -16%)
       MedPhrase      155.86  (9.8%)     152.34  (9.7%)   -2.3% ( -19% -  19%)
       LowPhrase        6.33  (3.7%)       6.23  (4.0%)   -1.6% (  -8% -   6%)
      HighPhrase       13.68  (7.2%)      13.49  (6.8%)   -1.4% ( -14% -  13%)
       OrHighMed       13.78 (13.0%)      13.68 (12.7%)   -0.8% ( -23% -  28%)
HighSloppyPhrase        9.14  (5.2%)       9.07  (3.7%)   -0.7% (  -9% -   8%)
      OrHighHigh       16.87 (13.3%)      16.76 (12.9%)   -0.6% ( -23% -  29%)
       OrHighLow       25.71 (13.1%)      25.58 (12.8%)   -0.5% ( -23% -  29%)
{noformat}
[jira] [Created] (LUCENE-5105) IndexOptions.DOCS_AND_FREQS_AND_POSITIONS_AND_OFFSETS has no effect
milesli created LUCENE-5105: ---
Summary: IndexOptions.DOCS_AND_FREQS_AND_POSITIONS_AND_OFFSETS has no effect
Key: LUCENE-5105
URL: https://issues.apache.org/jira/browse/LUCENE-5105
Project: Lucene - Core
Issue Type: Bug
Environment: Lucene 4.2
Reporter: milesli

In Lucene 4.2, setting indexOptions to DOCS_AND_FREQS_AND_POSITIONS_AND_OFFSETS has no effect: positions and offsets are not stored with the term vector. I have to set StoreTermVectorOffsets to true and StoreTermVectorPositions to true for that to take effect.
[jira] [Commented] (LUCENE-5100) BaseDocIdSetTestCase
[ https://issues.apache.org/jira/browse/LUCENE-5100?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13706736#comment-13706736 ] ASF subversion and git services commented on LUCENE-5100: - Commit 1502448 from [~jpountz] [ https://svn.apache.org/r1502448 ] LUCENE-5100: BaseDocIdSetTestCase.

BaseDocIdSetTestCase
Key: LUCENE-5100
URL: https://issues.apache.org/jira/browse/LUCENE-5100
Project: Lucene - Core
Issue Type: Improvement
Reporter: Adrien Grand
Assignee: Adrien Grand
Priority: Trivial
Attachments: LUCENE-5100.patch, LUCENE-5100.patch

As Robert said on LUCENE-5081, we would benefit from having common testing infrastructure for our DocIdSet implementations.
[jira] [Commented] (LUCENE-5100) BaseDocIdSetTestCase
[ https://issues.apache.org/jira/browse/LUCENE-5100?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13706739#comment-13706739 ] ASF subversion and git services commented on LUCENE-5100: - Commit 1502450 from [~jpountz] [ https://svn.apache.org/r1502450 ] LUCENE-5100: BaseDocIdSetTestCase (merged from r1502448).
[jira] [Closed] (LUCENE-5105) IndexOptions.DOCS_AND_FREQS_AND_POSITIONS_AND_OFFSETS has no effect
[ https://issues.apache.org/jira/browse/LUCENE-5105?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Adrien Grand closed LUCENE-5105.
Resolution: Invalid

IndexOptions only apply to the inverted index. For term vectors, please use the FieldType.setStoreTermVectors* methods.
[jira] [Resolved] (LUCENE-5100) BaseDocIdSetTestCase
[ https://issues.apache.org/jira/browse/LUCENE-5100?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Adrien Grand resolved LUCENE-5100. --
Resolution: Fixed
Fix Version/s: 4.5
[jira] [Commented] (LUCENE-5098) Broadword bit selection
[ https://issues.apache.org/jira/browse/LUCENE-5098?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13706750#comment-13706750 ] Dawid Weiss commented on LUCENE-5098: -

bq. I played a bit with it and rank9 was always between 15% and 20% slower than bitCount no matter what the input was (which is still impressive since bitCount is supposed to be an intrinsic)

I also toyed with this a bit. Interesting, because rank9 is essentially the Hacker's Delight implementation, but with a multiplication at the end rather than shifts/additions (which I think originated from the fact that multiplication used to be much slower than additions/shifts back in the day). I ran a few caliper benchmarks on a million longs, different distributions (Intel i7, JDK 1.7) just to see what the output would be.

{code}
benchmark     distribution      us  linear runtime
HdPopCnd             ZEROS    2333  =
HdPopCnd              FULL    2334  =
HdPopCnd            RANDOM    2329  =
HdPopCnd            ONEBIT    2334  =
Rank9                ZEROS    1651  =
Rank9                 FULL    1652  =
Rank9               RANDOM    1651  =
Rank9               ONEBIT    1651  =
LongBitCount         ZEROS     411  =
LongBitCount          FULL     394  =
LongBitCount        RANDOM     391  =
LongBitCount        ONEBIT     404  =
NaivePopCnt          ZEROS     585  =
NaivePopCnt           FULL   39019  ==
NaivePopCnt         RANDOM  171365  ==
NaivePopCnt         ONEBIT   28155
{code}

The naive loop was:

{code}
int cnt = 0;
while (x != 0) {
  if (((x >>>= 1) & 1) != 0L) {
    cnt++;
  }
}
return cnt;
{code}

and you can see that even for all zeros (when in fact there is no counting at all) it's still slower than the intrinsified popcnt. Note full-ones is not the worst case (I believe due to constant branch misprediction on a random input).
A zoomed-in benchmark without the naive impl.:

{code}
benchmark     distribution    us  linear runtime
HdPopCnd             ZEROS  2331  =
HdPopCnd              FULL  2329  =
HdPopCnd            RANDOM  2333  =
HdPopCnd            ONEBIT  2333  =
Rank9                ZEROS  1650  =
Rank9                 FULL  1650  =
Rank9               RANDOM  1652  =
Rank9               ONEBIT  1650  =
LongBitCount         ZEROS   400  =
LongBitCount          FULL   402  =
LongBitCount        RANDOM   401  =
LongBitCount        ONEBIT   391  =
{code}

You can see what happens when popcnt isn't intrinsified by running with IBM's J9, for example:

{code}
benchmark     distribution    ms  linear runtime
LongBitCount         ZEROS  2.25  =
LongBitCount          FULL  2.22  =
LongBitCount        RANDOM  2.25  =
LongBitCount        ONEBIT  2.25  =
HdPopCnd             ZEROS  2.25  =
HdPopCnd              FULL  2.25  =
HdPopCnd            RANDOM  2.22  =
HdPopCnd            ONEBIT  2.22  =
Rank9                ZEROS  1.62  =
Rank9                 FULL  1.62  =
Rank9               RANDOM  1.62  =
Rank9               ONEBIT  1.62  =
{code}

But I think they'll eventually catch up with modern CPUs too, so I'd stick with Long.bitCount.
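For readers following along, here is a minimal, self-contained sketch of the Hacker's Delight broadword popcount the thread is benchmarking, checked against Long.bitCount. The class and method names are mine, not from the issue's patch; the final multiplication is the rank9-style finish Dawid mentions.

```java
public class PopCount {
    // Broadword population count (Hacker's Delight, fig. 5-2): sum bits in
    // pairs, then nibbles, then bytes; one multiplication gathers the byte
    // sums into the top byte instead of a chain of shifts and adds.
    public static int hdPopCnt(long x) {
        x = x - ((x >>> 1) & 0x5555555555555555L);                          // 2-bit sums
        x = (x & 0x3333333333333333L) + ((x >>> 2) & 0x3333333333333333L);  // 4-bit sums
        x = (x + (x >>> 4)) & 0x0F0F0F0F0F0F0F0FL;                          // 8-bit sums
        return (int) ((x * 0x0101010101010101L) >>> 56);                    // total in top byte
    }

    public static void main(String[] args) {
        long[] samples = { 0L, -1L, 1L << 63, 0x00FF00FF00FF00FFL };
        for (long s : samples) {
            if (hdPopCnt(s) != Long.bitCount(s)) {
                throw new AssertionError("mismatch for 0x" + Long.toHexString(s));
            }
        }
        System.out.println("ok"); // prints "ok" when all samples agree
    }
}
```

On a JVM that intrinsifies Long.bitCount this Java version loses, which is the thread's point: prefer Long.bitCount and let the JIT emit the popcnt instruction where available.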
[jira] [Commented] (SOLR-4997) The splitshard api doesn't call commit on new sub shards
[ https://issues.apache.org/jira/browse/SOLR-4997?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13706751#comment-13706751 ] ASF subversion and git services commented on SOLR-4997: --- Commit 1502458 from sha...@apache.org [ https://svn.apache.org/r1502458 ] SOLR-4997: The splitshard api doesn't call commit on new sub shards before switching shard states. Multiple bugs related to sub shard recovery and replication are also fixed.

The splitshard api doesn't call commit on new sub shards
Key: SOLR-4997
URL: https://issues.apache.org/jira/browse/SOLR-4997
Project: Solr
Issue Type: Bug
Components: SolrCloud
Affects Versions: 4.3, 4.3.1
Reporter: Shalin Shekhar Mangar
Assignee: Shalin Shekhar Mangar
Fix For: 4.4
Attachments: SOLR-4997.patch, SOLR-4997.patch

The splitshard api doesn't call commit on new sub shards but it happily sets them to active state, which means on a successful split the documents are not visible to searchers unless an explicit commit is called on the cluster. The coreadmin split api will still not call commit on targetCores. That is by design and we're not going to change that.
[jira] [Commented] (SOLR-4997) The splitshard api doesn't call commit on new sub shards
[ https://issues.apache.org/jira/browse/SOLR-4997?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13706755#comment-13706755 ] ASF subversion and git services commented on SOLR-4997: --- Commit 1502460 from sha...@apache.org [ https://svn.apache.org/r1502460 ] SOLR-4997: The splitshard api doesn't call commit on new sub shards before switching shard states. Multiple bugs related to sub shard recovery and replication are also fixed.
[jira] [Commented] (SOLR-4914) Factor out core discovery and persistence logic
[ https://issues.apache.org/jira/browse/SOLR-4914?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13706788#comment-13706788 ] ASF subversion and git services commented on SOLR-4914: --- Commit 1502468 from [~romseygeek] [ https://svn.apache.org/r1502468 ] SOLR-4914: Close OutputStreamWriter properly, use System.getProperty("line.separator") instead of \n. Fixes Windows test failures.

Factor out core discovery and persistence logic
---
Key: SOLR-4914
URL: https://issues.apache.org/jira/browse/SOLR-4914
Project: Solr
Issue Type: Improvement
Affects Versions: 5.0
Reporter: Erick Erickson
Assignee: Alan Woodward
Attachments: SOLR-4914.patch, SOLR-4914.patch, SOLR-4914.patch, SOLR-4914.patch, SOLR-4914.patch, SOLR-4914.patch, SOLR-4914.patch, SOLR-4914.patch

Alan Woodward has done some work to refactor how core persistence works that we should work on going forward, which I want to separate from a shorter-term tactical problem (see SOLR-4910). I'm attaching Alan's patch to this JIRA and we'll carry it forward separately from 4910.
[jira] [Commented] (SOLR-4914) Factor out core discovery and persistence logic
[ https://issues.apache.org/jira/browse/SOLR-4914?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13706789#comment-13706789 ] ASF subversion and git services commented on SOLR-4914: --- Commit 1502469 from [~romseygeek] [ https://svn.apache.org/r1502469 ] SOLR-4914: Close OutputStreamWriter, use platform-independent newlines in tests
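The "close OutputStreamWriter properly" pattern these commits describe can be sketched in a small stand-alone example: keep a handle on the underlying stream so it can be closed in a finally block even if constructing the writer or writing fails. The class and method names here are illustrative, not the actual Solr code (which also routes the close through its own IOUtils helper).

```java
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.io.OutputStreamWriter;
import java.io.Writer;
import java.nio.charset.StandardCharsets;
import java.util.Date;
import java.util.Properties;

public class SafeStore {
    // Hold the raw OutputStream so the file handle is released in finally
    // even if the Writer layer or store() throws partway through.
    static void writeProps(Properties p, File f) throws IOException {
        OutputStream os = null;
        try {
            os = new FileOutputStream(f);
            Writer writer = new OutputStreamWriter(os, StandardCharsets.UTF_8);
            p.store(writer, "Written on " + new Date());
            writer.flush(); // push buffered characters down to the stream
        } finally {
            if (os != null) {
                os.close(); // closes the underlying file handle
            }
        }
    }

    public static void main(String[] args) throws IOException {
        File f = File.createTempFile("core", ".properties");
        Properties p = new Properties();
        p.setProperty("name", "collection1");
        writeProps(p, f);

        Properties q = new Properties();
        InputStream in = new FileInputStream(f);
        try {
            q.load(in);
        } finally {
            in.close();
        }
        System.out.println(q.getProperty("name")); // collection1
        f.delete();
    }
}
```

On Windows an unclosed stream keeps the file locked, which is exactly why the missing close showed up as Windows-only test failures.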
RE: svn commit: r1502468 - in /lucene/dev/trunk/solr/core/src: java/org/apache/solr/core/CorePropertiesLocator.java test/org/apache/solr/core/TestSolrXmlPersistor.java
Hi, you have to be careful: if you store properties with a Writer but load them with an InputStream, the code path is different. Properties files have a defined charset of ISO-8859-1:

"The load(Reader) / store(Writer, String) methods load and store properties from and to a character based stream in a simple line-oriented format specified below. The load(InputStream) / store(OutputStream, String) methods work the same way as the load(Reader)/store(Writer, String) pair, except the input/output stream is encoded in ISO 8859-1 character encoding. Characters that cannot be directly represented in this encoding can be written using Unicode escapes as defined in section 3.3 of The Java™ Language Specification; only a single 'u' character is allowed in an escape sequence. The native2ascii tool can be used to convert property files to and from other character encodings."

So be sure to be consistent when loading/saving! If we previously (in older Solr versions) used the InputStream methods to load/store core props, we should use ISO-8859-1 to load/store to be compatible with older versions!

Uwe
-
Uwe Schindler
H.-H.-Meier-Allee 63, D-28213 Bremen
http://www.thetaphi.de
eMail: u...@thetaphi.de

-----Original Message-----
From: romseyg...@apache.org [mailto:romseyg...@apache.org]
Sent: Friday, July 12, 2013 10:26 AM
To: comm...@lucene.apache.org
Subject: svn commit: r1502468 - in /lucene/dev/trunk/solr/core/src: java/org/apache/solr/core/CorePropertiesLocator.java test/org/apache/solr/core/TestSolrXmlPersistor.java

Author: romseygeek
Date: Fri Jul 12 08:25:36 2013
New Revision: 1502468
URL: http://svn.apache.org/r1502468

Log:
SOLR-4914: Close OutputStreamWriter properly, use System.getProperty("line.separator") instead of \n. Fixes Windows test failures.
Modified:
 lucene/dev/trunk/solr/core/src/java/org/apache/solr/core/CorePropertiesLocator.java
 lucene/dev/trunk/solr/core/src/test/org/apache/solr/core/TestSolrXmlPersistor.java

Modified: lucene/dev/trunk/solr/core/src/java/org/apache/solr/core/CorePropertiesLocator.java
URL: http://svn.apache.org/viewvc/lucene/dev/trunk/solr/core/src/java/org/apache/solr/core/CorePropertiesLocator.java?rev=1502468&r1=1502467&r2=1502468&view=diff
==============================================================================
--- lucene/dev/trunk/solr/core/src/java/org/apache/solr/core/CorePropertiesLocator.java (original)
+++ lucene/dev/trunk/solr/core/src/java/org/apache/solr/core/CorePropertiesLocator.java Fri Jul 12 08:25:36 2013
@@ -20,6 +20,7 @@ package org.apache.solr.core;
 import com.google.common.base.Charsets;
 import com.google.common.collect.Lists;
 import org.apache.solr.common.SolrException;
+import org.apache.solr.util.IOUtils;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
@@ -27,6 +28,7 @@ import java.io.File;
 import java.io.FileInputStream;
 import java.io.FileOutputStream;
 import java.io.IOException;
+import java.io.OutputStream;
 import java.io.OutputStreamWriter;
 import java.io.Writer;
 import java.util.Date;
@@ -56,14 +58,7 @@ public class CorePropertiesLocator imple
       throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Could not create a new core in " + cd.getInstanceDir() + " as another core is already defined there");
-    try {
-      Properties p = buildCoreProperties(cd);
-      Writer writer = new OutputStreamWriter(new FileOutputStream(propFile), Charsets.UTF_8);
-      p.store(writer, "Written by CorePropertiesLocator on " + new Date());
-    }
-    catch (IOException e) {
-      logger.error("Couldn't persist core properties to {}: {}", propFile.getAbsolutePath(), e);
-    }
+    writePropertiesFile(cd, propFile);
   }
 }
@@ -75,14 +70,25 @@ public class CorePropertiesLocator imple
 public void persist(CoreContainer cc, CoreDescriptor... coreDescriptors) {
   for (CoreDescriptor cd : coreDescriptors) {
     File propFile = new File(new File(cd.getInstanceDir()), PROPERTIES_FILENAME);
-    try {
-      Properties p = buildCoreProperties(cd);
-      Writer writer = new OutputStreamWriter(new FileOutputStream(propFile), Charsets.UTF_8);
-      p.store(writer, "Written by CorePropertiesLocator on " + new Date());
-    }
-    catch (IOException e) {
-      logger.error("Couldn't persist core properties to {}: {}", propFile.getAbsolutePath(), e);
-    }
+    writePropertiesFile(cd, propFile);
+  }
+}
+
+private void writePropertiesFile(CoreDescriptor cd, File propfile) {
+  Properties p = buildCoreProperties(cd);
+  OutputStream os = null;
+  try {
+    os = new FileOutputStream(propfile);
+    Writer writer = new OutputStreamWriter(os, Charsets.UTF_8);
+    p.store(writer, "Written by CorePropertiesLocator on " + new Date());
+    writer.close();
+  }
+  catch (IOException e) {
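Uwe's warning about mixing the Writer- and InputStream-based Properties methods can be demonstrated with a tiny stand-alone sketch (the class name and property values are mine): a non-Latin-1-safe round trip only works when the store/load pair agree on the encoding, because load(InputStream) always decodes ISO-8859-1.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.OutputStreamWriter;
import java.io.Writer;
import java.nio.charset.StandardCharsets;
import java.util.Properties;

public class PropsCharsetDemo {
    public static void main(String[] args) throws IOException {
        Properties p = new Properties();
        p.setProperty("name", "caf\u00e9"); // non-ASCII value: "café"

        // Consistent pair: store(OutputStream) writes ISO-8859-1 with \uXXXX
        // escapes for anything outside it, and load(InputStream) reverses
        // that -- the format defined by the Properties spec.
        ByteArrayOutputStream safe = new ByteArrayOutputStream();
        p.store(safe, null);
        Properties roundTrip = new Properties();
        roundTrip.load(new ByteArrayInputStream(safe.toByteArray()));
        System.out.println(roundTrip.getProperty("name").equals("caf\u00e9")); // true

        // Inconsistent pair: characters written raw through a UTF-8 Writer,
        // then decoded as ISO-8859-1 by load(InputStream). The two UTF-8
        // bytes of 'é' (0xC3 0xA9) come back as two Latin-1 characters.
        ByteArrayOutputStream mixed = new ByteArrayOutputStream();
        Writer w = new OutputStreamWriter(mixed, StandardCharsets.UTF_8);
        p.store(w, null);
        w.close();
        Properties mangled = new Properties();
        mangled.load(new ByteArrayInputStream(mixed.toByteArray()));
        System.out.println(mangled.getProperty("name")); // "cafÃ©", not "café"
    }
}
```

This is why the thread settles on passing the raw OutputStream to store() rather than wrapping it in a UTF-8 Writer: the stream-based pair is self-consistent and matches what older loaders expect.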
Re: svn commit: r1502468 - in /lucene/dev/trunk/solr/core/src: java/org/apache/solr/core/CorePropertiesLocator.java test/org/apache/solr/core/TestSolrXmlPersistor.java
Does this fix it?

@@ -79,9 +76,7 @@ public class CorePropertiesLocator implements CoresLocator {
     OutputStream os = null;
     try {
       os = new FileOutputStream(propfile);
-      Writer writer = new OutputStreamWriter(os, Charsets.UTF_8);
-      p.store(writer, "Written by CorePropertiesLocator on " + new Date());
-      writer.close();
+      p.store(os, "Written by CorePropertiesLocator on " + new Date());
     }
     catch (IOException e) {

Alan Woodward
www.flax.co.uk

On 12 Jul 2013, at 09:35, Uwe Schindler wrote:
[quoted message trimmed]
RE: svn commit: r1502468 - in /lucene/dev/trunk/solr/core/src: java/org/apache/solr/core/CorePropertiesLocator.java test/org/apache/solr/core/TestSolrXmlPersistor.java
Yes, and be sure to do the opposite when reading properties files! Otherwise it is not consistent and loading/saving and exceptions may happen. I am working on forbidden-apis to ensure we are consistent everywhere. The binary properties file format is the officially defined one according to JVM spec. Uwe - Uwe Schindler H.-H.-Meier-Allee 63, D-28213 Bremen http://www.thetaphi.de eMail: u...@thetaphi.de -Original Message- From: Alan Woodward [mailto:a...@flax.co.uk] Sent: Friday, July 12, 2013 10:47 AM To: dev@lucene.apache.org Subject: Re: svn commit: r1502468 - in /lucene/dev/trunk/solr/core/src: java/org/apache/solr/core/CorePropertiesLocator.java test/org/apache/solr/core/TestSolrXmlPersistor.java Does this fix it? @@ -79,9 +76,7 @@ public class CorePropertiesLocator implements CoresLocator { OutputStream os = null; try { os = new FileOutputStream(propfile); - Writer writer = new OutputStreamWriter(os, Charsets.UTF_8); - p.store(writer, Written by CorePropertiesLocator on + new Date()); - writer.close(); + p.store(os, Written by CorePropertiesLocator on + new Date()); } catch (IOException e) { Alan Woodward www.flax.co.uk On 12 Jul 2013, at 09:35, Uwe Schindler wrote: Hi, you have tob e careful: If you store properties with a writer but load it with InputStream, the code is different. Properties files have a defined charset of ISO-8859-1: The load(Reader) / store(Writer, String) methods load and store properties from and to a character based stream in a simple line-oriented format specified below. The load(InputStream) / store(OutputStream, String) methods work the same way as the load(Reader)/store(Writer, String) pair, except the input/output stream is encoded in ISO 8859-1 character encoding. Characters that cannot be directly represented in this encoding can be written using Unicode escapes as defined in section 3.3 of The Java™ Language Specification; only a single 'u' character is allowed in an escape sequence. 
The native2ascii tool can be used to convert property files to and from other character encodings. So be sure to be consistent when loading/saving! If we previously (in older Solr versions) used the InputStream methods to load/store core props, we should use ISO-8859-1 to load/store to be compatible with older versions! Uwe - Uwe Schindler H.-H.-Meier-Allee 63, D-28213 Bremen http://www.thetaphi.de eMail: u...@thetaphi.de -Original Message- From: romseyg...@apache.org [mailto:romseyg...@apache.org] Sent: Friday, July 12, 2013 10:26 AM To: comm...@lucene.apache.org Subject: svn commit: r1502468 - in /lucene/dev/trunk/solr/core/src: java/org/apache/solr/core/CorePropertiesLocator.java test/org/apache/solr/core/TestSolrXmlPersistor.java Author: romseygeek Date: Fri Jul 12 08:25:36 2013 New Revision: 1502468 URL: http://svn.apache.org/r1502468 Log: SOLR-4914: Close OutputStreamWriter properly, use System.getProperty("line.separator") instead of "\n". Fixes Windows test failures.
Modified: lucene/dev/trunk/solr/core/src/java/org/apache/solr/core/CorePropertiesLocator.java lucene/dev/trunk/solr/core/src/test/org/apache/solr/core/TestSolrXmlPersistor.java
Modified: lucene/dev/trunk/solr/core/src/java/org/apache/solr/core/CorePropertiesLocator.java
URL: http://svn.apache.org/viewvc/lucene/dev/trunk/solr/core/src/java/org/apache/solr/core/CorePropertiesLocator.java?rev=1502468&r1=1502467&r2=1502468&view=diff
==
--- lucene/dev/trunk/solr/core/src/java/org/apache/solr/core/CorePropertiesLocator.java (original)
+++ lucene/dev/trunk/solr/core/src/java/org/apache/solr/core/CorePropertiesLocator.java Fri Jul 12 08:25:36 2013
@@ -20,6 +20,7 @@ package org.apache.solr.core;
 import com.google.common.base.Charsets;
 import com.google.common.collect.Lists;
 import org.apache.solr.common.SolrException;
+import org.apache.solr.util.IOUtils;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
@@ -27,6 +28,7 @@ import java.io.File;
 import java.io.FileInputStream;
 import java.io.FileOutputStream;
 import java.io.IOException;
+import java.io.OutputStream;
 import java.io.OutputStreamWriter;
 import java.io.Writer;
 import java.util.Date;
@@ -56,14 +58,7 @@ public class CorePropertiesLocator imple
       throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "Could not create a new core in " + cd.getInstanceDir() + " as another core is already defined there");
-    try {
-      Properties p = buildCoreProperties(cd);
-      Writer writer = new OutputStreamWriter(new FileOutputStream(propFile), Charsets.UTF_8);
-      p.store(writer, "Written by
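Uwe's warning about mixing the store/load pairs can be reproduced with plain java.util.Properties, independent of any Solr code. A minimal sketch (the class name PropsCharsetDemo is invented for illustration): storing through a UTF-8 Writer and loading back through the InputStream variant garbles a non-ASCII value, while loading through a matching UTF-8 Reader round-trips it.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.InputStreamReader;
import java.io.OutputStreamWriter;
import java.nio.charset.StandardCharsets;
import java.util.Properties;

public class PropsCharsetDemo {
    public static void main(String[] args) throws Exception {
        Properties p = new Properties();
        p.setProperty("name", "M\u00fcller"); // 'ü' is outside ASCII

        // store(Writer): the character is written raw, here as UTF-8 bytes
        ByteArrayOutputStream viaWriter = new ByteArrayOutputStream();
        p.store(new OutputStreamWriter(viaWriter, StandardCharsets.UTF_8), null);

        // load(InputStream): decodes the stream as ISO-8859-1, so the two
        // UTF-8 bytes of 'ü' come back as two mojibake characters
        Properties wrong = new Properties();
        wrong.load(new ByteArrayInputStream(viaWriter.toByteArray()));
        System.out.println("mismatched pair: " + wrong.getProperty("name"));

        // consistent pair: load(Reader) with the same charset round-trips
        Properties right = new Properties();
        right.load(new InputStreamReader(
                new ByteArrayInputStream(viaWriter.toByteArray()), StandardCharsets.UTF_8));
        System.out.println("matched pair:    " + right.getProperty("name"));
    }
}
```

This is exactly the writer-store/stream-load asymmetry the commit above tripped over.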
[jira] [Commented] (SOLR-4914) Factor out core discovery and persistence logic
[ https://issues.apache.org/jira/browse/SOLR-4914?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=13706798#comment-13706798 ] ASF subversion and git services commented on SOLR-4914: --- Commit 1502481 from [~romseygeek] [ https://svn.apache.org/r1502481 ] SOLR-4914: Use Properties.store(OutputStream, String) for back compatibility Factor out core discovery and persistence logic --- Key: SOLR-4914 URL: https://issues.apache.org/jira/browse/SOLR-4914 Project: Solr Issue Type: Improvement Affects Versions: 5.0 Reporter: Erick Erickson Assignee: Alan Woodward Attachments: SOLR-4914.patch, SOLR-4914.patch, SOLR-4914.patch, SOLR-4914.patch, SOLR-4914.patch, SOLR-4914.patch, SOLR-4914.patch, SOLR-4914.patch Alan Woodward has done some work to refactor how core persistence works that we should work on going forward that I want to separate from a shorter-term tactical problem (See SOLR-4910). I'm attaching Alan's patch to this JIRA and we'll carry it forward separately from 4910. -- This message is automatically generated by JIRA. If you think it was sent incorrectly, please contact your JIRA administrators For more information on JIRA, see: http://www.atlassian.com/software/jira - To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org For additional commands, e-mail: dev-h...@lucene.apache.org
[jira] [Commented] (SOLR-4914) Factor out core discovery and persistence logic
[ https://issues.apache.org/jira/browse/SOLR-4914?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13706799#comment-13706799 ] ASF subversion and git services commented on SOLR-4914: --- Commit 1502483 from [~romseygeek] [ https://svn.apache.org/r1502483 ] SOLR-4914: Use Properties.store(OutputStream, String) for back compatibility
[JENKINS] Lucene-Solr-Tests-trunk-Java7 - Build # 4135 - Failure
Build: https://builds.apache.org/job/Lucene-Solr-Tests-trunk-Java7/4135/ 2 tests failed. FAILED: junit.framework.TestSuite.org.apache.solr.cloud.BasicDistributedZkTest Error Message: 1 thread leaked from SUITE scope at org.apache.solr.cloud.BasicDistributedZkTest: 1) Thread[id=728, name=recoveryCmdExecutor-230-thread-1, state=RUNNABLE, group=TGRP-BasicDistributedZkTest] at java.net.PlainSocketImpl.socketConnect(Native Method) at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:339) at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:200) at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:182) at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:391) at java.net.Socket.connect(Socket.java:579) at org.apache.http.conn.scheme.PlainSocketFactory.connectSocket(PlainSocketFactory.java:127) at org.apache.http.impl.conn.DefaultClientConnectionOperator.openConnection(DefaultClientConnectionOperator.java:180) at org.apache.http.impl.conn.ManagedClientConnectionImpl.open(ManagedClientConnectionImpl.java:294) at org.apache.http.impl.client.DefaultRequestDirector.tryConnect(DefaultRequestDirector.java:645) at org.apache.http.impl.client.DefaultRequestDirector.execute(DefaultRequestDirector.java:480) at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:906) at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:805) at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:784) at org.apache.solr.client.solrj.impl.HttpSolrServer.request(HttpSolrServer.java:365) at org.apache.solr.client.solrj.impl.HttpSolrServer.request(HttpSolrServer.java:180) at org.apache.solr.cloud.SyncStrategy$1.run(SyncStrategy.java:291) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) at java.lang.Thread.run(Thread.java:722) Stack 
Trace: com.carrotsearch.randomizedtesting.ThreadLeakError: 1 thread leaked from SUITE scope at org.apache.solr.cloud.BasicDistributedZkTest: 1) Thread[id=728, name=recoveryCmdExecutor-230-thread-1, state=RUNNABLE, group=TGRP-BasicDistributedZkTest] at java.net.PlainSocketImpl.socketConnect(Native Method) at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:339) at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:200) at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:182) at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:391) at java.net.Socket.connect(Socket.java:579) at org.apache.http.conn.scheme.PlainSocketFactory.connectSocket(PlainSocketFactory.java:127) at org.apache.http.impl.conn.DefaultClientConnectionOperator.openConnection(DefaultClientConnectionOperator.java:180) at org.apache.http.impl.conn.ManagedClientConnectionImpl.open(ManagedClientConnectionImpl.java:294) at org.apache.http.impl.client.DefaultRequestDirector.tryConnect(DefaultRequestDirector.java:645) at org.apache.http.impl.client.DefaultRequestDirector.execute(DefaultRequestDirector.java:480) at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:906) at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:805) at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:784) at org.apache.solr.client.solrj.impl.HttpSolrServer.request(HttpSolrServer.java:365) at org.apache.solr.client.solrj.impl.HttpSolrServer.request(HttpSolrServer.java:180) at org.apache.solr.cloud.SyncStrategy$1.run(SyncStrategy.java:291) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) at java.lang.Thread.run(Thread.java:722) at __randomizedtesting.SeedInfo.seed([84E0BC333E6F3E96]:0) FAILED: 
junit.framework.TestSuite.org.apache.solr.cloud.BasicDistributedZkTest Error Message: There are still zombie threads that couldn't be terminated:1) Thread[id=728, name=recoveryCmdExecutor-230-thread-1, state=RUNNABLE, group=TGRP-BasicDistributedZkTest] at java.net.PlainSocketImpl.socketConnect(Native Method) at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:339) at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:200) at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:182) at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:391)
[jira] [Updated] (SOLR-5035) decouple grouping functionality from the query component
[ https://issues.apache.org/jira/browse/SOLR-5035?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Isaac Hebsh updated SOLR-5035: -- Description: Currently, all of the grouping work is coupled into the QueryComponent. It seems that we can split the component into two different components, without writing even one extra line of code. The best example is in the 'prepare' method of the QueryComponent:
{code:java}
boolean grouping = params.getBool(GroupParams.GROUP, false);
if (!grouping) {
  return;
}
{code}
(Obviously, I forgot to mention that after that code, there are dozens of lines which are only relevant to grouping.) This is clearly unnecessary coupling... was: Currently, all of the grouping work is coupled into the QueryComponent. It seems that we can split the component into two different components, without writing even one extra line of code. The best example is in the 'prepare' method of the QueryComponent:
{code:java}
boolean grouping = params.getBool(GroupParams.GROUP, false);
if (!grouping) {
  return;
}
{code}
This is clearly unnecessary coupling... decouple grouping functionality from the query component Key: SOLR-5035 URL: https://issues.apache.org/jira/browse/SOLR-5035 Project: Solr Issue Type: Improvement Components: search Affects Versions: 4.3.1 Reporter: Isaac Hebsh Currently, all of the grouping work is coupled into the QueryComponent. It seems that we can split the component into two different components, without writing even one extra line of code. The best example is in the 'prepare' method of the QueryComponent:
{code:java}
boolean grouping = params.getBool(GroupParams.GROUP, false);
if (!grouping) {
  return;
}
{code}
(Obviously, I forgot to mention that after that code, there are dozens of lines which are only relevant to grouping.) This is clearly unnecessary coupling... -- This message is automatically generated by JIRA.
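The guard clause Isaac quotes is a standard early-exit pattern, and "decoupling" here means moving everything behind the guard into a component of its own. The sketch below is not Solr's actual SearchComponent API — the Component interface, the string-keyed params map, and the class names are invented purely to illustrate the split he proposes.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

// Hypothetical stand-in for a search-component interface (illustration only).
interface Component {
    void prepare(Map<String, String> params, List<String> log);
}

// After the split, the query logic no longer needs to know about grouping.
class QueryComponent implements Component {
    public void prepare(Map<String, String> params, List<String> log) {
        log.add("query:prepare"); // always runs
    }
}

// The grouping-only code moves here, behind the same early-exit guard.
class GroupingComponent implements Component {
    public void prepare(Map<String, String> params, List<String> log) {
        boolean grouping = Boolean.parseBoolean(params.getOrDefault("group", "false"));
        if (!grouping) {
            return; // the guard from QueryComponent.prepare, now isolated
        }
        log.add("grouping:prepare"); // the dozens of grouping-only lines live here
    }
}

public class ComponentSplitDemo {
    public static void main(String[] args) {
        List<Component> chain = List.of(new QueryComponent(), new GroupingComponent());
        List<String> log = new ArrayList<>();
        for (Component c : chain) {
            c.prepare(Map.of("group", "true"), log);
        }
        System.out.println(log); // both components ran in order
    }
}
```

With `group=false` only `query:prepare` would be logged, which is the "no extra code" point: the guard already partitions the two responsibilities.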
[JENKINS] Lucene-Solr-Tests-4.x-Java6 - Build # 1786 - Failure
Build: https://builds.apache.org/job/Lucene-Solr-Tests-4.x-Java6/1786/ 2 tests failed. FAILED: junit.framework.TestSuite.org.apache.solr.cloud.BasicDistributedZkTest Error Message: 1 thread leaked from SUITE scope at org.apache.solr.cloud.BasicDistributedZkTest: 1) Thread[id=3014, name=recoveryCmdExecutor-1318-thread-1, state=RUNNABLE, group=TGRP-BasicDistributedZkTest] at java.net.PlainSocketImpl.socketConnect(Native Method) at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:327) at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:193) at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:180) at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:384) at java.net.Socket.connect(Socket.java:546) at org.apache.http.conn.scheme.PlainSocketFactory.connectSocket(PlainSocketFactory.java:127) at org.apache.http.impl.conn.DefaultClientConnectionOperator.openConnection(DefaultClientConnectionOperator.java:180) at org.apache.http.impl.conn.ManagedClientConnectionImpl.open(ManagedClientConnectionImpl.java:294) at org.apache.http.impl.client.DefaultRequestDirector.tryConnect(DefaultRequestDirector.java:645) at org.apache.http.impl.client.DefaultRequestDirector.execute(DefaultRequestDirector.java:480) at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:906) at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:805) at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:784) at org.apache.solr.client.solrj.impl.HttpSolrServer.request(HttpSolrServer.java:365) at org.apache.solr.client.solrj.impl.HttpSolrServer.request(HttpSolrServer.java:180) at org.apache.solr.cloud.SyncStrategy$1.run(SyncStrategy.java:291) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1146) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) at java.lang.Thread.run(Thread.java:679) Stack 
Trace: com.carrotsearch.randomizedtesting.ThreadLeakError: 1 thread leaked from SUITE scope at org.apache.solr.cloud.BasicDistributedZkTest: 1) Thread[id=3014, name=recoveryCmdExecutor-1318-thread-1, state=RUNNABLE, group=TGRP-BasicDistributedZkTest] at java.net.PlainSocketImpl.socketConnect(Native Method) at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:327) at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:193) at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:180) at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:384) at java.net.Socket.connect(Socket.java:546) at org.apache.http.conn.scheme.PlainSocketFactory.connectSocket(PlainSocketFactory.java:127) at org.apache.http.impl.conn.DefaultClientConnectionOperator.openConnection(DefaultClientConnectionOperator.java:180) at org.apache.http.impl.conn.ManagedClientConnectionImpl.open(ManagedClientConnectionImpl.java:294) at org.apache.http.impl.client.DefaultRequestDirector.tryConnect(DefaultRequestDirector.java:645) at org.apache.http.impl.client.DefaultRequestDirector.execute(DefaultRequestDirector.java:480) at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:906) at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:805) at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:784) at org.apache.solr.client.solrj.impl.HttpSolrServer.request(HttpSolrServer.java:365) at org.apache.solr.client.solrj.impl.HttpSolrServer.request(HttpSolrServer.java:180) at org.apache.solr.cloud.SyncStrategy$1.run(SyncStrategy.java:291) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1146) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) at java.lang.Thread.run(Thread.java:679) at __randomizedtesting.SeedInfo.seed([EDEF0D5E5BF45D8D]:0) FAILED: 
junit.framework.TestSuite.org.apache.solr.cloud.BasicDistributedZkTest Error Message: There are still zombie threads that couldn't be terminated:1) Thread[id=3014, name=recoveryCmdExecutor-1318-thread-1, state=RUNNABLE, group=TGRP-BasicDistributedZkTest] at java.net.PlainSocketImpl.socketConnect(Native Method) at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:327) at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:193) at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:180) at
[jira] [Commented] (SOLR-4914) Factor out core discovery and persistence logic
[ https://issues.apache.org/jira/browse/SOLR-4914?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13706845#comment-13706845 ] ASF subversion and git services commented on SOLR-4914: --- Commit 1502507 from [~romseygeek] [ https://svn.apache.org/r1502507 ] SOLR-4914: Close input streams as well
[jira] [Commented] (SOLR-4914) Factor out core discovery and persistence logic
[ https://issues.apache.org/jira/browse/SOLR-4914?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13706847#comment-13706847 ] ASF subversion and git services commented on SOLR-4914: --- Commit 1502508 from [~romseygeek] [ https://svn.apache.org/r1502508 ] SOLR-4914: Close input streams as well
Re: svn commit: r1502468 - in /lucene/dev/trunk/solr/core/src: java/org/apache/solr/core/CorePropertiesLocator.java test/org/apache/solr/core/TestSolrXmlPersistor.java
Where is this 'officially defined one' according to JVM spec? Please give a reference, as the Reader API is just as well defined as the InputStream API. If we want consistency, I want the Reader one! Just so you know, I totally disagree with this. I refuse to use native2ascii. I think these commits banning the Reader API should be reverted. On Fri, Jul 12, 2013 at 4:53 AM, Uwe Schindler u...@thetaphi.de wrote: Yes, and be sure to do the opposite when reading properties files! [...]
RE: svn commit: r1502468 - in /lucene/dev/trunk/solr/core/src: java/org/apache/solr/core/CorePropertiesLocator.java test/org/apache/solr/core/TestSolrXmlPersistor.java
Hi, There are several reasons why I added this: The biggest issue is consistency: we should decide to use either one or the other, but not mixed. The recent commits on Solr were wrong in that case, because they wrote the files with a defined charset through a Writer but read them back using the InputStream, and similar problems. The commit was to prevent this. I agree, both properties file formats are defined in the JDK 6 docs, but the "original" one defined by Sun a while back was specified as "ISO-8859-1 and Unicode escapes". And almost all the properties files out there use this encoding, including all of the ones included in the JDK (see your JDK folder, all properties files there are in this format; one example: $JAVA_HOME/jre/lib/deploy/messages_ja.properties for a crazy one). I agree, for newer developments one should use a newer format, but the problem we have in Solr is that we are no longer able to read old properties files – which were always written in the original Sun specification format (see http://docs.oracle.com/javase/1.4.2/docs/api/java/util/Properties.html for the format). The commit was just there to make it consistent and enforce consistency – that's all. We can discuss that, maybe only enforce this for Solr. For the core load/save, the properties files are written by machines only, so nobody edits them by hand. Lucene does not use properties files, that was the whole thing. Robert, please, before complaining again – don't think about Unicode only, think about standards defined a long time ago (and the properties file format is one of those). This is just for consistency. The reasoning behind the whole thing is similar to my complaints about XML: XML also needs to be read through an InputStream, because it is binary and charsetless (the charset is part of the file format; application/xml and not text/xml is the MIME type). Properties files are somehow also binary :) and were defined in the past to use Unicode escapes and the ISO-8859-1 charset (http://docs.oracle.com/javase/1.4.2/docs/api/java/util/Properties.html – that's the oldest one I got). In general we should not use properties files at all, so I would personally forbid them completely, but Solr has used them for a long time now. Uwe - Uwe Schindler H.-H.-Meier-Allee 63, D-28213 Bremen http://www.thetaphi.de eMail: u...@thetaphi.de From: Robert Muir [mailto:rcm...@gmail.com] Sent: Friday, July 12, 2013 1:41 PM To: dev@lucene.apache.org Subject: Re: svn commit: r1502468 - in /lucene/dev/trunk/solr/core/src: java/org/apache/solr/core/CorePropertiesLocator.java test/org/apache/solr/core/TestSolrXmlPersistor.java Where is this 'officially defined one' according to JVM spec? [...]
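The "two formats" Uwe distinguishes are visible in the serialized bytes themselves. A minimal sketch with plain java.util.Properties (the class name PropsFormatDemo is invented): store(OutputStream) escapes key/value characters above \u007E, so the old-style file is pure ASCII, while store(Writer) writes the character raw in whatever charset the Writer uses.

```java
import java.io.ByteArrayOutputStream;
import java.io.OutputStreamWriter;
import java.nio.charset.StandardCharsets;
import java.util.Properties;

public class PropsFormatDemo {
    public static void main(String[] args) throws Exception {
        Properties p = new Properties();
        p.setProperty("name", "M\u00fcller"); // 'ü' = U+00FC

        // Old-style format: store(OutputStream) writes ISO-8859-1 with
        // \uXXXX escapes for anything outside printable ASCII in keys/values,
        // so every byte in the output is plain ASCII.
        ByteArrayOutputStream oldStyle = new ByteArrayOutputStream();
        p.store(oldStyle, null);
        boolean allAscii = true;
        for (byte b : oldStyle.toByteArray()) {
            if ((b & 0xFF) > 0x7F) allAscii = false;
        }
        System.out.println("old-style is pure ASCII: " + allAscii);

        // New-style format: store(Writer) emits the character itself; with a
        // UTF-8 writer 'ü' becomes the two non-ASCII bytes 0xC3 0xBC.
        ByteArrayOutputStream newStyle = new ByteArrayOutputStream();
        p.store(new OutputStreamWriter(newStyle, StandardCharsets.UTF_8), null);
        boolean hasNonAscii = false;
        for (byte b : newStyle.toByteArray()) {
            if ((b & 0xFF) > 0x7F) hasNonAscii = true;
        }
        System.out.println("new-style has raw bytes: " + hasNonAscii);
    }
}
```

This is why a file written the new way cannot be read safely by the old InputStream path, and vice versa, unless one side is fixed by convention.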
RE: svn commit: r1502468 - in /lucene/dev/trunk/solr/core/src: java/org/apache/solr/core/CorePropertiesLocator.java test/org/apache/solr/core/TestSolrXmlPersistor.java
I checked all possibilities we have, also by reading source code of JDK: We can have „one migration strategy”: - We forbid writing with OutputStream, so we won’t produce new ASCII-only files and we write new files as UTF-8. Older Solr versions no longer can read those files, but this is not a problem. - We forbid reading with InputStream, because that one can no longer read files written as UTF-8 without escapes. - We allow only reading by Reader and the Reader must be UTF-8 - this allows to still load old properties files loaded by older Solr versions (because when they were written in the old format, the reader code allows also Unicode escapes with \u, so “old-style” files are still parseable. So this would allow us to enforce the Reader/Writer API, if the charset to be used is UTF-8 Uwe - Uwe Schindler H.-H.-Meier-Allee 63, D-28213 Bremen http://www.thetaphi.de/ http://www.thetaphi.de eMail: u...@thetaphi.de From: Uwe Schindler [mailto:u...@thetaphi.de] Sent: Friday, July 12, 2013 2:09 PM To: dev@lucene.apache.org Subject: RE: svn commit: r1502468 - in /lucene/dev/trunk/solr/core/src: java/org/apache/solr/core/CorePropertiesLocator.java test/org/apache/solr/core/TestSolrXmlPersistor.java Hi, There are several reasons why I added this: The biggest issue is consistence: we should decide to use either one or the other, but not mixed. The recent commits on Solr were wrong in that case, because the wrote the files using a defined reader but read it using the InputStream and similar problems. The commit was to prevent this. I agree, both properties files formats are defined in JDK6 docs, but the “original” one defined by Sun once a while back was specified to be: “ISO-8859-1 and Unicode escapes”. And almost all the properties files out there are using this encoding, including all of the ones included in the JDK (see your JDK folder, all properties files there are in this format, one example: $JAVA_HOME/jre/lib/deploy/messages_ja.properties for a crazy one). 
I agree, for newer developments one should use a newer format, but the problem we have in Solr is that we are no longer able to read old properties files – which were always written in the Sun original specification format (see http://docs.oracle.com/javase/1.4.2/docs/api/java/util/Properties.html for the format). The commit was just there to make it consistent and enforce consistence – that’s all. We can discuss about that, maybe only enforce this for Solr. For the core load/save the properties files are written by machines only, so nobody edits them by hand. Lucene does not use properties files, that was the hole thing. Robert, please, before complaining again – don’t think about fucking Unicode only, think about standards defined long time ago (and the properties file format is one of those). This is just for consistency. The reasoning behind the whole thing is similar to my complaints about XML: XML also needs to be read through an InputStream because they are binary and charsetless (charset is part of the fileformat; application/xml and not text/xml is the MIME type). Properties files are somehow also binary J and were defined in the past to use Unicode Escapes and ISO-8859-1 charset (http://docs.oracle.com/javase/1.4.2/docs/api/java/util/Properties.html - that’s the oldest one I got). In general we should not use properties files at all, so I would personally forbid them completely, but Solr used them for longer time now. Uwe - Uwe Schindler H.-H.-Meier-Allee 63, D-28213 Bremen http://www.thetaphi.de/ http://www.thetaphi.de eMail: u...@thetaphi.de From: Robert Muir [mailto:rcm...@gmail.com] Sent: Friday, July 12, 2013 1:41 PM To: dev@lucene.apache.org Subject: Re: svn commit: r1502468 - in /lucene/dev/trunk/solr/core/src: java/org/apache/solr/core/CorePropertiesLocator.java test/org/apache/solr/core/TestSolrXmlPersistor.java Where is this 'officially defined one' according to JVM spec? 
Please give a reference, as the Reader API is just as well defined as the InputStream API. If we want consistency, I want the Reader one! Just so you know, I totally disagree with this. I refuse to use native2ascii. I think these commits banning the Reader API should be reverted. On Fri, Jul 12, 2013 at 4:53 AM, Uwe Schindler u...@thetaphi.de wrote: Yes, and be sure to do the opposite when reading properties files! Otherwise loading/saving is not consistent and exceptions may happen. I am working on forbidden-apis to ensure we are consistent everywhere. The binary properties file format is the officially defined one according to the JVM spec. Uwe - Uwe Schindler H.-H.-Meier-Allee 63, D-28213 Bremen http://www.thetaphi.de eMail: u...@thetaphi.de -Original Message- From: Alan Woodward [mailto:a...@flax.co.uk] Sent: Friday, July 12, 2013 10:47 AM To: dev@lucene.apache.org Subject: Re: svn
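The inconsistency being argued about here is easy to reproduce. The following standalone sketch (class and key names invented for illustration) stores through the Writer API, which writes non-ASCII characters as-is, then loads the resulting UTF-8 bytes through the InputStream API, which assumes ISO-8859-1:

```java
import java.io.*;
import java.nio.charset.StandardCharsets;
import java.util.Properties;

public class MixedModeBug {
    /** Store with the Writer API (raw chars, encoded here as UTF-8),
     *  then load with the InputStream API (which decodes ISO-8859-1). */
    static String mixedRoundTrip(String value) throws IOException {
        Properties in = new Properties();
        in.setProperty("k", value);
        StringWriter sw = new StringWriter();
        in.store(sw, null); // Writer API: non-ASCII characters written as-is, not escaped
        byte[] fileBytes = sw.toString().getBytes(StandardCharsets.UTF_8);

        Properties out = new Properties();
        out.load(new ByteArrayInputStream(fileBytes)); // InputStream API: ISO-8859-1
        return out.getProperty("k");
    }

    public static void main(String[] args) throws IOException {
        // The two UTF-8 bytes of 'ü' (0xC3 0xBC) come back as two separate
        // ISO-8859-1 characters instead of one 'ü' - classic mojibake.
        System.out.println(MixedModeBug.mixedRoundTrip("Müller"));
    }
}
```

Mixing the two APIs in one code path silently corrupts every non-ASCII value, which is what the forbidden-apis check was meant to prevent.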
[jira] [Commented] (SOLR-4997) The splitshard api doesn't call commit on new sub shards
[ https://issues.apache.org/jira/browse/SOLR-4997?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=13706899#comment-13706899 ] Erick Erickson commented on SOLR-4997: -- Shalin: I see there's a 4_4 branch already, are we sure it's on that branch too? Just don't want this to be lost in the shuffle. The splitshard api doesn't call commit on new sub shards Key: SOLR-4997 URL: https://issues.apache.org/jira/browse/SOLR-4997 Project: Solr Issue Type: Bug Components: SolrCloud Affects Versions: 4.3, 4.3.1 Reporter: Shalin Shekhar Mangar Assignee: Shalin Shekhar Mangar Fix For: 4.4 Attachments: SOLR-4997.patch, SOLR-4997.patch The splitshard api doesn't call commit on new sub shards but it happily sets them to active state which means on a successful split, the documents are not visible to searchers unless an explicit commit is called on the cluster. The coreadmin split api will still not call commit on targetCores. That is by design and we're not going to change that. -- This message is automatically generated by JIRA. If you think it was sent incorrectly, please contact your JIRA administrators For more information on JIRA, see: http://www.atlassian.com/software/jira - To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org For additional commands, e-mail: dev-h...@lucene.apache.org
[jira] [Updated] (SOLR-3633) web UI reports an error if CoreAdminHandler says there are no SolrCores
[ https://issues.apache.org/jira/browse/SOLR-3633?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Stefan Matheis (steffkes) updated SOLR-3633: Attachment: SOLR-3633.patch Updated Patch handles Core-Admin Page w/o any cores .. will add correct handling for Dropdown in the next Patch, hopefully coming shortly web UI reports an error if CoreAdminHandler says there are no SolrCores --- Key: SOLR-3633 URL: https://issues.apache.org/jira/browse/SOLR-3633 Project: Solr Issue Type: Bug Components: web gui Affects Versions: 4.0-ALPHA Reporter: Hoss Man Assignee: Stefan Matheis (steffkes) Fix For: 4.4 Attachments: SOLR-3633.patch, SOLR-3633.patch, SOLR-3633.patch, SOLR-3633.patch Spun off from SOLR-3591... * having no SolrCores is a valid situation * independent of what may happen in SOLR-3591, the web UI should cleanly deal with there being no SolrCores, and just hide/grey out any tabs that can't be supported w/o at least one core * even if there are no SolrCores the core admin features (ie: creating a new core) should be accessible in the UI
[jira] [Updated] (SOLR-3633) web UI reports an error if CoreAdminHandler says there are no SolrCores
[ https://issues.apache.org/jira/browse/SOLR-3633?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Stefan Matheis (steffkes) updated SOLR-3633: Attachment: SOLR-3633.patch Core-Selector is functional now, still needs a bit of tweaking on the layout side
[JENKINS] Lucene-Solr-Tests-trunk-Java7 - Build # 4136 - Still Failing
Build: https://builds.apache.org/job/Lucene-Solr-Tests-trunk-Java7/4136/ 2 tests failed. FAILED: junit.framework.TestSuite.org.apache.solr.cloud.AliasIntegrationTest Error Message: 1 thread leaked from SUITE scope at org.apache.solr.cloud.AliasIntegrationTest: 1) Thread[id=269, name=recoveryCmdExecutor-83-thread-1, state=RUNNABLE, group=TGRP-AliasIntegrationTest] at java.net.PlainSocketImpl.socketConnect(Native Method) at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:339) at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:200) at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:182) at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:391) at java.net.Socket.connect(Socket.java:579) at org.apache.http.conn.scheme.PlainSocketFactory.connectSocket(PlainSocketFactory.java:127) at org.apache.http.impl.conn.DefaultClientConnectionOperator.openConnection(DefaultClientConnectionOperator.java:180) at org.apache.http.impl.conn.ManagedClientConnectionImpl.open(ManagedClientConnectionImpl.java:294) at org.apache.http.impl.client.DefaultRequestDirector.tryConnect(DefaultRequestDirector.java:645) at org.apache.http.impl.client.DefaultRequestDirector.execute(DefaultRequestDirector.java:480) at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:906) at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:805) at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:784) at org.apache.solr.client.solrj.impl.HttpSolrServer.request(HttpSolrServer.java:365) at org.apache.solr.client.solrj.impl.HttpSolrServer.request(HttpSolrServer.java:180) at org.apache.solr.cloud.SyncStrategy$1.run(SyncStrategy.java:291) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) at java.lang.Thread.run(Thread.java:722) Stack Trace: 
com.carrotsearch.randomizedtesting.ThreadLeakError: 1 thread leaked from SUITE scope at org.apache.solr.cloud.AliasIntegrationTest: 1) Thread[id=269, name=recoveryCmdExecutor-83-thread-1, state=RUNNABLE, group=TGRP-AliasIntegrationTest] at java.net.PlainSocketImpl.socketConnect(Native Method) at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:339) at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:200) at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:182) at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:391) at java.net.Socket.connect(Socket.java:579) at org.apache.http.conn.scheme.PlainSocketFactory.connectSocket(PlainSocketFactory.java:127) at org.apache.http.impl.conn.DefaultClientConnectionOperator.openConnection(DefaultClientConnectionOperator.java:180) at org.apache.http.impl.conn.ManagedClientConnectionImpl.open(ManagedClientConnectionImpl.java:294) at org.apache.http.impl.client.DefaultRequestDirector.tryConnect(DefaultRequestDirector.java:645) at org.apache.http.impl.client.DefaultRequestDirector.execute(DefaultRequestDirector.java:480) at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:906) at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:805) at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:784) at org.apache.solr.client.solrj.impl.HttpSolrServer.request(HttpSolrServer.java:365) at org.apache.solr.client.solrj.impl.HttpSolrServer.request(HttpSolrServer.java:180) at org.apache.solr.cloud.SyncStrategy$1.run(SyncStrategy.java:291) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) at java.lang.Thread.run(Thread.java:722) at __randomizedtesting.SeedInfo.seed([B1EB53235E72476C]:0) FAILED: junit.framework.TestSuite.org.apache.solr.cloud.AliasIntegrationTest Error 
Message: There are still zombie threads that couldn't be terminated:1) Thread[id=269, name=recoveryCmdExecutor-83-thread-1, state=RUNNABLE, group=TGRP-AliasIntegrationTest] at java.net.PlainSocketImpl.socketConnect(Native Method) at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:339) at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:200) at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:182) at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:391) at
[jira] [Created] (LUCENE-5106) unban properties with unicode escapes
Robert Muir created LUCENE-5106: --- Summary: unban properties with unicode escapes Key: LUCENE-5106 URL: https://issues.apache.org/jira/browse/LUCENE-5106 Project: Lucene - Core Issue Type: Bug Affects Versions: 4.4 Reporter: Robert Muir Priority: Blocker As discussed on the mailing list, it's just wrong to ban the use of unicode here. This blocks 4.4 (because it was committed there, too)
[jira] [Updated] (LUCENE-5106) unban properties with unicode escapes
[ https://issues.apache.org/jira/browse/LUCENE-5106?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Robert Muir updated LUCENE-5106: Attachment: LUCENE-5106.patch
[jira] [Commented] (SOLR-5016) Spatial clustering/grouping
[ https://issues.apache.org/jira/browse/SOLR-5016?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=13706942#comment-13706942 ] Jeroen Steggink commented on SOLR-5016: --- Grid-based faceting is also a kind of clustering, though not dynamic but fixed, and would be a great addition to the spatial features. Region-based clustering could indeed be solved by field value faceting on terms. However, if it were based not on terms but on polygons it would be something different. I haven't yet tried to create facet queries based on BBOXes. Spatial clustering/grouping --- Key: SOLR-5016 URL: https://issues.apache.org/jira/browse/SOLR-5016 Project: Solr Issue Type: Wish Components: spatial Reporter: Jeroen Steggink Priority: Minor Labels: clustering, grouping, spatial Hi, It would be great if we could have some sort of spatial clustering/grouping of points for efficiently plotting them on a map. I could think of clustering based on the following parameters: - Based on regions: continents, countries, states, cities, etc; - A fixed number of clusters; - Radius, bbox, polygon The retrieved result would give the center of the cluster, the average location, or a polygon of the cluster. An example of a use case would be something like this: https://developers.google.com/maps/articles/toomanymarkers#markerclusterer Jeroen
[jira] [Commented] (LUCENE-5106) unban properties with unicode escapes
[ https://issues.apache.org/jira/browse/LUCENE-5106?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=13706975#comment-13706975 ] Uwe Schindler commented on LUCENE-5106: --- In my opinion, we should keep the current thing for 4.4 and use the new one for 4.5? That was my original plan!
[jira] [Commented] (LUCENE-5106) unban properties with unicode escapes
[ https://issues.apache.org/jira/browse/LUCENE-5106?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=13706978#comment-13706978 ] Robert Muir commented on LUCENE-5106: - As I stated, I disagree with the change of banning unicode and think it should be reverted. So I opened this issue.
[jira] [Commented] (LUCENE-5106) unban properties with unicode escapes
[ https://issues.apache.org/jira/browse/LUCENE-5106?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=13706981#comment-13706981 ] Uwe Schindler commented on LUCENE-5106: --- Ok, so what's your plan now? The idea was to ban *inconsistency* for 4.4. For 4.5 we have enough time to fix all code to *only* use Reader/Writer. If we apply your patch, one could add a mixed one again (also for 4.4) - so a similar crazy thing like the one in SOLR-4914: The commit done by [~romseygeek] was the worst thing one could do, writing with UTF-8 enabled, but reading only with unicode-escapes allowed. So for 4.4, for maximum compatibility, we use the currently committed one on the 4.4 branch only (only allowing consistent InputStream/OutputStream throughout the code!). And in 4.5 we only allow the UTF-8 one, Reader/Writer throughout the code! For forbidden-apis (the original forbidden-apis), I plan to allow both by default.
[jira] [Commented] (LUCENE-5106) unban properties with unicode escapes
[ https://issues.apache.org/jira/browse/LUCENE-5106?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=13706987#comment-13706987 ] Uwe Schindler commented on LUCENE-5106: --- Then you risk inconsistency, because your patch does not disallow InputStream/OutputStream.
[jira] [Commented] (LUCENE-5106) unban properties with unicode escapes
[ https://issues.apache.org/jira/browse/LUCENE-5106?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=13706986#comment-13706986 ] Robert Muir commented on LUCENE-5106: - This patch has all I want.
[jira] [Comment Edited] (LUCENE-5106) unban properties with unicode escapes
[ https://issues.apache.org/jira/browse/LUCENE-5106?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=13706981#comment-13706981 ] Uwe Schindler edited comment on LUCENE-5106 at 7/12/13 2:56 PM: Ok, so what's your plan now? The idea was to ban *inconsistency* for 4.4. For 4.5 we have enough time to fix all code to *only* use Reader/Writer. If we apply your patch, one could add a mixed one again (also for 4.4) - so a similar crazy thing like the one in SOLR-4914: The commit done by [~romseygeek] was the worst thing one could do, writing with UTF-8 enabled, but reading *only with unicode-escapes allowed*. So for 4.4, for maximum compatibility, we use the currently committed one on the 4.4 branch only (only allowing consistent InputStream/OutputStream throughout the code!). And in 4.5 we only allow the UTF-8 one, Reader/Writer throughout the code! This still allows reading files written by 4.4 and before, with unicode escapes, because files written by old Lucene/Solr code from 4.4 and earlier are still correctly decoded (the {{load(Reader)}} method decodes unicode escapes, too). In fact, files written by the InputStream API are US-ASCII only, with everything >127 escaped (see src.zip). For forbidden-apis (the original forbidden-apis), I plan to allow both by default.
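The claim that the OutputStream API produces US-ASCII-only output can be checked directly. A small sketch (class and key names invented for illustration; the uppercase-hex escape form matches the JDK's implementation, not a spec guarantee):

```java
import java.io.*;
import java.util.Properties;

public class AsciiOnlyStore {
    /** Store via the OutputStream API and return the raw file bytes. */
    static byte[] storeViaStream(String key, String value) throws IOException {
        Properties p = new Properties();
        p.setProperty(key, value);
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        p.store(bos, null); // OutputStream API: everything outside 0x20..0x7E is \uXXXX-escaped
        return bos.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        byte[] file = AsciiOnlyStore.storeViaStream("greeting", "Grüße");
        // Every byte is 7-bit ASCII; 'ü' and 'ß' appear as \u00FC and \u00DF.
        for (byte b : file) {
            if ((b & 0xFF) > 127) throw new AssertionError("non-ASCII byte in output");
        }
        System.out.println(new String(file, java.nio.charset.StandardCharsets.US_ASCII));
    }
}
```

Because such files contain only ASCII bytes, they are also valid UTF-8, which is what makes the 4.5 Reader-only plan backward compatible.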
[jira] [Commented] (LUCENE-5106) unban properties with unicode escapes
[ https://issues.apache.org/jira/browse/LUCENE-5106?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=13706988#comment-13706988 ] Robert Muir commented on LUCENE-5106: - I asked to revert the (wrong) change, you didn't do it, so I opened a blocker issue. It's really simple.
[jira] [Comment Edited] (LUCENE-5106) unban properties with unicode escapes
[ https://issues.apache.org/jira/browse/LUCENE-5106?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=13706987#comment-13706987 ] Uwe Schindler edited comment on LUCENE-5106 at 7/12/13 3:01 PM: Then you risk inconsistency, because your patch does not disallow InputStream/OutputStream. This is why I want the currently committed change for 4.4. Without it, it's a blocker, as [~romseygeek] could commit something similar again while backporting. *I don't want to ban unicode, I only want all this legacy code to be consistent*. was (Author: thetaphi): Then you risk inconsistency, because your patch does not disallow InputStream/OutputStream. This is why I want the currently committed change for 4.4. Without it, it's a blocker, as [~romseygeek] could commit something similar again. *I don't want to ban unicode, I only want all this legacy code to be consistent*.
[jira] [Commented] (LUCENE-5106) unban properties with unicode escapes
[ https://issues.apache.org/jira/browse/LUCENE-5106?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=13706991#comment-13706991 ] Uwe Schindler commented on LUCENE-5106: --- You don't understand the reason behind the change, do you? I DON'T WANT TO BAN UNICODE - I WANT TO ENFORCE CONSISTENCY AND CORRECTNESS OF THE EXISTING CODE! That's all. And because of that, this code will not be reverted by me, sorry.
[jira] [Created] (LUCENE-5107) Convert all Properties#store() and load() to use UTF-8 charset
Uwe Schindler created LUCENE-5107: - Summary: Convert all Properties#store() and load() to use UTF-8 charset Key: LUCENE-5107 URL: https://issues.apache.org/jira/browse/LUCENE-5107 Project: Lucene - Core Issue Type: Task Affects Versions: 4.4 Reporter: Uwe Schindler Priority: Blocker Fix For: 4.5 Followup of LUCENE-5106: This needs to be changed and the forbidden signatures changed to disallow InputStream/OutputStream and allow Reader/Writer only.
[jira] [Assigned] (LUCENE-5107) Convert all Properties#store() and load() to use UTF-8 charset
[ https://issues.apache.org/jira/browse/LUCENE-5107?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Uwe Schindler reassigned LUCENE-5107: - Assignee: Uwe Schindler
[jira] [Assigned] (LUCENE-5106) unban properties with unicode escapes
[ https://issues.apache.org/jira/browse/LUCENE-5106?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Uwe Schindler reassigned LUCENE-5106: - Assignee: Uwe Schindler
[jira] [Commented] (LUCENE-5106) unban properties with unicode escapes
[ https://issues.apache.org/jira/browse/LUCENE-5106?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=13706993#comment-13706993 ] Robert Muir commented on LUCENE-5106: - I'm not going to argue with you about this anymore, Uwe. I'm just asking you nicely to revert the change.
[jira] [Commented] (LUCENE-5107) Convert all Properties#store() and load() to use UTF-8 charset
[ https://issues.apache.org/jira/browse/LUCENE-5107?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=13706996#comment-13706996 ] Uwe Schindler commented on LUCENE-5107: --- In Lucene/Solr 4.5 we only allow UTF-8-encoded properties files, so Reader/Writer throughout the code! This still allows reading files written by 4.4 and before that contain unicode escapes (the load(Reader) method decodes unicode escapes, too). In fact, files written by the OutputStream API are US-ASCII only, with everything above 127 escaped (see src.zip), so they can also be loaded by a UTF-8 decoder; the change breaks no existing files.
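Uwe's compatibility claim can be checked against stock java.util.Properties (this is an illustrative sketch of JDK behavior, not code from the patch; the class and method names are invented here): store(OutputStream) escapes every character above \u007E as \uXXXX, so its output is pure US-ASCII and therefore also valid UTF-8, while load(Reader) decodes \uXXXX escapes as well.

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.StringReader;
import java.util.Properties;

public class PropertiesEscapeDemo {

    /** Stores one property via the OutputStream API and returns the text produced. */
    static String storeAscii(String key, String value) throws IOException {
        Properties props = new Properties();
        props.setProperty(key, value);
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        // store(OutputStream) escapes everything above \u007E as \uXXXX,
        // so the written bytes are pure US-ASCII.
        props.store(out, null);
        return out.toString("ISO-8859-1");
    }

    /** Loads the property back through the Reader API, which also decodes \uXXXX escapes. */
    static String loadBack(String text, String key) throws IOException {
        Properties props = new Properties();
        props.load(new StringReader(text));
        return props.getProperty(key);
    }

    public static void main(String[] args) throws IOException {
        String value = "gr\u00fc\u00df dich";  // non-ASCII: u-umlaut and sharp s
        String stored = storeAscii("greeting", value);
        // The non-ASCII characters appear only as \uXXXX escapes in the stored
        // text, and a round trip through load(Reader) restores them intact.
        System.out.println(stored.toLowerCase().contains("\\u00fc"));
        System.out.println(value.equals(loadBack(stored, "greeting")));
    }
}
```

Because the OutputStream output is ASCII-only, re-reading such legacy files through a UTF-8 Reader cannot hit malformed byte sequences, which is why switching to Reader/Writer breaks no existing files.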
[JENKINS-MAVEN] Lucene-Solr-Maven-4.x #384: POMs out of sync
Build: https://builds.apache.org/job/Lucene-Solr-Maven-4.x/384/

2 tests failed.

FAILED: org.apache.solr.cloud.BasicDistributedZkTest.org.apache.solr.cloud.BasicDistributedZkTest

Error Message:
1 thread leaked from SUITE scope at org.apache.solr.cloud.BasicDistributedZkTest:
  1) Thread[id=7722, name=recoveryCmdExecutor-4062-thread-1, state=RUNNABLE, group=TGRP-BasicDistributedZkTest]
    at java.net.PlainSocketImpl.socketConnect(Native Method)
    at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:327)
    at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:193)
    at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:180)
    at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:384)
    at java.net.Socket.connect(Socket.java:546)
    at org.apache.http.conn.scheme.PlainSocketFactory.connectSocket(PlainSocketFactory.java:127)
    at org.apache.http.impl.conn.DefaultClientConnectionOperator.openConnection(DefaultClientConnectionOperator.java:180)
    at org.apache.http.impl.conn.ManagedClientConnectionImpl.open(ManagedClientConnectionImpl.java:294)
    at org.apache.http.impl.client.DefaultRequestDirector.tryConnect(DefaultRequestDirector.java:645)
    at org.apache.http.impl.client.DefaultRequestDirector.execute(DefaultRequestDirector.java:480)
    at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:906)
    at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:805)
    at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:784)
    at org.apache.solr.client.solrj.impl.HttpSolrServer.request(HttpSolrServer.java:365)
    at org.apache.solr.client.solrj.impl.HttpSolrServer.request(HttpSolrServer.java:180)
    at org.apache.solr.cloud.SyncStrategy$1.run(SyncStrategy.java:291)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1146)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:679)

Stack Trace:
com.carrotsearch.randomizedtesting.ThreadLeakError: 1 thread leaked from SUITE scope at org.apache.solr.cloud.BasicDistributedZkTest (same leaked thread and stack frames as above)
  at __randomizedtesting.SeedInfo.seed([6F0C16A23337E6C6]:0)

FAILED: org.apache.solr.cloud.BasicDistributedZkTest.org.apache.solr.cloud.BasicDistributedZkTest

Error Message:
There are still zombie threads that couldn't be terminated:
  1) Thread[id=7722, name=recoveryCmdExecutor-4062-thread-1, state=RUNNABLE, group=TGRP-BasicDistributedZkTest]
    at java.net.PlainSocketImpl.socketConnect(Native Method)
    at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:327)
    at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:193)
    at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:180)
    at
[jira] [Comment Edited] (LUCENE-5107) Convert all Properties#store() and load() to use UTF-8 charset
[ https://issues.apache.org/jira/browse/LUCENE-5107?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=13706996#comment-13706996 ] Uwe Schindler edited comment on LUCENE-5107 at 7/12/13 3:17 PM: In Lucene/Solr 4.5 we only allow UTF-8-encoded properties files, so Reader/Writer throughout the code! This still allows reading files written by 4.4 and before that contain unicode escapes (the load(Reader) method decodes unicode escapes, too). In fact, files written by the OutputStream API are US-ASCII only, with everything above 127 escaped (see src.zip), so they can also be loaded by a UTF-8 decoder; the change breaks no existing files. was (Author: thetaphi): In Lucene/Solr 4.5 we only allow UTF-8-encoded properties files, so Reader/Writer throughout the code! This allows to still read files written by 4.4 and before with unicode-escapes are still correctly decoded (the load(Reader) method decodes unicode escapes, too). In fact, files written by the OutputStream API are US-ASCII only, with everything above 127 escaped (see src.zip), so they can also be loaded by a UTF-8 decoder; the change breaks no existing files.
[jira] [Assigned] (SOLR-5001) Add new Solr book to the book list and slideshow
[ https://issues.apache.org/jira/browse/SOLR-5001?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Steve Rowe reassigned SOLR-5001: Assignee: Steve Rowe Add new Solr book to the book list and slideshow Key: SOLR-5001 URL: https://issues.apache.org/jira/browse/SOLR-5001 Project: Solr Issue Type: Improvement Components: documentation Affects Versions: 4.3.1 Environment: https://lucene.apache.org/solr/ Reporter: Alexandre Rafalovitch Assignee: Steve Rowe Priority: Minor Fix For: 4.4 Attachments: book_s4index.jpg, SOLR-5001.patch A new Solr book came out from Packt. I am providing the patch to update the website pages corresponding to the slideshow on https://lucene.apache.org/solr/ and https://lucene.apache.org/solr/books.html . The patch has updates to html/text files and there is a binary image file as well.
[jira] [Commented] (SOLR-5001) Add new Solr book to the book list and slideshow
[ https://issues.apache.org/jira/browse/SOLR-5001?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=13707035#comment-13707035 ] Steve Rowe commented on SOLR-5001: -- [~arafalov], why did you remove one of the existing books? This seems extremely uncool without consent from the author(s) and/or community consensus.
[jira] [Commented] (LUCENE-5098) Broadword bit selection
[ https://issues.apache.org/jira/browse/LUCENE-5098?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=13707037#comment-13707037 ] Paul Elschot commented on LUCENE-5098: -- In the naive implementation above, the first shift happens before the first masked test. Does this miss the lowest bit? Not that it matters much... I'll provide a new patch with: * rank9 package private - users of IBM's J9 will know what to do, * longHex in ToStringUtils, and * extends LuceneTestCase. Broadword bit selection --- Key: LUCENE-5098 URL: https://issues.apache.org/jira/browse/LUCENE-5098 Project: Lucene - Core Issue Type: Improvement Components: core/other Reporter: Paul Elschot Assignee: Adrien Grand Priority: Minor Attachments: LUCENE-5098.patch
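The naive implementation Paul refers to lives in the attached patch and is not quoted in this thread, so purely as a hypothetical illustration of the order-of-operations question: a straightforward select-in-word loop has to test the lowest bit before shifting, otherwise bit 0 is silently skipped. The name naiveSelect is invented here; the out-of-range result 72 follows the convention used in broadword select implementations.

```java
public class NaiveSelect {

    /**
     * Returns the index (0..63) of the (r+1)-th set bit of x, or 72 when
     * fewer than r+1 bits are set (72 is the out-of-range marker used by
     * broadword select routines). Illustrative only.
     */
    static int naiveSelect(long x, int r) {
        for (int i = 0; i < 64; i++) {
            // Test the current lowest bit BEFORE shifting; shifting first
            // would throw away bit 0 without ever examining it.
            if ((x & 1L) != 0) {
                if (r == 0) {
                    return i;
                }
                r--;
            }
            x >>>= 1;
        }
        return 72;
    }

    public static void main(String[] args) {
        System.out.println(naiveSelect(0b1010L, 0)); // lowest set bit of 0b1010 is at index 1
        System.out.println(naiveSelect(1L, 0));      // bit 0 is found, not skipped
        System.out.println(naiveSelect(0L, 5));      // no bits set: 72
    }
}
```

Swapping the shift ahead of the masked test in the loop above would indeed make naiveSelect(1L, 0) miss bit 0, which is exactly the concern raised.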
[jira] [Commented] (SOLR-5001) Add new Solr book to the book list and slideshow
[ https://issues.apache.org/jira/browse/SOLR-5001?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=13707045#comment-13707045 ] Alexandre Rafalovitch commented on SOLR-5001: - In the scrolling part, right? I assume there is only layout space for two, but I could not test (no easy build instructions), so I took the oldest one out. If all three fit, that would be the best option. If that's controversial, feel free to ignore my book in the scrolling part and just use the patch for the book list.
[jira] [Updated] (LUCENE-5098) Broadword bit selection
[ https://issues.apache.org/jira/browse/LUCENE-5098?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Paul Elschot updated LUCENE-5098: - Attachment: LUCENE-5098.patch
[jira] [Commented] (SOLR-5001) Add new Solr book to the book list and slideshow
[ https://issues.apache.org/jira/browse/SOLR-5001?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=13707070#comment-13707070 ] Steve Rowe commented on SOLR-5001: -- bq. In the scrolling part, right? I assume there is only layout space for two, but could not test (no easy build instructions). There is theoretically a way to build locally, but I've never used it: [http://www.apache.org/dev/cmsref.html#faq-build-tools] I'll see if I can get all three book images displayed simultaneously - it looks like a fixed-width format, with room to spare on the right (in OS X Safari anyway).
[jira] [Updated] (LUCENE-5107) Convert all Properties#store() and load() to use UTF-8 charset
[ https://issues.apache.org/jira/browse/LUCENE-5107?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Uwe Schindler updated LUCENE-5107: -- Attachment: LUCENE-5107.patch Here is the patch, which preserves full backwards compatibility with properties files written by earlier Solr/Lucene versions. It now allows putting UTF-8 directly into properties files and no longer \u-escapes characters when writing out.
[jira] [Updated] (LUCENE-5107) Convert all Properties#store() and load() to use UTF-8 charset
[ https://issues.apache.org/jira/browse/LUCENE-5107?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Uwe Schindler updated LUCENE-5107: -- Affects Version/s: (was: 4.4)
[jira] [Commented] (LUCENE-5106) unban properties with unicode escapes
[ https://issues.apache.org/jira/browse/LUCENE-5106?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=13707081#comment-13707081 ] Uwe Schindler commented on LUCENE-5106: --- Hi Robert, to make us both happy, I suggest committing the patch from LUCENE-5107. It uses UTF-8, and when backported to Lucene 4.4, we are both happy: - You don't need to ban UTF-8 in LuSolr 4.4 - And I have the safety that loading/storing properties is consistent. LUCENE-5107 does not bring any compatibility issues, because properties files written using OutputStream are ASCII-only (Java escapes everything above 127), so they can be loaded as UTF-8 easily.
[jira] [Commented] (SOLR-5001) Add new Solr book to the book list and slideshow
[ https://issues.apache.org/jira/browse/SOLR-5001?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=13707084#comment-13707084 ] ASF subversion and git services commented on SOLR-5001: --- Commit 1502610 from [~steve_rowe] [ https://svn.apache.org/r1502610 ] SOLR-5001: add book - try putting three images on the book slide
[jira] [Commented] (SOLR-5001) Add new Solr book to the book list and slideshow
[ https://issues.apache.org/jira/browse/SOLR-5001?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=13707090#comment-13707090 ] Steve Rowe commented on SOLR-5001: -- All three books on the book slide look fine to me - you can see the result on the staging site: [http://lucene.staging.apache.org/solr/index.html]. I'll try looking at it with a few different browsers to see if there's a problem.
[jira] [Commented] (LUCENE-5107) Convert all Properties#store() and load() to use UTF-8 charset
[ https://issues.apache.org/jira/browse/LUCENE-5107?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=13707092#comment-13707092 ] ASF subversion and git services commented on LUCENE-5107: - Commit 1502615 from [~thetaphi] [ https://svn.apache.org/r1502615 ] LUCENE-5107: Properties files by Lucene are now written in UTF-8 encoding, Unicode is no longer escaped. Reading of legacy properties files with \u escapes is still possible
[jira] [Commented] (LUCENE-5107) Convert all Properties#store() and load() to use UTF-8 charset
[ https://issues.apache.org/jira/browse/LUCENE-5107?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=13707096#comment-13707096 ] ASF subversion and git services commented on LUCENE-5107: - Commit 1502622 from [~thetaphi] [ https://svn.apache.org/r1502622 ] Merged revision(s) 1502615 from lucene/dev/trunk: LUCENE-5107: Properties files by Lucene are now written in UTF-8 encoding, Unicode is no longer escaped. Reading of legacy properties files with \u escapes is still possible
[jira] [Commented] (SOLR-5001) Add new Solr book to the book list and slideshow
[ https://issues.apache.org/jira/browse/SOLR-5001?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=13707105#comment-13707105 ] Alexandre Rafalovitch commented on SOLR-5001: - Looks good on Windows 7 (IE, Chrome, Firefox). Actually IE (IE 10) has transition problems (overlapping images), but that does not seem to be related to this JIRA. Thank you for getting this done so fast. Add new Solr book to the book list and slideshow Key: SOLR-5001 URL: https://issues.apache.org/jira/browse/SOLR-5001 Project: Solr Issue Type: Improvement Components: documentation Affects Versions: 4.3.1 Environment: https://lucene.apache.org/solr/ Reporter: Alexandre Rafalovitch Assignee: Steve Rowe Priority: Minor Fix For: 4.4 Attachments: book_s4index.jpg, SOLR-5001.patch A new Solr book came out from Packt. I am providing the patch to update the website pages corresponding to slideshow on https://lucene.apache.org/solr/ and https://lucene.apache.org/solr/books.html . The patch has updates to html/text files and there is a binary image file as well. -- This message is automatically generated by JIRA. If you think it was sent incorrectly, please contact your JIRA administrators For more information on JIRA, see: http://www.atlassian.com/software/jira - To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org For additional commands, e-mail: dev-h...@lucene.apache.org
[jira] [Commented] (SOLR-5001) Add new Solr book to the book list and slideshow
[ https://issues.apache.org/jira/browse/SOLR-5001?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=13707108#comment-13707108 ] Steve Rowe commented on SOLR-5001: -- bq. Looks good on Windows 7 (IE, Chrome, Firefox). Looks fine for me too in IE 10 and Firefox 22 (both on Windows 7). bq. Thank you for getting this done so fast. No problem, sorry about the delay getting this published. Publishing shortly.
[jira] [Resolved] (SOLR-5001) Add new Solr book to the book list and slideshow
[ https://issues.apache.org/jira/browse/SOLR-5001?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Steve Rowe resolved SOLR-5001. -- Resolution: Implemented Published. For a couple of minutes after I published, the book cover image wasn't downloadable from production even though the rest of the site had changed, but now it is. Weird that the change wasn't atomic.
[jira] [Updated] (LUCENE-5107) Convert all Properties#store() and load() to use UTF-8 charset
[ https://issues.apache.org/jira/browse/LUCENE-5107?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Uwe Schindler updated LUCENE-5107: -- Fix Version/s: (was: 4.5) 4.4
[jira] [Updated] (LUCENE-5107) Convert all Properties#store() and load() to use UTF-8 charset
[ https://issues.apache.org/jira/browse/LUCENE-5107?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Uwe Schindler updated LUCENE-5107: -- Attachment: LUCENE-5107-4.4.patch Patch for 4.4 (as the code in Solr is a little different).
[jira] [Commented] (LUCENE-5107) Convert all Properties#store() and load() to use UTF-8 charset
[ https://issues.apache.org/jira/browse/LUCENE-5107?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=13707124#comment-13707124 ] ASF subversion and git services commented on LUCENE-5107: - Commit 1502632 from [~thetaphi] [ https://svn.apache.org/r1502632 ] Merged revision(s) 1502615 from lucene/dev/trunk: LUCENE-5107: Properties files by Lucene are now written in UTF-8 encoding, Unicode is no longer escaped. Reading of legacy properties files with \u escapes is still possible
[jira] [Resolved] (LUCENE-5107) Convert all Properties#store() and load() to use UTF-8 charset
[ https://issues.apache.org/jira/browse/LUCENE-5107?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Uwe Schindler resolved LUCENE-5107. --- Resolution: Fixed Committed to trunk, 4.x and 4.4
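The switch described in this issue, writing and reading properties through a UTF-8 Writer/Reader instead of raw streams, can be sketched as follows. This is an illustrative example, not the actual Lucene code; the class name and demo values are made up:

```java
import java.io.*;
import java.nio.charset.StandardCharsets;
import java.util.Properties;

public class Utf8Props {
    // Properties#store(OutputStream, ...) escapes characters outside Latin-1
    // as \uXXXX; store(Writer, ...) writes them through unchanged, so pairing
    // it with a UTF-8 OutputStreamWriter keeps the file human-readable.
    static byte[] store(Properties props, String comment) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        try (Writer w = new OutputStreamWriter(out, StandardCharsets.UTF_8)) {
            props.store(w, comment);
        }
        return out.toByteArray();
    }

    // load(Reader) still interprets legacy \uXXXX escapes, so old
    // escape-style properties files remain readable.
    static Properties load(byte[] bytes) throws IOException {
        Properties props = new Properties();
        try (Reader r = new InputStreamReader(
                new ByteArrayInputStream(bytes), StandardCharsets.UTF_8)) {
            props.load(r);
        }
        return props;
    }

    public static void main(String[] args) throws IOException {
        Properties p = new Properties();
        p.setProperty("price", "€10"); // U+20AC is outside Latin-1
        byte[] stored = store(p, "utf-8 demo");
        // The euro sign survives as literal UTF-8 bytes, not as \u20ac:
        System.out.println(new String(stored, StandardCharsets.UTF_8).contains("€10")); // true
        System.out.println(load(stored).getProperty("price")); // €10
    }
}
```

Round-tripping through the byte array shows both halves of the change: no escaping on write, and transparent UTF-8 decoding on read.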
[jira] [Resolved] (LUCENE-5106) unban properties with unicode escapes
[ https://issues.apache.org/jira/browse/LUCENE-5106?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Uwe Schindler resolved LUCENE-5106. --- Resolution: Fixed Fixed through LUCENE-5107. Sorry Robert - as always, it was fun fighting to finally find a good solution! :-) unban properties with unicode escapes - Key: LUCENE-5106 URL: https://issues.apache.org/jira/browse/LUCENE-5106 Project: Lucene - Core Issue Type: Bug Affects Versions: 4.4 Reporter: Robert Muir Assignee: Uwe Schindler Priority: Blocker Attachments: LUCENE-5106.patch As discussed on the mailing list, its just wrong to ban the use of unicode here. This blocks 4.4 (because it was committed there, too)
[jira] [Updated] (SOLR-5027) CollapsingQParserPlugin
[ https://issues.apache.org/jira/browse/SOLR-5027?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Joel Bernstein updated SOLR-5027: - Attachment: SOLR-5027.patch Added small initial test case. CollapsingQParserPlugin --- Key: SOLR-5027 URL: https://issues.apache.org/jira/browse/SOLR-5027 Project: Solr Issue Type: New Feature Components: search Affects Versions: 5.0 Reporter: Joel Bernstein Priority: Minor Attachments: SOLR-5027.patch, SOLR-5027.patch The CollapsingQParserPlugin is a PostFilter that performs field collapsing. This allows field collapsing to be done within the normal search flow. Initial syntax: fq={!collapse field=field_name} All documents in a group will be collapsed to the highest ranking document in the group.
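As a concrete (hypothetical) request, the filter-query syntax from the issue description would be attached to a normal search; the core name and field name below are illustrative, and the braces may need URL-encoding depending on the client:

```
http://localhost:8983/solr/collection1/select?q=*:*&fq={!collapse field=group_field}
```

Because it is a PostFilter, the collapse is applied to documents that have already matched the main query and the other filters.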
Re: Request for Mentor for LUCENE-2562 : Make Luke a Lucene/Solr Module
Great! I'd be glad to help get you going! I think that issue would have a very large effect if we could get it up to speed - it's a very popular tool and having it released with Lucene and always up to date with trunk would really help a lot of people. - Mark On Fri, Jul 12, 2013 at 1:13 AM, Ajay Bhat a.ajay.b...@gmail.com wrote: Hi all, I had attended the ASF-ICFOSS Mentoring Programme [1], 2013 in India conducted by Luciano Resende[2] and I've developed an interest in open source. I'd like to work on the JIRA Lucene 2562 - Make Luke a Lucene/Solr Module [3] as a project. I've checked out the Lucene trunk and taken a look at the work Mark Miller has done on it. If no one has taken it up, I'm willing to work on it after submitting a formal project proposal after I get a mentor. I'd like some technical guidance on Lucene from the community as well. [1] http://community.apache.org/mentoringprogramme-icfoss-pilot.html [2] http://people.apache.org/~lresende [3] https://issues.apache.org/jira/browse/LUCENE-2562 Thanks and regards, Ajay -- - Mark
[jira] [Commented] (LUCENE-5098) Broadword bit selection
[ https://issues.apache.org/jira/browse/LUCENE-5098?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=13707191#comment-13707191 ] Dawid Weiss commented on LUCENE-5098: - Indeed, it's a regression from an earlier implementation where I used Long.rotateRight. It won't make a difference, but well spotted. Broadword bit selection --- Key: LUCENE-5098 URL: https://issues.apache.org/jira/browse/LUCENE-5098 Project: Lucene - Core Issue Type: Improvement Components: core/other Reporter: Paul Elschot Assignee: Adrien Grand Priority: Minor Attachments: LUCENE-5098.patch, LUCENE-5098.patch
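For readers unfamiliar with the method mentioned in the comment above: Long.rotateRight re-inserts the bits shifted off the low end at the high end, unlike an unsigned shift, which discards them. A small illustrative sketch (the class name and values are made up, not taken from the patch):

```java
public class RotateDemo {
    // Equivalent to Long.rotateRight for 0 < k < 64: the bits shifted off
    // the low end are re-inserted at the high end instead of being dropped.
    static long rotateRightByHand(long x, int k) {
        return (x >>> k) | (x << (64 - k));
    }

    public static void main(String[] args) {
        // The lone low bit wraps around to bit 60 instead of vanishing:
        System.out.println(Long.toHexString(Long.rotateRight(0x1L, 4))); // 1000000000000000
        long x = 0xDEADBEEFCAFEBABEL;
        System.out.println(Long.rotateRight(x, 17) == rotateRightByHand(x, 17)); // true
    }
}
```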
[jira] [Created] (LUCENE-5108) poll-mirrors.pl should let you check multiple paths on each mirror
Hoss Man created LUCENE-5108: Summary: poll-mirrors.pl should let you check multiple paths on each mirror Key: LUCENE-5108 URL: https://issues.apache.org/jira/browse/LUCENE-5108 Project: Lucene - Core Issue Type: Improvement Reporter: Hoss Man Idea spun off from LUCENE-5104 based on sarowe's experience dealing with mirrors that are only partially updated (this is all very doable, but requires some major rewriting of the logic flow in the script, so i wanted to break it out into its own issue)... {quote} There's a separate (but maybe related to what you want to do here) issue with poll-mirrors.pl - when Shalin did the 4.3.1 release, he didn't upload all of the artifacts at once, and as a result, the script reported that the release was on all mirrors, even though some parts weren't there yet, rendering the information useless. Maybe the script could take one or more suffixes, so that it could find any number of things on each mirror, and report how many mirrors have *all* of them? {quote} ... {quote} bq. does it really matter what percentage have X and Y? or just what percentage have X? what percentage have Y? The way I've used that script, the question has been: Can I announce that the release is available? This is answered when all parts of the release are downloadable from some threshold percentage of mirrors, thus what percentage have X AND Y. As you say, though, this could be performed by running the script in multiple terminals with different paths. One goal of the script, though, was having just one place to go to get the answer to the question (thus lumping Maven in there too). Maybe the script could be (eventually - shouldn't block the nice changes you've made here) changed to allow multiple -path options, and print a number instead of a . for presence or X for absence, representing how many of the files are downloadable at each mirror: 0, 3, etc. {quote} -- This message is automatically generated by JIRA. 
If you think it was sent incorrectly, please contact your JIRA administrators For more information on JIRA, see: http://www.atlassian.com/software/jira - To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org For additional commands, e-mail: dev-h...@lucene.apache.org
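The aggregation the quoted discussion asks for ("how many mirrors have *all* of them?") boils down to a per-mirror containment check. A minimal Java sketch of just that counting step, with made-up mirror names and paths (the real script is Perl and does the HTTP polling; that part is omitted here):

```java
import java.util.*;

public class MirrorCoverage {
    // Given, per mirror, the set of release paths actually found there,
    // count the mirrors that carry *all* required paths -- i.e. answer
    // the "can I announce the release?" question from the discussion.
    static long mirrorsWithAllPaths(Map<String, Set<String>> found,
                                    Set<String> required) {
        return found.values().stream()
                .filter(paths -> paths.containsAll(required))
                .count();
    }

    public static void main(String[] args) {
        Set<String> required = Set.of("lucene/4.3.1", "solr/4.3.1");
        Map<String, Set<String>> found = new HashMap<>();
        found.put("mirror-a", Set.of("lucene/4.3.1", "solr/4.3.1")); // complete
        found.put("mirror-b", Set.of("lucene/4.3.1"));               // partial upload
        System.out.println(mirrorsWithAllPaths(found, required)); // 1
    }
}
```

With a partially uploaded mirror in the mix, only the fully stocked mirror counts, which is exactly the stricter report the issue proposes.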
[jira] [Commented] (LUCENE-5104) poll-mirrors.pl needs fixed
[ https://issues.apache.org/jira/browse/LUCENE-5104?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=13707198#comment-13707198 ] Hoss Man commented on LUCENE-5104: -- bq. One thing I noticed: your $usage includes -V but doesn't mention -details - I'm guessing you renamed the option but didn't change the $usage? Good eye ... yeah, originally i named -details -Verbose but the way we are using GetOpt it does case insensitive short form args, so trying to use -v 4.3.1 would error that -v was too vague (it didn't know if you wanted -verbose or -version) and i was too lazy to completely revamp the arg parsing to use something more sophisticated. bq. The way I've used that script, the question has been: Can I announce that the release is available? This is answered when all parts of the release are downloadable ... I hear you, and i have some ideas of how to make it work better for what you're describing, but it means gutting most of how poll-mirrors.pl works now, so i split that off into LUCENE-5108 poll-mirrors.pl needs fixed --- Key: LUCENE-5104 URL: https://issues.apache.org/jira/browse/LUCENE-5104 Project: Lucene - Core Issue Type: Bug Reporter: Hoss Man Fix For: 4.4 Attachments: LUCENE-5104.patch, LUCENE-5104.patch i just noticed that poll-mirrors.pl is set up to look for the KEYS file in the release dir on each mirror -- Infra (wisely) tweaked the way mirroring happens recently to ensure that KEYS files are *not* mirrored anymore (presumably to help catch bad links advising people to download untrusted KEYS files) we're going to need to update poll-mirrors.pl to look for something else in each release dir ... changes/Changes.html perhaps? -- This message is automatically generated by JIRA. 
If you think it was sent incorrectly, please contact your JIRA administrators For more information on JIRA, see: http://www.atlassian.com/software/jira - To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org For additional commands, e-mail: dev-h...@lucene.apache.org
[jira] [Commented] (LUCENE-5104) poll-mirrors.pl needs fixed
[ https://issues.apache.org/jira/browse/LUCENE-5104?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=13707207#comment-13707207 ] Steve Rowe commented on LUCENE-5104: bq. it means gutting most of how poll-mirrors.pl works now, so i split that off into LUCENE-5108 Thanks Hoss.
[jira] [Commented] (SOLR-4888) setup admin edit perms for PMC/committers in SOLR CWIKI
[ https://issues.apache.org/jira/browse/SOLR-4888?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=13707208#comment-13707208 ] Hoss Man commented on SOLR-4888: I've added dsmiley to solr-committers and solr-admins setup admin edit perms for PMC/committers in SOLR CWIKI - Key: SOLR-4888 URL: https://issues.apache.org/jira/browse/SOLR-4888 Project: Solr Issue Type: Sub-task Components: documentation Reporter: Hoss Man Assignee: Hoss Man In order to ramp up on maintaining the newly donated Solr Ref Guide in confluence, we need to ramp up on creating confluence accounts for committers and getting them into the appropriate confluence groups so they have edit permissions... For Committers... * If you are interested in helping to maintain the [Solr Reference Guide|https://cwiki.apache.org/confluence/display/solr/] you need to [Create a CWIKI Account|https://cwiki.apache.org/confluence/signup.action] * Once you have an account, please post your account name as a comment in this jira issue. * if you are a PMC member and you are willing to volunteer to [help maintain CWIKI|https://cwiki.apache.org/confluence/display/INFRA/Cwiki#Cwiki-StandardGroups] (notably: managing groups of users) please mention in your jira comment that you are willing to be a confluence admin For Current Confluence Admins... 
* All committers who volunteer should be added to the [solr-committers|https://cwiki.apache.org/confluence/admin/users/domembersofgroupsearch.action?membersOfGroupTerm=solr-committers] confluence group (which has edit permissions for the solr space) * all volunteers who are also PMC members should also be added to the [solr-admins|https://cwiki.apache.org/confluence/admin/users/domembersofgroupsearch.action?membersOfGroupTerm=solr-admins] confluence group (which also has space admin permissions for the solr space) * volunteers who are PMC members who are also volunteering to be confluence admins should also be added to the [confluence-administrators|https://cwiki.apache.org/confluence/admin/users/domembersofgroupsearch.action?membersOfGroupTerm=confluence-administrators] group The ongoing process of dealing with this has now been documented: https://cwiki.apache.org/confluence/display/solr/Internal+-+CWIKI+ACLs
[jira] [Commented] (SOLR-4888) setup admin edit perms for PMC/committers in SOLR CWIKI
[ https://issues.apache.org/jira/browse/SOLR-4888?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=13707211#comment-13707211 ] Hoss Man commented on SOLR-4888: Now that: * we have a non-trivial number of PMC members as confluence admins who can handle ACL requests * the wiki is up and running and we're in ongoing mode (not initial setup mode) * the ACL process has been documented: https://cwiki.apache.org/confluence/display/solr/Internal+-+CWIKI+ACLs ...commenting in this jira to request access is retired ... let's move on with the ongoing process of dealing with this [as documented|https://cwiki.apache.org/confluence/display/solr/Internal+-+CWIKI+ACLs]
[jira] [Commented] (LUCENE-5104) poll-mirrors.pl needs fixed
[ https://issues.apache.org/jira/browse/LUCENE-5104?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=13707214#comment-13707214 ] ASF subversion and git services commented on LUCENE-5104: - Commit 1502646 from hoss...@apache.org [ https://svn.apache.org/r1502646 ] LUCENE-5104: fix poll-mirrors.pl: a) use HEAD; b) look for Changes.html instead of KEYS; c) support looking for arbitrary paths
[jira] [Commented] (LUCENE-5104) poll-mirrors.pl needs fixed
[ https://issues.apache.org/jira/browse/LUCENE-5104?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=13707217#comment-13707217 ] ASF subversion and git services commented on LUCENE-5104: - Commit 1502649 from hoss...@apache.org [ https://svn.apache.org/r1502649 ] LUCENE-5104: fix poll-mirrors.pl: a) use HEAD; b) look for Changes.html instead of KEYS; c) support looking for arbitrary paths (merge r1502646)
[jira] [Resolved] (LUCENE-5104) poll-mirrors.pl needs fixed
[ https://issues.apache.org/jira/browse/LUCENE-5104?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Hoss Man resolved LUCENE-5104. -- Resolution: Fixed Fix Version/s: 4.5 5.0 Assignee: Hoss Man Committed revision 1502646. Committed revision 1502647. Committed revision 1502649.
[jira] [Commented] (LUCENE-5104) poll-mirrors.pl needs fixed
[ https://issues.apache.org/jira/browse/LUCENE-5104?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=13707216#comment-13707216 ] ASF subversion and git services commented on LUCENE-5104: - Commit 1502647 from hoss...@apache.org [ https://svn.apache.org/r1502647 ] LUCENE-5104: fix poll-mirrors.pl: a) use HEAD; b) look for Changes.html instead of KEYS; c) support looking for arbitrary paths (merge r1502646)
[JENKINS] Lucene-Solr-Tests-4.x-Java6 - Build # 1788 - Failure
Build: https://builds.apache.org/job/Lucene-Solr-Tests-4.x-Java6/1788/

2 tests failed.

FAILED: junit.framework.TestSuite.org.apache.solr.cloud.BasicDistributedZkTest

Error Message:
1 thread leaked from SUITE scope at org.apache.solr.cloud.BasicDistributedZkTest:
   1) Thread[id=3120, name=recoveryCmdExecutor-1457-thread-1, state=RUNNABLE, group=TGRP-BasicDistributedZkTest]
        at java.net.PlainSocketImpl.socketConnect(Native Method)
        at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:327)
        at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:193)
        at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:180)
        at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:384)
        at java.net.Socket.connect(Socket.java:546)
        at org.apache.http.conn.scheme.PlainSocketFactory.connectSocket(PlainSocketFactory.java:127)
        at org.apache.http.impl.conn.DefaultClientConnectionOperator.openConnection(DefaultClientConnectionOperator.java:180)
        at org.apache.http.impl.conn.ManagedClientConnectionImpl.open(ManagedClientConnectionImpl.java:294)
        at org.apache.http.impl.client.DefaultRequestDirector.tryConnect(DefaultRequestDirector.java:645)
        at org.apache.http.impl.client.DefaultRequestDirector.execute(DefaultRequestDirector.java:480)
        at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:906)
        at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:805)
        at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:784)
        at org.apache.solr.client.solrj.impl.HttpSolrServer.request(HttpSolrServer.java:365)
        at org.apache.solr.client.solrj.impl.HttpSolrServer.request(HttpSolrServer.java:180)
        at org.apache.solr.cloud.SyncStrategy$1.run(SyncStrategy.java:291)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1146)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:679)

Stack Trace:
com.carrotsearch.randomizedtesting.ThreadLeakError: 1 thread leaked from SUITE scope at org.apache.solr.cloud.BasicDistributedZkTest:
   1) Thread[id=3120, name=recoveryCmdExecutor-1457-thread-1, state=RUNNABLE, group=TGRP-BasicDistributedZkTest]
        at java.net.PlainSocketImpl.socketConnect(Native Method)
        at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:327)
        at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:193)
        at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:180)
        at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:384)
        at java.net.Socket.connect(Socket.java:546)
        at org.apache.http.conn.scheme.PlainSocketFactory.connectSocket(PlainSocketFactory.java:127)
        at org.apache.http.impl.conn.DefaultClientConnectionOperator.openConnection(DefaultClientConnectionOperator.java:180)
        at org.apache.http.impl.conn.ManagedClientConnectionImpl.open(ManagedClientConnectionImpl.java:294)
        at org.apache.http.impl.client.DefaultRequestDirector.tryConnect(DefaultRequestDirector.java:645)
        at org.apache.http.impl.client.DefaultRequestDirector.execute(DefaultRequestDirector.java:480)
        at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:906)
        at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:805)
        at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:784)
        at org.apache.solr.client.solrj.impl.HttpSolrServer.request(HttpSolrServer.java:365)
        at org.apache.solr.client.solrj.impl.HttpSolrServer.request(HttpSolrServer.java:180)
        at org.apache.solr.cloud.SyncStrategy$1.run(SyncStrategy.java:291)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1146)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:679)
        at __randomizedtesting.SeedInfo.seed([742373F939F465B9]:0)

FAILED: junit.framework.TestSuite.org.apache.solr.cloud.BasicDistributedZkTest

Error Message:
There are still zombie threads that couldn't be terminated:
   1) Thread[id=3120, name=recoveryCmdExecutor-1457-thread-1, state=RUNNABLE, group=TGRP-BasicDistributedZkTest]
        at java.net.PlainSocketImpl.socketConnect(Native Method)
        at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:327)
        at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:193)
        at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:180)
        at
[jira] [Updated] (LUCENE-2750) add Kamikaze 3.0.1 into Lucene
[ https://issues.apache.org/jira/browse/LUCENE-2750?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Adrien Grand updated LUCENE-2750: - Attachment: LUCENE-2750.patch I wrote an implementation of a PForDeltaDocIdSet based on the ones in Kamikaze and D. Lemire's JavaFastPFOR (both are licensed under the ASL 2.0). Unlike the original implementation, it uses FOR to encode exceptions (this was easier given that we already have lots of utility methods to pack integers). add Kamikaze 3.0.1 into Lucene -- Key: LUCENE-2750 URL: https://issues.apache.org/jira/browse/LUCENE-2750 Project: Lucene - Core Issue Type: Sub-task Components: modules/other Reporter: hao yan Assignee: Adrien Grand Attachments: LUCENE-2750.patch Original Estimate: 336h Remaining Estimate: 336h Kamikaze 3.0.1 is the updated version of Kamikaze 2.0.0. It can achieve significantly better performance than Kamikaze 2.0.0 in terms of both compressed size and decompression speed. The main difference between the two versions is that Kamikaze 3.0.x uses a much more efficient implementation of the PForDelta compression algorithm. My goal is to integrate the highly efficient PForDelta implementation into Lucene Codec. -- This message is automatically generated by JIRA. If you think it was sent incorrectly, please contact your JIRA administrators For more information on JIRA, see: http://www.atlassian.com/software/jira - To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org For additional commands, e-mail: dev-h...@lucene.apache.org
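[Editor's note] The FOR (frame-of-reference) encoding mentioned above amounts to packing each integer at a fixed bit width. The following is a minimal illustrative sketch of that packing, not the patch's code and not Lucene's PackedInts API; class and method names are hypothetical:

```java
// Hypothetical sketch of fixed-width bit packing (frame of reference).
// Assumes all values are non-negative and fit in `bitsPerValue` bits.
public final class ForBlock {

  /** Pack `values` into longs, `bitsPerValue` bits each, little-endian within words. */
  static long[] encode(int[] values, int bitsPerValue) {
    long[] packed = new long[(values.length * bitsPerValue + 63) / 64];
    for (int i = 0; i < values.length; i++) {
      int bitPos = i * bitsPerValue;
      packed[bitPos >> 6] |= (long) values[i] << (bitPos & 63);
      int spill = (bitPos & 63) + bitsPerValue - 64; // bits overflowing into the next word
      if (spill > 0) {
        packed[(bitPos >> 6) + 1] |= values[i] >>> (bitsPerValue - spill);
      }
    }
    return packed;
  }

  /** Inverse of encode: read `count` values of `bitsPerValue` bits each. */
  static int[] decode(long[] packed, int count, int bitsPerValue) {
    int[] out = new int[count];
    long mask = (1L << bitsPerValue) - 1;
    for (int i = 0; i < count; i++) {
      int bitPos = i * bitsPerValue;
      long v = packed[bitPos >> 6] >>> (bitPos & 63);
      int spill = (bitPos & 63) + bitsPerValue - 64;
      if (spill > 0) {
        v |= packed[(bitPos >> 6) + 1] << (bitsPerValue - spill);
      }
      out[i] = (int) (v & mask);
    }
    return out;
  }
}
```

The appeal of FOR over patched exceptions is exactly this uniformity: one bit width per block, so decoding is a branch-light loop of shifts and masks.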
[jira] [Commented] (LUCENE-5098) Broadword bit selection
[ https://issues.apache.org/jira/browse/LUCENE-5098?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=13707253#comment-13707253 ] Dawid Weiss commented on LUCENE-5098: - Looks good to me (there's a likely unused import but it can be fixed later). Very cool, btw. Broadword bit selection --- Key: LUCENE-5098 URL: https://issues.apache.org/jira/browse/LUCENE-5098 Project: Lucene - Core Issue Type: Improvement Components: core/other Reporter: Paul Elschot Assignee: Adrien Grand Priority: Minor Attachments: LUCENE-5098.patch, LUCENE-5098.patch
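[Editor's note] "Bit selection" here means finding the position of the k-th set bit in a 64-bit word. The patch under review uses branch-free broadword (SWAR) arithmetic; the sketch below shows the same operation via a simpler popcount-halving search, using 72 as the "not found" sentinel as in the broadword formulation. Names are illustrative, not the patch's API:

```java
// Illustrative select-in-word: not the broadword implementation from the patch,
// but the same contract, expressed with Long.bitCount for clarity.
public final class BitSelect {

  /** Position of the (rank+1)-th set bit of x (rank is 0-based), or 72 if x has fewer set bits. */
  public static int select(long x, int rank) {
    int pos = 0;
    // Halve the search window: if the low half holds at most `rank` set bits,
    // the target lies in the high half; shift it down and adjust rank/pos.
    for (int width = 32; width >= 1; width >>= 1) {
      long mask = (1L << width) - 1; // low `width` bits (width <= 32 here)
      int count = Long.bitCount(x & mask);
      if (rank >= count) {
        rank -= count;
        x >>>= width;
        pos += width;
      }
    }
    return ((x & 1L) != 0 && rank == 0) ? pos : 72;
  }
}
```

A broadword version replaces the loop with constant-time multiplications and comparisons on byte lanes, which is what makes it attractive inside tight decoding loops such as Elias-Fano advance().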
[jira] [Commented] (LUCENE-5101) make it easier to plugin different bitset implementations to CachingWrapperFilter
[ https://issues.apache.org/jira/browse/LUCENE-5101?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=13707268#comment-13707268 ] Adrien Grand commented on LUCENE-5101: -- A quick note about the alternative DocIdSet implementations we now have. I wrote a benchmark (attached) to see how they compare to FixedBitSet; you can look at the results here: http://people.apache.org/~jpountz/doc_id_sets.html Please note that EliasFanoDocIdSet is disadvantaged for advance() since it doesn't have an index yet; it will be interesting to run this benchmark again when it gets one. Maybe we could use these numbers to pick better defaults in CWF? (and only use FixedBitSet for dense sets, for example) make it easier to plugin different bitset implementations to CachingWrapperFilter - Key: LUCENE-5101 URL: https://issues.apache.org/jira/browse/LUCENE-5101 Project: Lucene - Core Issue Type: Improvement Reporter: Robert Muir Currently this is possible, but it's not so friendly:

{code}
protected DocIdSet docIdSetToCache(DocIdSet docIdSet, AtomicReader reader) throws IOException {
  if (docIdSet == null) {
    // this is better than returning null, as the nonnull result can be cached
    return EMPTY_DOCIDSET;
  } else if (docIdSet.isCacheable()) {
    return docIdSet;
  } else {
    final DocIdSetIterator it = docIdSet.iterator();
    // null is allowed to be returned by iterator(),
    // in this case we wrap with the sentinel set,
    // which is cacheable.
    if (it == null) {
      return EMPTY_DOCIDSET;
    } else {
      /* INTERESTING PART */
      final FixedBitSet bits = new FixedBitSet(reader.maxDoc());
      bits.or(it);
      return bits;
      /* END INTERESTING PART */
    }
  }
}
{code}

Is there any value to having all this other logic in the protected API? It seems like something that's not useful for a subclass...
Maybe this stuff can become final, and INTERESTING PART calls a simpler method, something like:

{code}
protected DocIdSet cacheImpl(DocIdSetIterator iterator, AtomicReader reader) {
  final FixedBitSet bits = new FixedBitSet(reader.maxDoc());
  bits.or(iterator);
  return bits;
}
{code}
[JENKINS] Lucene-Solr-Tests-4.x-Java6 - Build # 1789 - Still Failing
Build: https://builds.apache.org/job/Lucene-Solr-Tests-4.x-Java6/1789/

2 tests failed.

FAILED: junit.framework.TestSuite.org.apache.solr.cloud.BasicDistributedZkTest

Error Message:
1 thread leaked from SUITE scope at org.apache.solr.cloud.BasicDistributedZkTest:
   1) Thread[id=1262, name=recoveryCmdExecutor-565-thread-1, state=RUNNABLE, group=TGRP-BasicDistributedZkTest]
        at java.net.PlainSocketImpl.socketConnect(Native Method)
        at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:327)
        at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:193)
        at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:180)
        at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:384)
        at java.net.Socket.connect(Socket.java:546)
        at org.apache.http.conn.scheme.PlainSocketFactory.connectSocket(PlainSocketFactory.java:127)
        at org.apache.http.impl.conn.DefaultClientConnectionOperator.openConnection(DefaultClientConnectionOperator.java:180)
        at org.apache.http.impl.conn.ManagedClientConnectionImpl.open(ManagedClientConnectionImpl.java:294)
        at org.apache.http.impl.client.DefaultRequestDirector.tryConnect(DefaultRequestDirector.java:645)
        at org.apache.http.impl.client.DefaultRequestDirector.execute(DefaultRequestDirector.java:480)
        at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:906)
        at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:805)
        at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:784)
        at org.apache.solr.client.solrj.impl.HttpSolrServer.request(HttpSolrServer.java:365)
        at org.apache.solr.client.solrj.impl.HttpSolrServer.request(HttpSolrServer.java:180)
        at org.apache.solr.cloud.SyncStrategy$1.run(SyncStrategy.java:291)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1146)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:679)

Stack Trace:
com.carrotsearch.randomizedtesting.ThreadLeakError: 1 thread leaked from SUITE scope at org.apache.solr.cloud.BasicDistributedZkTest:
   1) Thread[id=1262, name=recoveryCmdExecutor-565-thread-1, state=RUNNABLE, group=TGRP-BasicDistributedZkTest]
        at java.net.PlainSocketImpl.socketConnect(Native Method)
        at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:327)
        at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:193)
        at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:180)
        at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:384)
        at java.net.Socket.connect(Socket.java:546)
        at org.apache.http.conn.scheme.PlainSocketFactory.connectSocket(PlainSocketFactory.java:127)
        at org.apache.http.impl.conn.DefaultClientConnectionOperator.openConnection(DefaultClientConnectionOperator.java:180)
        at org.apache.http.impl.conn.ManagedClientConnectionImpl.open(ManagedClientConnectionImpl.java:294)
        at org.apache.http.impl.client.DefaultRequestDirector.tryConnect(DefaultRequestDirector.java:645)
        at org.apache.http.impl.client.DefaultRequestDirector.execute(DefaultRequestDirector.java:480)
        at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:906)
        at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:805)
        at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:784)
        at org.apache.solr.client.solrj.impl.HttpSolrServer.request(HttpSolrServer.java:365)
        at org.apache.solr.client.solrj.impl.HttpSolrServer.request(HttpSolrServer.java:180)
        at org.apache.solr.cloud.SyncStrategy$1.run(SyncStrategy.java:291)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1146)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:679)
        at __randomizedtesting.SeedInfo.seed([C93D2BE45B15D47]:0)

FAILED: junit.framework.TestSuite.org.apache.solr.cloud.BasicDistributedZkTest

Error Message:
There are still zombie threads that couldn't be terminated:
   1) Thread[id=1262, name=recoveryCmdExecutor-565-thread-1, state=RUNNABLE, group=TGRP-BasicDistributedZkTest]
        at java.net.PlainSocketImpl.socketConnect(Native Method)
        at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:327)
        at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:193)
        at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:180)
        at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:384)
[jira] [Commented] (LUCENE-5098) Broadword bit selection
[ https://issues.apache.org/jira/browse/LUCENE-5098?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=13707372#comment-13707372 ] ASF subversion and git services commented on LUCENE-5098: - Commit 1502690 from [~jpountz] [ https://svn.apache.org/r1502690 ] LUCENE-5098: Broadword utility methods. Broadword bit selection --- Key: LUCENE-5098 URL: https://issues.apache.org/jira/browse/LUCENE-5098 Project: Lucene - Core Issue Type: Improvement Components: core/other Reporter: Paul Elschot Assignee: Adrien Grand Priority: Minor Attachments: LUCENE-5098.patch, LUCENE-5098.patch
[jira] [Created] (LUCENE-5109) EliasFano value index
Paul Elschot created LUCENE-5109: Summary: EliasFano value index Key: LUCENE-5109 URL: https://issues.apache.org/jira/browse/LUCENE-5109 Project: Lucene - Core Issue Type: Improvement Components: core/other Reporter: Paul Elschot Priority: Minor Index upper bits of Elias-Fano sequence.
[jira] [Commented] (LUCENE-5098) Broadword bit selection
[ https://issues.apache.org/jira/browse/LUCENE-5098?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=13707375#comment-13707375 ] ASF subversion and git services commented on LUCENE-5098: - Commit 1502691 from [~jpountz] [ https://svn.apache.org/r1502691 ] LUCENE-5098: Broadword utility methods (merged from r1502690). Broadword bit selection --- Key: LUCENE-5098 URL: https://issues.apache.org/jira/browse/LUCENE-5098 Project: Lucene - Core Issue Type: Improvement Components: core/other Reporter: Paul Elschot Assignee: Adrien Grand Priority: Minor Attachments: LUCENE-5098.patch, LUCENE-5098.patch
[jira] [Updated] (LUCENE-5109) EliasFano value index
[ https://issues.apache.org/jira/browse/LUCENE-5109?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Paul Elschot updated LUCENE-5109: - Attachment: LUCENE-5109.patch Mostly untested, not committable. EliasFano value index - Key: LUCENE-5109 URL: https://issues.apache.org/jira/browse/LUCENE-5109 Project: Lucene - Core Issue Type: Improvement Components: core/other Reporter: Paul Elschot Priority: Minor Attachments: LUCENE-5109.patch Index upper bits of Elias-Fano sequence.
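[Editor's note] To make the "index upper bits" idea concrete: in an Elias-Fano sequence, each monotone value splits into a fixed number of low bits plus one set bit in a unary-coded upper-bits stream, and reading the i-th value means selecting the i-th set bit in that stream. The proposed index exists to avoid scanning for that bit. A rough, hypothetical sketch (not the patch's classes; the lower bits are left unpacked for clarity):

```java
import java.util.BitSet;

// Hypothetical Elias-Fano layout sketch for a non-decreasing sequence of longs.
public final class EliasFanoSketch {
  final int lowBits;
  final long[] lower;              // low `lowBits` bits of each value, unpacked for clarity
  final BitSet upper = new BitSet(); // unary upper bits: i-th set bit at (value_i >>> lowBits) + i

  EliasFanoSketch(long[] sorted, long upperBound) {
    int n = sorted.length;
    // Classic choice: lowBits ~ floor(log2(upperBound / n)).
    lowBits = Math.max(0, 63 - Long.numberOfLeadingZeros(Math.max(1L, upperBound / n)));
    lower = new long[n];
    for (int i = 0; i < n; i++) {
      lower[i] = sorted[i] & ((1L << lowBits) - 1);
      upper.set((int) ((sorted[i] >>> lowBits) + i));
    }
  }

  /** Decode the i-th value. The linear scan for the i-th set bit below is
   *  exactly what an index on the upper bits would accelerate. */
  long get(int i) {
    int pos = -1;
    for (int k = 0; k <= i; k++) {
      pos = upper.nextSetBit(pos + 1);
    }
    return ((long) (pos - i) << lowBits) | lower[i];
  }
}
```

With an index recording, say, the position of every 256th set bit, get() and advance() start near the answer instead of scanning from the beginning, which is the gap the benchmark on LUCENE-5101 highlighted for EliasFanoDocIdSet.advance().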
[jira] [Commented] (LUCENE-5098) Broadword bit selection
[ https://issues.apache.org/jira/browse/LUCENE-5098?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=13707384#comment-13707384 ] Adrien Grand commented on LUCENE-5098: -- Committed. Thanks Paul and Dawid! Broadword bit selection --- Key: LUCENE-5098 URL: https://issues.apache.org/jira/browse/LUCENE-5098 Project: Lucene - Core Issue Type: Improvement Components: core/other Reporter: Paul Elschot Assignee: Adrien Grand Priority: Minor Attachments: LUCENE-5098.patch, LUCENE-5098.patch
[jira] [Updated] (LUCENE-5109) EliasFano value index
[ https://issues.apache.org/jira/browse/LUCENE-5109?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Adrien Grand updated LUCENE-5109: - Assignee: Adrien Grand EliasFano value index - Key: LUCENE-5109 URL: https://issues.apache.org/jira/browse/LUCENE-5109 Project: Lucene - Core Issue Type: Improvement Components: core/other Reporter: Paul Elschot Assignee: Adrien Grand Priority: Minor Attachments: LUCENE-5109.patch Index upper bits of Elias-Fano sequence.
[jira] [Commented] (LUCENE-5094) add a ramBytesUsed to OrdinalMap
[ https://issues.apache.org/jira/browse/LUCENE-5094?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=13707410#comment-13707410 ] ASF subversion and git services commented on LUCENE-5094: - Commit 1502697 from [~rcmuir] [ https://svn.apache.org/r1502697 ] LUCENE-5094: add ramBytesUsed to OrdinalMap add a ramBytesUsed to OrdinalMap Key: LUCENE-5094 URL: https://issues.apache.org/jira/browse/LUCENE-5094 Project: Lucene - Core Issue Type: Task Reporter: Robert Muir Attachments: LUCENE-5094.patch I think this would be useful. It could e.g. be exposed via SortedSetDocValuesReaderState and so on.
[jira] [Commented] (LUCENE-5094) add a ramBytesUsed to OrdinalMap
[ https://issues.apache.org/jira/browse/LUCENE-5094?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=13707413#comment-13707413 ] ASF subversion and git services commented on LUCENE-5094: - Commit 1502699 from [~rcmuir] [ https://svn.apache.org/r1502699 ] LUCENE-5094: add ramBytesUsed to OrdinalMap add a ramBytesUsed to OrdinalMap Key: LUCENE-5094 URL: https://issues.apache.org/jira/browse/LUCENE-5094 Project: Lucene - Core Issue Type: Task Reporter: Robert Muir Attachments: LUCENE-5094.patch I think this would be useful. It could e.g. be exposed via SortedSetDocValuesReaderState and so on.
[jira] [Resolved] (LUCENE-5094) add a ramBytesUsed to OrdinalMap
[ https://issues.apache.org/jira/browse/LUCENE-5094?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Robert Muir resolved LUCENE-5094. - Resolution: Fixed Fix Version/s: 4.5 5.0 add a ramBytesUsed to OrdinalMap Key: LUCENE-5094 URL: https://issues.apache.org/jira/browse/LUCENE-5094 Project: Lucene - Core Issue Type: Task Reporter: Robert Muir Fix For: 5.0, 4.5 Attachments: LUCENE-5094.patch I think this would be useful. It could e.g. be exposed via SortedSetDocValuesReaderState and so on.