[ https://issues.apache.org/jira/browse/HBASE-12041?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14143884#comment-14143884 ]
stack commented on HBASE-12041:
-------------------------------
I don't even get as far as you do [~jmspaggi] I get this:
{code}
kalashnikov:hbase.git stack$ ./bin/hbase --config ~/conf_hbase org.apache.hadoop.hbase.HFilePerformanceEvaluation
2014-09-22 14:41:36.889 java[64802:1903] Unable to load realm info from SCDynamicStore
2014-09-22 14:41:37,064 WARN [main] util.NativeCodeLoader (NativeCodeLoader.java:<clinit>(62)) - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2014-09-22 14:41:37,343 INFO [main] hbase.HFilePerformanceEvaluation (HFilePerformanceEvaluation.java:runBenchmark(119)) - Running SequentialWriteBenchmark for 1000000 rows.
2014-09-22 14:41:37,697 INFO [main] hfile.CacheConfig (CacheConfig.java:<init>(260)) - CacheConfig:disabled
Exception in thread "main" java.lang.ArrayIndexOutOfBoundsException: 12346
	at org.apache.hadoop.hbase.KeyValue.getFamilyLength(KeyValue.java:1350)
	at org.apache.hadoop.hbase.KeyValue.getFamilyLength(KeyValue.java:1343)
	at org.apache.hadoop.hbase.KeyValueUtil.keyLength(KeyValueUtil.java:71)
	at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.append(HFileWriterV2.java:253)
	at org.apache.hadoop.hbase.io.hfile.HFileWriterV3.append(HFileWriterV3.java:88)
	at org.apache.hadoop.hbase.io.hfile.HFileWriterV3.append(HFileWriterV3.java:133)
	at org.apache.hadoop.hbase.io.hfile.HFileWriterV3.append(HFileWriterV3.java:106)
	at org.apache.hadoop.hbase.HFilePerformanceEvaluation$SequentialWriteBenchmark.doRow(HFilePerformanceEvaluation.java:203)
	at org.apache.hadoop.hbase.HFilePerformanceEvaluation$RowOrientedBenchmark.run(HFilePerformanceEvaluation.java:169)
	at org.apache.hadoop.hbase.HFilePerformanceEvaluation.runBenchmark(HFilePerformanceEvaluation.java:121)
	at org.apache.hadoop.hbase.HFilePerformanceEvaluation.runBenchmarks(HFilePerformanceEvaluation.java:72)
	at org.apache.hadoop.hbase.HFilePerformanceEvaluation.main(HFilePerformanceEvaluation.java:377)
{code}
Digging, it's kinda ugly: the test passes in the 'key' and the 'value' as raw byte arrays. Down in the depths of HFileWriter we then try to get things like the family length, only there is no family length because there is no family in the original bytes passed -- just random row bytes. Down in HFileWriter we are juggling Cell and KeyValue. The presumption is that we have a full 'key' as we did in the old days when we did the KeyValue serialization. A rough sketch of the mismatch follows.
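To illustrate (this is a hypothetical standalone sketch of the shape of the failure, not the benchmark's exact call path): wrap raw row bytes as if they were a whole serialized KeyValue and then ask for the family length.
{code}
import org.apache.hadoop.hbase.KeyValue;
import org.apache.hadoop.hbase.util.Bytes;

public class RawKeyAsKeyValue {
  public static void main(String[] args) {
    // What the benchmark hands down: formatted row bytes only, NOT a
    // serialized KeyValue (no key/value length prefix, no family section).
    // The row value here is made up for the sketch.
    byte[] rawKey = Bytes.toBytes("0000012346");

    // Wrap the raw bytes as though they were a whole KeyValue buffer.
    // The constructor does no validation; it just stores the array.
    KeyValue kv = new KeyValue(rawKey, 0, rawKey.length);

    // getFamilyLength() decodes a 'row length' out of what it assumes is
    // the serialized layout, then indexes past the row to read the family
    // length byte; with arbitrary row bytes that offset lands outside the
    // array and we get an ArrayIndexOutOfBoundsException like the above.
    System.out.println(kv.getFamilyLength());
  }
}
{code}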
> AssertionError in HFilePerformanceEvaluation.UniformRandomReadBenchmark
> -----------------------------------------------------------------------
>
> Key: HBASE-12041
> URL: https://issues.apache.org/jira/browse/HBASE-12041
> Project: HBase
> Issue Type: Bug
> Affects Versions: 0.99.1
> Reporter: Jean-Marc Spaggiari
> Assignee: stack
>
> {code}
> 2014-09-19 05:18:54,719 INFO [0] hbase.HFilePerformanceEvaluation: Running UniformRandomReadBenchmark for 1000000 rows.
> 2014-09-19 05:18:54,719 INFO [0] hfile.CacheConfig: CacheConfig:disabled
> Exception in thread "0" java.lang.AssertionError: Expected 0000472128 but got 0000472127
> 	at org.apache.hadoop.hbase.PerformanceEvaluationCommons.assertKey(PerformanceEvaluationCommons.java:50)
> 	at org.apache.hadoop.hbase.PerformanceEvaluationCommons.assertKey(PerformanceEvaluationCommons.java:45)
> 	at org.apache.hadoop.hbase.HFilePerformanceEvaluation$UniformRandomReadBenchmark.doRow(HFilePerformanceEvaluation.java:295)
> 	at org.apache.hadoop.hbase.HFilePerformanceEvaluation$RowOrientedBenchmark.run(HFilePerformanceEvaluation.java:169)
> 	at org.apache.hadoop.hbase.HFilePerformanceEvaluation.runBenchmark(HFilePerformanceEvaluation.java:121)
> 	at org.apache.hadoop.hbase.HFilePerformanceEvaluation$2.run(HFilePerformanceEvaluation.java:87)
> 	at java.lang.Thread.run(Thread.java:744)
> {code}