Hi Alex, it seems you're using an intermediate version of Kylin, in which
org.apache.kylin.dict.TableColumnValueEnumerator contains a recursive call.
The latest version has fixed this issue. Please pull the latest code and
make a new build.
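For context, the stack trace below shows moveNext() calling itself once per skipped row, so a long run of unusable rows overflows the stack. The fix is to replace that recursion with a loop. Here is a minimal, simplified sketch of the pattern (class and field names are illustrative, not the actual Kylin source):

```java
import java.util.Iterator;

// Hypothetical sketch: an enumerator over one column of a table reader.
// The buggy version called moveNext() recursively whenever a row yielded
// a null value; this iterative version loops instead, using constant
// stack space no matter how many rows are skipped.
class ColumnValueEnumerator {
    private final Iterator<String[]> reader; // stands in for HiveTableReader
    private final int colIndex;
    private String current;

    ColumnValueEnumerator(Iterator<String[]> reader, int colIndex) {
        this.reader = reader;
        this.colIndex = colIndex;
    }

    // Advance to the next non-null column value; return false at end of input.
    boolean moveNext() {
        while (reader.hasNext()) {          // loop, not recursion
            String[] row = reader.next();
            String value = row[colIndex];
            if (value != null) {            // skip null values iteratively
                current = value;
                return true;
            }
        }
        return false;
    }

    String current() {
        return current;
    }
}
```

Any tail-recursive "skip this row and try the next one" pattern can be rewritten this way, since the JVM does not perform tail-call elimination.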

On 12/16/15, 12:36 PM, "Alex Mathew" <[email protected]> wrote:

>Hi,
>
> I am a newbie with Kylin.
>
> While building the cube, I got a java.lang.StackOverflowError
>exception at the 3rd step (Build Dimension Dictionary).
> I get this error only when building a cube from a large table with an
>inner join.
>
> I have lowered the reducer's maximum memory size and the maximum
>concurrent thread limit, but the error persists.
>
> Could somebody please help me?
>Thank you
>
>
>
> My system configuration:
> KYLIN_JVM_SETTINGS is -Xms1024M -Xmx4096M -XX:MaxPermSize=128M
>
>kylin.job.mapreduce.default.reduce.input.mb=100
>kylin.job.concurrent.max.limit=7
>
>
>Error Log
>java.lang.StackOverflowError
>    at java.lang.ThreadLocal$ThreadLocalMap.getEntry(ThreadLocal.java:419)
>    at java.lang.ThreadLocal$ThreadLocalMap.access$000(ThreadLocal.java:298)
>    at java.lang.ThreadLocal.get(ThreadLocal.java:163)
>    at org.apache.hadoop.io.Text.decode(Text.java:406)
>    at org.apache.hadoop.io.Text.decode(Text.java:389)
>    at org.apache.hadoop.hive.serde2.lazy.LazyUtils.convertToString(LazyUtils.java:126)
>    at org.apache.hadoop.hive.serde2.lazy.LazyInteger.parse(LazyInteger.java:151)
>    at org.apache.hadoop.hive.serde2.lazy.LazyInteger.parseInt(LazyInteger.java:116)
>    at org.apache.hadoop.hive.serde2.lazy.LazyInteger.init(LazyInteger.java:55)
>    at org.apache.hadoop.hive.serde2.lazy.LazyStruct.uncheckedGetField(LazyStruct.java:226)
>    at org.apache.hadoop.hive.serde2.lazy.LazyStruct.getField(LazyStruct.java:202)
>    at org.apache.hadoop.hive.serde2.lazy.objectinspector.LazySimpleStructObjectInspector.getStructFieldData(LazySimpleStructObjectInspector.java:128)
>    at org.apache.hive.hcatalog.data.LazyHCatRecord.get(LazyHCatRecord.java:53)
>    at org.apache.hive.hcatalog.data.LazyHCatRecord.get(LazyHCatRecord.java:97)
>    at org.apache.hive.hcatalog.mapreduce.HCatRecordReader.nextKeyValue(HCatRecordReader.java:204)
>    at org.apache.hive.hcatalog.data.transfer.impl.HCatInputFormatReader$HCatRecordItr.hasNext(HCatInputFormatReader.java:107)
>    at org.apache.kylin.dict.lookup.HiveTableReader.next(HiveTableReader.java:92)
>    at org.apache.kylin.dict.TableColumnValueEnumerator.moveNext(TableColumnValueEnumerator.java:46)
>    at org.apache.kylin.dict.TableColumnValueEnumerator.moveNext(TableColumnValueEnumerator.java:63)
>    at org.apache.kylin.dict.TableColumnValueEnumerator.moveNext(TableColumnValueEnumerator.java:63)
>    at org.apache.kylin.dict.TableColumnValueEnumerator.moveNext(TableColumnValueEnumerator.java:63)
>    at org.apache.kylin.dict.TableColumnValueEnumerator.moveNext(TableColumnValueEnumerator.java:63)
>    ...
