Does anyone know whether there is anything related to HBase itself involved
in the "Convert Cuboid Data to HFile" step?
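
I'm not certain what Kylin 0.7.1 does internally in that step, but MR jobs that
produce HFiles for bulk load usually go through HBase's HFileOutputFormat2, which
reads the target table's region boundaries from HBase at job-setup time, so the
HFile step would talk to HBase even before the bulk-load step. A minimal sketch
under that assumption (the table name and output path are placeholders, not
Kylin's real values, and the mapper is omitted):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.KeyValue;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.RegionLocator;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class HFileStepSketch {
  public static void main(String[] args) throws Exception {
    Configuration conf = HBaseConfiguration.create();
    Job job = Job.getInstance(conf, "convert-cuboid-to-hfile-sketch");
    job.setJarByClass(HFileStepSketch.class);

    // HFileOutputFormat2 expects rowkey/KeyValue pairs from the mapper
    // (the real cuboid-to-HFile mapper is omitted in this sketch).
    job.setMapOutputKeyClass(ImmutableBytesWritable.class);
    job.setMapOutputValueClass(KeyValue.class);

    // "EXAMPLE_CUBE_TABLE" is a placeholder, not Kylin's actual table name.
    TableName table = TableName.valueOf("EXAMPLE_CUBE_TABLE");
    try (Connection conn = ConnectionFactory.createConnection(conf);
         Table htable = conn.getTable(table);
         RegionLocator locator = conn.getRegionLocator(table)) {
      // This is where HBase gets involved: the call fetches the target
      // table's region boundaries and wires up a TotalOrderPartitioner
      // so each reducer writes HFiles that line up with one region.
      HFileOutputFormat2.configureIncrementalLoad(job, htable, locator);
    }

    FileOutputFormat.setOutputPath(job, new Path("/tmp/hfile-sketch-output"));
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}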

2015-05-26 15:58 GMT+08:00 dong wang <[email protected]>:

> Does anyone run into the same problem with CDH 5.4.2 + the
> kylin-0.7.1-staging source code?
>
> 2015-05-26 13:54 GMT+08:00 dong wang <[email protected]>:
>
>> Sorry, I mis-clicked the log information button; I will check the MR log
>> first.
>>
>> 2015-05-26 13:51 GMT+08:00 dong wang <[email protected]>:
>>
>>> Today I updated the environment, and when building the cube the error
>>> looks like the following:
>>> 2015-05-25 22:40:04.388 - State of Hadoop job:
>>> job_1432568508250_0142:ACCEPTED - UNDEFINED
>>> 2015-05-25 22:40:14.405 - State of Hadoop job:
>>> job_1432568508250_0142:RUNNING - UNDEFINED
>>> 2015-05-25 22:40:24.424 - State of Hadoop job:
>>> job_1432568508250_0142:RUNNING - UNDEFINED
>>> 2015-05-25 22:40:34.438 - State of Hadoop job:
>>> job_1432568508250_0142:RUNNING - UNDEFINED
>>> 2015-05-25 22:40:44.451 - State of Hadoop job:
>>> job_1432568508250_0142:RUNNING - UNDEFINED
>>> 2015-05-25 22:40:54.465 - State of Hadoop job:
>>> job_1432568508250_0142:FINISHED - FAILED
>>> no counters for job job_1432568508250_0142
>>>
>>>
>>> and when looking into the MR log, it says:
>>>
>>> Total Vmem allocated for Containers 29.40 GB
>>> Vmem enforcement enabled false
>>> Total Pmem allocated for Container 14 GB
>>> Pmem enforcement enabled true
>>> Total VCores allocated for Containers 8
>>> NodeHealthyStatus true
>>> LastNodeHealthTime Tue May 26 13:20:13 CST 2015
>>> NodeHealthReport
>>> Node Manager Version: 2.6.0-cdh5.4.2 from
>>> 15b703c8725733b7b2813d2325659eb7d57e7a3f by jenkins source checksum
>>> e7a085479aa1989b5cecfabea403549 on 2015-05-20T00:09Z
>>> Hadoop Version: 2.6.0-cdh5.4.2 from
>>> 15b703c8725733b7b2813d2325659eb7d57e7a3f by jenkins source checksum
>>> de74f1adb3744f8ee85d9a5b98f90d on 2015-05-20T00:03Z
>>>
>>
>>
>
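
Not a diagnosis of this particular failure, but for mapping the NodeManager
status lines quoted above back to configuration: the Vmem/Pmem enforcement
flags and the container memory shown there come from standard YARN/MapReduce
properties. A small sketch that just prints the relevant keys (the property
names are standard; the fallback defaults passed in the code are only
illustrative, not the cluster's values):

import org.apache.hadoop.conf.Configuration;

public class MemoryEnforcementSettings {
  public static void main(String[] args) {
    // Assumes yarn-site.xml and mapred-site.xml are on the classpath
    // (e.g. when run via `hadoop jar`, which adds the cluster config dir).
    Configuration conf = new Configuration();
    conf.addResource("yarn-site.xml");
    conf.addResource("mapred-site.xml");

    // NodeManager-side enforcement switches, matching the
    // "Pmem enforcement enabled true" / "Vmem enforcement enabled false"
    // lines in the quoted status.
    System.out.println("yarn.nodemanager.pmem-check-enabled = "
        + conf.getBoolean("yarn.nodemanager.pmem-check-enabled", true));
    System.out.println("yarn.nodemanager.vmem-check-enabled = "
        + conf.getBoolean("yarn.nodemanager.vmem-check-enabled", true));

    // Memory the NodeManager offers to containers
    // ("Total Pmem allocated for Container 14 GB").
    System.out.println("yarn.nodemanager.resource.memory-mb = "
        + conf.getInt("yarn.nodemanager.resource.memory-mb", 8192));

    // Per-container sizes that the cube-build MR tasks request.
    System.out.println("mapreduce.map.memory.mb = "
        + conf.getInt("mapreduce.map.memory.mb", 1024));
    System.out.println("mapreduce.reduce.memory.mb = "
        + conf.getInt("mapreduce.reduce.memory.mb", 1024));
  }
}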
