Re: spark task error occurs when running IT in sandbox

2019-03-25 Thread yuzhang
Hi elkan:
 Thank you for taking the time to reply.
 Just as you said, the cause was the mismatched JDK version. I had only set
root's JAVA_HOME to point to JDK 1.8, but every service in the sandbox runs under its
own user, so I need to re-link the original JAVA_HOME to the new one.


Best regards 
yuzhang


shifengdefan...@163.com
On 3/25/2019 13:32, elkan1788 wrote:
It seems your Java runtime environment is not clean. Please check
the JAVA_HOME and PATH environment variables; use the echo command to see
what they output.

By the way, Kylin can also run on Hadoop clusters that use JDK 1.7, with just
a simple modification. The steps are:

1. Modify the HBase conf file hbase-env.sh, adding export
JAVA_HOME=/path/of/jdk1.8

2. Append the configuration below to the kylin_job_conf.xml and
kylin_job_conf_inmem.xml files.


<property>
<name>mapred.child.env</name>
<value>JAVA_HOME=/usr/lib/java/jdk1.8.0_201</value>
</property>

<property>
<name>yarn.app.mapreduce.am.env</name>
<value>JAVA_HOME=/usr/lib/java/jdk1.8.0_201</value>
</property>

Hope this helps!

--
Sent from: http://apache-kylin.74782.x6.nabble.com/


Re: spark task error occurs when running IT in sandbox

2019-03-24 Thread elkan1788
It seems your Java runtime environment is not clean. Please check
the JAVA_HOME and PATH environment variables; use the echo command to see
what they output.
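For example, such a check might look like this (the exact output will vary by host):

```shell
# Print the variables the current shell resolves.
echo "JAVA_HOME=$JAVA_HOME"
echo "PATH=$PATH"

# Show which java binary is found first on PATH and its version;
# a stale JDK 1.7 here would explain the unclean environment.
command -v java
java -version 2>&1 | head -n 1
```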

By the way, Kylin can also run on Hadoop clusters that use JDK 1.7, with just
a simple modification. The steps are:

1. Modify the HBase conf file hbase-env.sh, adding export
JAVA_HOME=/path/of/jdk1.8

2. Append the configuration below to the kylin_job_conf.xml and
kylin_job_conf_inmem.xml files.


<property>
<name>mapred.child.env</name>
<value>JAVA_HOME=/usr/lib/java/jdk1.8.0_201</value>
</property>

<property>
<name>yarn.app.mapreduce.am.env</name>
<value>JAVA_HOME=/usr/lib/java/jdk1.8.0_201</value>
</property>

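The hbase-env.sh edit in step 1 can be sketched as a one-off append (the conf path is an assumption; it varies by distribution):

```shell
# Assumed location of the HBase conf dir: often /etc/hbase/conf on
# packaged installs, or $HBASE_HOME/conf on tarball installs.
HBASE_ENV="${HBASE_HOME:-/usr/lib/hbase}/conf/hbase-env.sh"

# Append the JDK 1.8 override so HBase daemons start with Java 8.
echo 'export JAVA_HOME=/usr/lib/java/jdk1.8.0_201' >> "$HBASE_ENV"

# Show the line just added, to confirm the edit landed.
tail -n 1 "$HBASE_ENV"
```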
Hope this helps!
