This looks like an environment issue to me, and since others don't have
access to your environment, in the end you may be the only one who can solve
it.
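As a first diagnostic (not from the original mails; the hostname and port are taken from the error message below), it may help to check which address actually accepts TCP connections. The trace shows the job client dialing localhost:18032 while yarn-site.xml points the ResourceManager at 10.9.0.86:18032, so comparing the two answers indicates whether the client-side configuration or the RM itself is at fault. A minimal sketch:

```python
import socket

def rm_reachable(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # covers ConnectionRefusedError, timeouts, DNS failures
        return False

# localhost:18032 is what the failing client actually dials;
# 10.9.0.86:18032 is what yarn-site.xml configures for the RM.
for host in ("localhost", "10.9.0.86"):
    print(host, rm_reachable(host, 18032))
```

If localhost is refused but 10.9.0.86 accepts, the job client is reading a default (or stale) yarn.resourcemanager.address rather than the cluster's yarn-site.xml.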

On Wed, Aug 8, 2018 at 9:49 AM Ashish Singhi <[email protected]>
wrote:

> Can you disable kylin.storage.hbase.compression-codec in kylin.properties
> and try again?
>
> If this doesn't help you, please check and share the error logs from kylin
> log file.
>
> Regards,
> Ashish
>
> On Wed, Aug 8, 2018 at 1:48 PM, Zhixiong Chen <[email protected]>
> wrote:
>
> > Hi:
> >   Thank you very much for your reply. I configured everything as you
> > suggested, including the environment variables, as follows:
> >
> > <property>
> >  <name>yarn.app.mapreduce.am.env</name>
> >  <value>HADOOP_MAPRED_HOME=/bigdata/tools/hadoop-2.7.3</value>
> > </property>
> >
> > <property>
> >  <name>mapreduce.map.env</name>
> >  <value>HADOOP_MAPRED_HOME=/bigdata/tools/hadoop-2.7.3</value>
> > </property>
> >
> > <property>
> >  <name>mapreduce.reduce.env</name>
> >  <value>HADOOP_MAPRED_HOME=/bigdata/tools/hadoop-2.7.3</value>
> > </property>
> >  <property>
> >   <name>yarn.app.mapreduce.am.admin.user.env</name>
> >   <value>LD_LIBRARY_PATH=$HADOOP_COMMON_HOME/lib/native:
> > $JAVA_LIBRARY_PATH</value>
> >  </property>
> >
> > I also added the yarn.resourcemanager.address setting to the
> > kylin_job_conf.xml file. Previously the cube build could only get to step
> > #3 before failing; now it reaches step #10, but the same problem remains.
> > The error that used to occur at "#3 Step Name: Extract Fact Table Distinct
> > Columns" now occurs at "#10 Step Name: Build Cube In-Mem, Duration: 20.19
> > mins, Waiting: 0 seconds". Please assist us; we would be very grateful!
> > The error message for step #10 is as follows:
> >
> >
> > java.net.ConnectException: Call From hsmaster/10.9.0.86 to localhost:18032 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
> >         at sun.reflect.GeneratedConstructorAccessor76.newInstance(Unknown Source)
> >         at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> >         at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> >         at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:792)
> >         at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:732)
> >         at org.apache.hadoop.ipc.Client.call(Client.java:1479)
> >         at org.apache.hadoop.ipc.Client.call(Client.java:1412)
> >         at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
> >         at com.sun.proxy.$Proxy66.getNewApplication(Unknown Source)
> >         at org.apache.hadoop.yarn.api.impl.pb.client.ApplicationClientProtocolPBClientImpl.getNewApplication(ApplicationClientProtocolPBClientImpl.java:221)
> >         at sun.reflect.GeneratedMethodAccessor108.invoke(Unknown Source)
> >         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >         at java.lang.reflect.Method.invoke(Method.java:498)
> >         at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
> >         at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
> >         at com.sun.proxy.$Proxy67.getNewApplication(Unknown Source)
> >         at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.getNewApplication(YarnClientImpl.java:219)
> >         at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.createApplication(YarnClientImpl.java:227)
> >         at org.apache.hadoop.mapred.ResourceMgrDelegate.getNewJobID(ResourceMgrDelegate.java:187)
> >         at org.apache.hadoop.mapred.YARNRunner.getNewJobID(YARNRunner.java:231)
> >         at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:153)
> >         at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1290)
> >         at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1287)
> >         at java.security.AccessController.doPrivileged(Native Method)
> >         at javax.security.auth.Subject.doAs(Subject.java:422)
> >         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
> >         at org.apache.hadoop.mapreduce.Job.submit(Job.java:1287)
> >         at org.apache.kylin.engine.mr.common.AbstractHadoopJob.waitForCompletion(AbstractHadoopJob.java:175)
> >         at org.apache.kylin.engine.mr.steps.InMemCuboidJob.run(InMemCuboidJob.java:121)
> >         at org.apache.kylin.engine.mr.common.MapReduceExecutable.doWork(MapReduceExecutable.java:130)
> >         at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:162)
> >         at org.apache.kylin.job.execution.DefaultChainedExecutable.doWork(DefaultChainedExecutable.java:67)
> >         at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:162)
> >         at org.apache.kylin.job.impl.threadpool.DefaultScheduler$JobRunner.run(DefaultScheduler.java:300)
> >         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> >         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> >         at java.lang.Thread.run(Thread.java:748)
> > Caused by: java.net.ConnectException: Connection refused
> >         at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
> >         at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
> >         at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
> >         at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
> >         at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
> >         at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:614)
> >         at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:712)
> >         at org.apache.hadoop.ipc.Client$Connection.access$2900(Client.java:375)
> >         at org.apache.hadoop.ipc.Client.getConnection(Client.java:1528)
> >         at org.apache.hadoop.ipc.Client.call(Client.java:1451)
> >         ... 31 more
> > result code:2
> >
> >     ________________________________
> >     Jason Lu
> >   Email:[email protected]
> >
> >
> > On 08/08/2018, 12:22 PM, "Ashish Singhi" <[email protected]> wrote:
> >
> >     Hi,
> >
> >     Can you check the YARN application logs to see whether the AM was
> >     started?
> >
> >     I think you need to set yarn.app.mapreduce.am.env, mapreduce.map.env,
> >     mapreduce.reduce.env and yarn.app.mapreduce.am.admin.user.env in your
> >     mapred-site.xml file.
> >
> >     I too faced this issue a couple of days back; after setting these
> >     configurations the issue was resolved for me.
> >
> >     Regards,
> >     Ashish
> >
> >     On Wed, Aug 8, 2018 at 7:54 AM, Zhixiong Chen <[email protected]>
> >     wrote:
> >
> >     >
> >     > Hi, chen<mailto:[email protected]>:
> >     >
> >     > When we build a cube with Kylin, at step #3 Kylin returns a
> >     > "localhost:18032 failed" error. We have spent a long time on this
> >     > without being able to solve it, and we do not know whether it is a
> >     > version bug or something else. Please help us,
> >     > chen<mailto:[email protected]>; we would be very grateful! The
> >     > versions in use and the error message are below. Thank you!
> >     > The Hadoop cluster (Apache 2.7.3) starts up completely normally, but
> >     > Kylin (version 2.3) reports the following error at "#3 Step Name:
> >     > Extract Fact Table Distinct Columns" while building the cube:
> >     >
> >     >
> >     > java.net.ConnectException: Call From hsmaster/10.9.0.86 to localhost:18032 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
> >     >         at sun.reflect.GeneratedConstructorAccessor60.newInstance(Unknown Source)
> >     >         at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> >     >         at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> >     >         at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:792)
> >     >         at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:732)
> >     >         at org.apache.hadoop.ipc.Client.call(Client.java:1479)
> >     >         at org.apache.hadoop.ipc.Client.call(Client.java:1412)
> >     >         at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
> >     >         at com.sun.proxy.$Proxy65.getNewApplication(Unknown Source)
> >     >         at org.apache.hadoop.yarn.api.impl.pb.client.ApplicationClientProtocolPBClientImpl.getNewApplication(ApplicationClientProtocolPBClientImpl.java:221)
> >     >         at sun.reflect.GeneratedMethodAccessor87.invoke(Unknown Source)
> >     >         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >     >         at java.lang.reflect.Method.invoke(Method.java:498)
> >     >         at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
> >     >         at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
> >     >         at com.sun.proxy.$Proxy66.getNewApplication(Unknown Source)
> >     >         at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.getNewApplication(YarnClientImpl.java:219)
> >     >         at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.createApplication(YarnClientImpl.java:227)
> >     >         at org.apache.hadoop.mapred.ResourceMgrDelegate.getNewJobID(ResourceMgrDelegate.java:187)
> >     >         at org.apache.hadoop.mapred.YARNRunner.getNewJobID(YARNRunner.java:231)
> >     >         at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:153)
> >     >         at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1290)
> >     >         at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1287)
> >     >         at java.security.AccessController.doPrivileged(Native Method)
> >     >         at javax.security.auth.Subject.doAs(Subject.java:422)
> >     >         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
> >     >         at org.apache.hadoop.mapreduce.Job.submit(Job.java:1287)
> >     >         at org.apache.kylin.engine.mr.common.AbstractHadoopJob.waitForCompletion(AbstractHadoopJob.java:175)
> >     >         at org.apache.kylin.engine.mr.steps.FactDistinctColumnsJob.run(FactDistinctColumnsJob.java:110)
> >     >         at org.apache.kylin.engine.mr.common.MapReduceExecutable.doWork(MapReduceExecutable.java:130)
> >     >         at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:162)
> >     >         at org.apache.kylin.job.execution.DefaultChainedExecutable.doWork(DefaultChainedExecutable.java:67)
> >     >         at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:162)
> >     >         at org.apache.kylin.job.impl.threadpool.DefaultScheduler$JobRunner.run(DefaultScheduler.java:300)
> >     >         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> >     >         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> >     >         at java.lang.Thread.run(Thread.java:748)
> >     > Caused by: java.net.ConnectException: Connection refused
> >     >         at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
> >     >         at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
> >     >         at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
> >     >         at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
> >     >         at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
> >     >         at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:614)
> >     >         at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:712)
> >     >         at org.apache.hadoop.ipc.Client$Connection.access$2900(Client.java:375)
> >     >         at org.apache.hadoop.ipc.Client.getConnection(Client.java:1528)
> >     >         at org.apache.hadoop.ipc.Client.call(Client.java:1451)
> >     >         ... 31 more
> >     >
> >     > result code:2
> >     >
> >     > The relevant IPs and ports in yarn-site.xml are configured as shown
> >     > below. If I change the IP for ports 18030, 18031, 18032 and 18033 to
> >     > localhost, this error goes away, but then the Hadoop cluster cannot
> >     > come up properly: at http://10.9.0.86:8088/cluster/nodes only the
> >     > master node appears, and the other cluster nodes fail to register.
> >     >
> >     >
> >     > <property>
> >     > <name>yarn.resourcemanager.hostname</name>
> >     > <value>10.9.0.86</value>
> >     > </property>
> >     > <property>
> >     > <name>yarn.resourcemanager.scheduler.address</name>
> >     > <value>10.9.0.86:18030</value>
> >     > </property>
> >     > <property>
> >     > <name>yarn.resourcemanager.resource-tracker.address</name>
> >     > <value>10.9.0.86:18031</value>
> >     > </property>
> >     > <property>
> >     > <name>yarn.resourcemanager.address</name>
> >     > <value>10.9.0.86:18032</value>
> >     > </property>
> >     > <property>
> >     > <name>yarn.resourcemanager.admin.address</name>
> >     > <value>10.9.0.86:18033</value>
> >     > </property>
> >     > <property>
> >     > <name>yarn.resourcemanager.webapp.address</name>
> >     > <value>10.9.0.86:8088</value>
> >     > </property>
> >     >
> >     > <property>
> >     > <name>yarn.resourcemanager.webapp.https.address</name>
> >     > <value>10.9.0.86:8090</value>
> >     > </property>
> >     >
> >     >
> >     >
> >     >
> >     >
> >     >
> >     > ________________________________
> >     > Jason Lu
> >     > Email:[email protected]
> >     >
> >
> >
> >
>
