Yes, our company runs the open-source Apache Hadoop 2.7.3 release, all started and configured by hand. MapReduce, Spark, Hive, and HBase jobs all run fine in this environment, and it has been running for a long time with many jobs on it, so switching to CDH or HDP is not realistic from the company's point of view. But now, integrating Kylin produces this error. Could it be a version issue, or does Kylin not support open-source Apache Hadoop? We would really appreciate your help with this — many thanks!
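The stack trace below reports `Call From hsmaster/10.9.0.86 to localhost:18032 failed ... Connection refused`, i.e. the job client submitting from the Kylin node resolves the YARN ResourceManager to `localhost:18032` rather than the real RM host. A quick TCP probe from the Kylin node can confirm which address is actually reachable. This is a minimal sketch added for illustration; the host `hsmaster` and port `18032` are taken from the error message in this thread, not from a verified configuration:

```python
import socket


def can_connect(host, port, timeout=2.0):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Covers "connection refused", DNS failures, and timeouts.
        return False


if __name__ == "__main__":
    # Addresses from the error message; adjust to your cluster, e.g.
    # ("hsmaster", 18032) to test the real ResourceManager host.
    for host, port in [("localhost", 18032)]:
        print(f"{host}:{port} reachable: {can_connect(host, port)}")
```

If `localhost:18032` is unreachable but the real RM host:port is, the problem is a client-side address configuration rather than the RM itself.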
2018-08-10 10:17 GMT+08:00 ShaoFeng Shi <[email protected]>:
> Hi Terry,
>
> I saw your email several days ago, but I have no idea about the issue. It
> should be some environment problem, I believe.
>
> Are you setting up the Hadoop cluster manually? Usually, if you're not a
> Hadoop expert, we recommend starting with a commercial Hadoop release
> like HDP, CDH, or AWS EMR; that could save you a lot of time and effort.
>
> 2018-08-10 10:07 GMT+08:00 Terry Lu <[email protected]>:
> > Hi:
> >
> > We are using Kylin 2.3 on Apache Hadoop 2.7.3, and the whole cluster
> > starts up normally. When building a Cube, step #3 "Extract Fact Table
> > Distinct Columns" fails with a "localhost:18032 failed" error. We then
> > added the yarn.resourcemanager.address setting to kylin_job_conf.xml;
> > where the build previously failed at step #3, it now reaches step #10,
> > but the same error still occurs there, so the problem is not solved.
> > Attached are the Kylin logs and the Kylin, YARN, and MapReduce
> > configuration files. We would really appreciate your help — many thanks!
> >
> > The error at #10 Step Name: Build Cube In-Mem (Duration: 20.19 mins,
> > Waiting: 0 seconds) is as follows:
> >
> > java.net.ConnectException: Call From hsmaster/10.9.0.86 to localhost:18032 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
> >     at sun.reflect.GeneratedConstructorAccessor75.newInstance(Unknown Source)
> >     at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> >     at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> >     at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:792)
> >     at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:732)
> >     at org.apache.hadoop.ipc.Client.call(Client.java:1479)
> >     at org.apache.hadoop.ipc.Client.call(Client.java:1412)
> >     at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
> >     at com.sun.proxy.$Proxy66.getNewApplication(Unknown Source)
> >     at org.apache.hadoop.yarn.api.impl.pb.client.ApplicationClientProtocolPBClientImpl.getNewApplication(ApplicationClientProtocolPBClientImpl.java:221)
> >     at sun.reflect.GeneratedMethodAccessor122.invoke(Unknown Source)
> >     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >     at java.lang.reflect.Method.invoke(Method.java:498)
> >     at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
> >     at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
> >     at com.sun.proxy.$Proxy67.getNewApplication(Unknown Source)
> >     at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.getNewApplication(YarnClientImpl.java:219)
> >     at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.createApplication(YarnClientImpl.java:227)
> >     at org.apache.hadoop.mapred.ResourceMgrDelegate.getNewJobID(ResourceMgrDelegate.java:187)
> >     at org.apache.hadoop.mapred.YARNRunner.getNewJobID(YARNRunner.java:231)
> >     at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:153)
> >     at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1290)
> >     at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1287)
> >     at java.security.AccessController.doPrivileged(Native Method)
> >     at javax.security.auth.Subject.doAs(Subject.java:422)
> >     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
> >     at org.apache.hadoop.mapreduce.Job.submit(Job.java:1287)
> >     at org.apache.kylin.engine.mr.common.AbstractHadoopJob.waitForCompletion(AbstractHadoopJob.java:175)
> >     at org.apache.kylin.engine.mr.steps.InMemCuboidJob.run(InMemCuboidJob.java:121)
> >     at org.apache.kylin.engine.mr.common.MapReduceExecutable.doWork(MapReduceExecutable.java:130)
> >     at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:162)
> >     at org.apache.kylin.job.execution.DefaultChainedExecutable.doWork(DefaultChainedExecutable.java:67)
> >     at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:162)
> >     at org.apache.kylin.job.impl.threadpool.DefaultScheduler$JobRunner.run(DefaultScheduler.java:300)
> >     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> >     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> >     at java.lang.Thread.run(Thread.java:748)
> > Caused by: java.net.ConnectException: Connection refused
> >     at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
> >     at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
> >     at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
> >     at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
> >     at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
> >     at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:614)
> >     at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:712)
> >     at org.apache.hadoop.ipc.Client$Connection.access$2900(Client.java:375)
> >     at org.apache.hadoop.ipc.Client.getConnection(Client.java:1528)
> >     at org.apache.hadoop.ipc.Client.call(Client.java:1451)
> >     ... 31 more
> > result code:2
>
> --
> Best regards,
>
> Shaofeng Shi 史少锋
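Since the trace shows jobs submitted by Kylin falling back to `localhost:18032`, one plausible fix is to set the ResourceManager address explicitly in the `yarn-site.xml` on the Kylin node's Hadoop classpath, not only in `kylin_job_conf.xml` (which the thread shows is not picked up by every build step). This is a sketch: `hsmaster` and port `18032` are assumptions taken from the error message, not verified cluster values.

```xml
<!-- yarn-site.xml fragment on the node running Kylin.
     "hsmaster" and 18032 are taken from the error message; substitute
     your actual ResourceManager host and RPC port. -->
<property>
  <name>yarn.resourcemanager.hostname</name>
  <value>hsmaster</value>
</property>
<property>
  <name>yarn.resourcemanager.address</name>
  <value>hsmaster:18032</value>
</property>
```

After changing client-side Hadoop configuration, Kylin normally needs a restart so its job engine rereads the files.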
