Hi Liam,
It seems there is something wrong with your Hadoop configuration.
Please double-check $KYLIN_HOME/conf/hbase-site.xml to see if any
property is mis-configured.
Here is a post that may help:
http://hortonworks.com/community/forums/topic/set-up-of-mapreduce-jobhistory-server-fails/
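In particular, the root cause "Could not find any configured addresses for URI hdfs://bicluster/..." usually means the HBase master cannot resolve the HDFS HA nameservice "bicluster" because the HA properties are missing from the hdfs-site.xml/hbase-site.xml it reads. A minimal sketch of what those properties look like, assuming a nameservice named bicluster with two NameNodes (the nn1/nn2 ids and namenode*.example.com hostnames below are placeholders, not values from your cluster):

```xml
<!-- Sketch only: NameNode ids, hostnames, and ports are placeholders;
     copy the real values from your cluster's hdfs-site.xml. -->
<property>
  <name>dfs.nameservices</name>
  <value>bicluster</value>
</property>
<property>
  <name>dfs.ha.namenodes.bicluster</name>
  <value>nn1,nn2</value>
</property>
<property>
  <name>dfs.namenode.rpc-address.bicluster.nn1</name>
  <value>namenode1.example.com:8020</value>
</property>
<property>
  <name>dfs.namenode.rpc-address.bicluster.nn2</name>
  <value>namenode2.example.com:8020</value>
</property>
<property>
  <name>dfs.client.failover.proxy.provider.bicluster</name>
  <value>org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider</value>
</property>
```

You can sanity-check the resolution from the same host with something like "hdfs dfs -ls hdfs://bicluster/tmp/" before retrying the cube build.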
liam <[email protected]> wrote on Saturday, August 1, 2015 at 6:12 PM:
> Hi all,
>
> Sorry to bother you.
> As the log message below shows,
> should I run Kylin on the NameNode?
> Or should I add some configuration somewhere for the URI
> "hdfs://bicluster/tmp/kylin_data/coprocessor/kylin-coprocessor-0.7.2-incubating-3.jar"?
> Thanks.
>
> The cube build failed at step 13.
>
>
>
> [------------ERROR Message-------------]
>
> [pool-7-thread-1]:[2015-08-01
> 17:34:28,868][INFO][org.apache.kylin.job.tools.DeployCoprocessorCLI.addCoprocessorOnHTable(DeployCoprocessorCLI.java:119)]
> - Add coprocessor on KYLIN_N1HVO0SD5I
>
> [pool-7-thread-1]:[2015-08-01
> 17:34:28,869][INFO][org.apache.kylin.job.tools.DeployCoprocessorCLI.deployCoprocessor(DeployCoprocessorCLI.java:99)]
> - hbase table [B@38f07651 deployed with coprocessor.)
>
> at
> org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:93)
>
> at
> org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:3389)
>
> at
> org.apache.hadoop.hbase.client.HBaseAdmin.createTableAsync(HBaseAdmin.java:631)
>
> at
> org.apache.hadoop.hbase.client.HBaseAdmin.createTable(HBaseAdmin.java:522)
>
> at
> org.apache.kylin.job.hadoop.hbase.CreateHTableJob.run(CreateHTableJob.java:123)
>
> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
>
> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
>
> at
> org.apache.kylin.job.common.HadoopShellExecutable.doWork(HadoopShellExecutable.java:63)
>
> at
> org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:106)
>
> at
> org.apache.kylin.job.execution.DefaultChainedExecutable.doWork(DefaultChainedExecutable.java:50)
>
> at
> org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:106)
>
> at
> org.apache.kylin.job.impl.threadpool.DefaultScheduler$JobRunner.run(DefaultScheduler.java:133)
>
> at
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>
> at
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>
> at java.lang.Thread.run(Thread.java:745)
>
> Caused by:
> org.apache.hadoop.hbase.ipc.RemoteWithExtrasException(org.apache.hadoop.hbase.DoNotRetryIOException):
> org.apache.hadoop.hbase.DoNotRetryIOException: java.io.IOException:
> Couldn't create proxy provider class
> org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider
>
> at
> org.apache.hadoop.hbase.master.HMaster.sanityCheckTableDescriptor(HMaster.java:1329)
>
> at org.apache.hadoop.hbase.master.HMaster.createTable(HMaster.java:1269)
>
> at
> org.apache.hadoop.hbase.master.MasterRpcServices.createTable(MasterRpcServices.java:398)
>
> at
> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:42436)
>
> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2031)
>
> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:107)
>
> at
> org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:130)
>
> at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:107)
>
> at java.lang.Thread.run(Thread.java:745)
>
> Caused by: java.io.IOException: Couldn't create proxy provider class
> org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider
>
> at
> org.apache.hadoop.hdfs.NameNodeProxies.createFailoverProxyProvider(NameNodeProxies.java:478)
>
> at
> org.apache.hadoop.hdfs.NameNodeProxies.createProxy(NameNodeProxies.java:148)
>
> at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:602)
>
> at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:547)
>
> at
> org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:139)
>
> at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2591)
>
> at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:89)
>
> at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2625)
>
> at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2607)
>
> at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:368)
>
> at org.apache.hadoop.fs.Path.getFileSystem(Path.java:296)
>
> at
> org.apache.hadoop.hbase.util.CoprocessorClassLoader.init(CoprocessorClassLoader.java:165)
>
> at
> org.apache.hadoop.hbase.util.CoprocessorClassLoader.getClassLoader(CoprocessorClassLoader.java:250)
>
> at
> org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost.testTableCoprocessorAttrs(RegionCoprocessorHost.java:316)
>
> at
> org.apache.hadoop.hbase.master.HMaster.checkClassLoading(HMaster.java:1483)
>
> at
> org.apache.hadoop.hbase.master.HMaster.sanityCheckTableDescriptor(HMaster.java:1327)
>
> ... 8 more
>
> Caused by: java.lang.reflect.InvocationTargetException
>
> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>
> at
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
>
> at
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>
> at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
>
> at
> org.apache.hadoop.hdfs.NameNodeProxies.createFailoverProxyProvider(NameNodeProxies.java:461)
>
> ... 23 more
>
> Caused by: java.lang.RuntimeException: Could not find any configured
> addresses for URI
> hdfs://bicluster/tmp/kylin_data/coprocessor/kylin-coprocessor-0.7.2-incubating-3.jar
>
> at
> org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider.<init>(ConfiguredFailoverProxyProvider.java:93)
>
> ... 28 more
>
>
> at org.apache.hadoop.hbase.ipc.RpcClient.call(RpcClient.java:1457)
>
> at
> org.apache.hadoop.hbase.ipc.RpcClient.callBlockingMethod(RpcClient.java:1661)
>
> at
> org.apache.hadoop.hbase.ipc.RpcClient$BlockingRpcChannelImplementation.callBlockingMethod(RpcClient.java:1719)
>
> at
> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$BlockingStub.createTable(MasterProtos.java:44788)
>
> at
> org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation$5.createTable(HConnectionManager.java:1949)
>
> at org.apache.hadoop.hbase.client.HBaseAdmin$2.call(HBaseAdmin.java:635)
>
> at org.apache.hadoop.hbase.client.HBaseAdmin$2.call(HBaseAdmin.java:631)
>
> at
> org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:117)
>
> ... 15 more
>
> [pool-7-thread-1]:[2015-08-01
> 17:34:28,906][ERROR][org.apache.kylin.job.hadoop.hbase.CreateHTableJob.run(CreateHTableJob.java:130)]
> - org.apache.hadoop.hbase.DoNotRetryIOException: java.io.IOException:
> Couldn't create proxy provider class
> org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider
>
> at
> org.apache.hadoop.hbase.master.HMaster.sanityCheckTableDescriptor(HMaster.java:1329)
>
> at org.apache.hadoop.hbase.master.HMaster.createTable(HMaster.java:1269)
>
> at
> org.apache.hadoop.hbase.master.MasterRpcServices.createTable(MasterRpcServices.java:398)
>
> at
> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:42436)
>
> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2031)
>
> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:107)
>
> at
> org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:130)
>
> at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:107)
>
> at java.lang.Thread.run(Thread.java:745)
>
> Caused by: java.io.IOException: Couldn't create proxy provider class
> org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider
>
> at
> org.apache.hadoop.hdfs.NameNodeProxies.createFailoverProxyProvider(NameNodeProxies.java:478)
>
> at
> org.apache.hadoop.hdfs.NameNodeProxies.createProxy(NameNodeProxies.java:148)
>
> at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:602)
>
> at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:547)
>
> at
> org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:139)
>
> at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2591)
>
> at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:89)
>
> at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2625)
>
> at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2607)
>
> at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:368)
>
> at org.apache.hadoop.fs.Path.getFileSystem(Path.java:296)
>
> at
> org.apache.hadoop.hbase.util.CoprocessorClassLoader.init(CoprocessorClassLoader.java:165)
>
> at
> org.apache.hadoop.hbase.util.CoprocessorClassLoader.getClassLoader(CoprocessorClassLoader.java:250)
>
> at
> org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost.testTableCoprocessorAttrs(RegionCoprocessorHost.java:316)
>
> at
> org.apache.hadoop.hbase.master.HMaster.checkClassLoading(HMaster.java:1483)
>
> at
> org.apache.hadoop.hbase.master.HMaster.sanityCheckTableDescriptor(HMaster.java:1327)
>
> ... 8 more
>
> Caused by: java.lang.reflect.InvocationTargetException
>
> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>
> at
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
>
> at
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>
> at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
>
> at
> org.apache.hadoop.hdfs.NameNodeProxies.createFailoverProxyProvider(NameNodeProxies.java:461)
>
> ... 23 more
>
> Caused by: java.lang.RuntimeException: Could not find any configured
> addresses for URI
> hdfs://bicluster/tmp/kylin_data/coprocessor/kylin-coprocessor-0.7.2-incubating-3.jar
>
> at
> org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider.<init>(ConfiguredFailoverProxyProvider.java:93)
>
> ... 28 more
>
> --
Best regards,
ZhouQianhao