The root cause is clear, as the log says: ClassNotFoundException: com.codahale.metrics.Gauge

The codahale metrics-core library should be on the Hive classpath: Hive's MetricsFactory tries to load com.codahale.metrics.Gauge when the metastore client starts (see MetricsFactory.init in the trace), and it fails because that jar is missing.
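
If you want to confirm which classpath is missing the jar, here is a minimal
sketch (the class name comes from your stack trace; the file name
ClasspathCheck and everything else in it are my own illustration, not Kylin or
Hive code). Compile it with javac and run it with the same classpath the
failing process uses:

    // ClasspathCheck.java - illustrative only. Prints the jar that provides
    // com.codahale.metrics.Gauge, or fails with the same ClassNotFoundException
    // seen in the job output if the jar is not on the classpath.
    public class ClasspathCheck {
        public static void main(String[] args) throws Exception {
            Class<?> gauge = Class.forName("com.codahale.metrics.Gauge");
            java.security.CodeSource src = gauge.getProtectionDomain().getCodeSource();
            System.out.println(gauge.getName() + " loaded from "
                    + (src != null ? src.getLocation() : "<bootstrap classpath>"));
        }
    }

Once the metrics-core jar is visible to Hive (for example under the Hive lib
directory), the "Extract Fact Table Distinct Columns" step should get past
this error.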

2017-02-16 20:15 GMT+08:00 zhangtianyu...@chinasofti.com <
zhangtianyu...@chinasofti.com>:

> hi:
>     I don't know why this is happening. Why?
>
>
> log
>
> Job Name: kylin_sales_cube - 20120101000000_20170202063000 - BUILD - GMT+08:00 2017-02-16 15:32:39
>   Cube: kylin_sales_cube | Progress: ERROR | Last Modified Time: 2017-02-16 15:34:05 GMT+8 | Duration: 0.90 mins
>
> Job Name: kylin_sales_cube - 20120101000000_20170201021000 - BUILD - GMT+08:00 2017-02-16 14:08:43
>   Cube: kylin_sales_cube | Progress: 10.00% | Last Modified Time: 2017-02-16 15:32:31 GMT+8 | Duration: 1.07 mins
>
> Total: 2
>
>     Detail Information
>     Job Name  kylin_sales_cube - 20120101000000_20170202063000 - BUILD - GMT+08:00 2017-02-16 15:32:39
>     Job ID  d5b4a9be-ab48-435d-9555-634477c96e91
>     Status  ERROR
>     Duration  0.90 mins
>     MapReduce Waiting  0.00 mins
>     Start   2017-02-16 15:33:11 GMT+8
>     2017-02-16 15:33:11 GMT+8
>     #1 Step Name: Create Intermediate Flat Hive Table
>     Duration: 0.43 mins
>     2017-02-16 15:33:37 GMT+8
>     #2 Step Name: Redistribute Flat Hive Table
>     Duration: 0.47 mins
>     2017-02-16 15:34:05 GMT+8
>     #3 Step Name: Extract Fact Table Distinct Columns
>     Duration: 0.00 mins
>     #4 Step Name: Build Dimension Dictionary
>     Duration: 0 seconds
>     #5 Step Name: Save Cuboid Statistics
>     Duration: 0 seconds
>     #6 Step Name: Create HTable
>     Duration: 0 seconds
>     #7 Step Name: Build Base Cuboid Data
>     Duration: 0 seconds
>     #8 Step Name: Build N-Dimension Cuboid Data : 8-Dimension
>     Duration: 0 seconds
>     #9 Step Name: Build N-Dimension Cuboid Data : 7-Dimension
>     Duration: 0 seconds
>     #10 Step Name: Build N-Dimension Cuboid Data : 6-Dimension
>     Duration: 0 seconds
>     #11 Step Name: Build N-Dimension Cuboid Data : 5-Dimension
>     Duration: 0 seconds
>     #12 Step Name: Build N-Dimension Cuboid Data : 4-Dimension
>     Duration: 0 seconds
>     #13 Step Name: Build N-Dimension Cuboid Data : 3-Dimension
>     Duration: 0 seconds
>     #14 Step Name: Build N-Dimension Cuboid Data : 2-Dimension
>     Duration: 0 seconds
>     #15 Step Name: Build N-Dimension Cuboid Data : 1-Dimension
>     Duration: 0 seconds
>     #16 Step Name: Build Cube
>     Duration: 0 seconds
>     #17 Step Name: Convert Cuboid Data to HFile
>     Duration: 0 seconds
>     #18 Step Name: Load HFile to HBase Table
>     Duration: 0 seconds
>     #19 Step Name: Update Cube Info
>     Duration: 0 seconds
>     #20 Step Name: Hive Cleanup
>     Duration: 0 seconds
>     End
>
> Output
>
> java.lang.RuntimeException: java.io.IOException: com.google.common.util.concurrent.UncheckedExecutionException: java.lang.RuntimeException: Unable to instantiate org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient
>     at org.apache.kylin.source.hive.HiveMRInput$HiveTableInputFormat.configureJob(HiveMRInput.java:94)
>     at org.apache.kylin.engine.mr.steps.FactDistinctColumnsJob.setupMapper(FactDistinctColumnsJob.java:119)
>     at org.apache.kylin.engine.mr.steps.FactDistinctColumnsJob.run(FactDistinctColumnsJob.java:103)
>     at org.apache.kylin.engine.mr.MRUtil.runMRJob(MRUtil.java:88)
>     at org.apache.kylin.engine.mr.common.MapReduceExecutable.doWork(MapReduceExecutable.java:120)
>     at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:113)
>     at org.apache.kylin.job.execution.DefaultChainedExecutable.doWork(DefaultChainedExecutable.java:57)
>     at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:113)
>     at org.apache.kylin.job.impl.threadpool.DefaultScheduler$JobRunner.run(DefaultScheduler.java:136)
>     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>     at java.lang.Thread.run(Thread.java:745)
> Caused by: java.io.IOException: com.google.common.util.concurrent.UncheckedExecutionException: java.lang.RuntimeException: Unable to instantiate org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient
>     at org.apache.hive.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:97)
>     at org.apache.hive.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:51)
>     at org.apache.kylin.source.hive.HiveMRInput$HiveTableInputFormat.configureJob(HiveMRInput.java:89)
>     ... 11 more
> Caused by: com.google.common.util.concurrent.UncheckedExecutionException: java.lang.RuntimeException: Unable to instantiate org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient
>     at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2263)
>     at com.google.common.cache.LocalCache.get(LocalCache.java:4000)
>     at com.google.common.cache.LocalCache$LocalManualCache.get(LocalCache.java:4789)
>     at org.apache.hive.hcatalog.common.HiveClientCache.getOrCreate(HiveClientCache.java:227)
>     at org.apache.hive.hcatalog.common.HiveClientCache.get(HiveClientCache.java:202)
>     at org.apache.hive.hcatalog.common.HCatUtil.getHiveMetastoreClient(HCatUtil.java:558)
>     at org.apache.hive.hcatalog.mapreduce.InitializeInput.getInputJobInfo(InitializeInput.java:104)
>     at org.apache.hive.hcatalog.mapreduce.InitializeInput.setInput(InitializeInput.java:86)
>     at org.apache.hive.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:95)
>     ... 13 more
> Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient
>     at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1532)
>     at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:87)
>     at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:133)
>     at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:119)
>     at org.apache.hive.hcatalog.common.HiveClientCache$5.call(HiveClientCache.java:230)
>     at org.apache.hive.hcatalog.common.HiveClientCache$5.call(HiveClientCache.java:227)
>     at com.google.common.cache.LocalCache$LocalManualCache$1.load(LocalCache.java:4792)
>     at com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3599)
>     at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2379)
>     at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2342)
>     at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2257)
>     ... 21 more
> Caused by: java.lang.reflect.InvocationTargetException
>     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>     at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>     at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>     at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>     at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1530)
>     ... 31 more
> Caused by: java.lang.NoClassDefFoundError: com/codahale/metrics/Gauge
>     at java.lang.Class.forName0(Native Method)
>     at java.lang.Class.forName(Class.java:348)
>     at org.apache.hadoop.conf.Configuration.getClassByNameOrNull(Configuration.java:2163)
>     at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2128)
>     at org.apache.hadoop.hive.common.metrics.common.MetricsFactory.init(MetricsFactory.java:39)
>     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:500)
>     at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:78)
>     at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:84)
>     at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:6903)
>     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:212)
>     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:194)
>     at org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient.<init>(HiveClientCache.java:330)
>     ... 36 more
> Caused by: java.lang.ClassNotFoundException: com.codahale.metrics.Gauge
>     at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>     ... 48 more
>
> result code:2
>
>
> ------------------------------
>
> Consultant, Data Services Business Line, *Zhang Tianyu*
>
> *Mobile* 186-8686-6455
>
> *Address* 15th Floor, North Tower, Building C, Raycom Info Tech Park, No. 2 Kexueyuan South Road, Zhongguancun, Beijing
>
> zhangtianyu...@chinasofti.com | www.chinasofti.com
>
