Hi ShaoFeng Shi,

Here is the detailed info for the error:

" --hiveconf hive.merge.mapredfiles=false --hiveconf 
hive.auto.convert.join=true --hiveconf dfs.replication=2 --hiveconf 
hive.exec.compress.output=true --hiveconf 
hive.auto.convert.join.noconditionaltask=true --hiveconf 
mapreduce.job.split.metainfo.maxsize=-1 --hiveconf hive.merge.mapfiles=false 
--hiveconf hive.auto.convert.join.noconditionaltask.size=100000000 --hiveconf 
hive.stats.autogather=true
        at 
org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:178)
        at 
org.apache.kylin.job.execution.DefaultChainedExecutable.doWork(DefaultChainedExecutable.java:69)
        at 
org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:163)
        ... 4 more
Caused by: java.io.IOException: OS command error exit with return code: 1, 
error message: SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in 
[jar:file:/data/hadoop-enviorment/apache-hive-2.3.3/lib/log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in 
[jar:file:/data/hadoop-enviorment/hdp-hadoop-3.1.0/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]

Logging initialized using configuration in 
file:/data/hadoop-enviorment/apache-hive-2.3.3/conf/hive-log4j2.properties 
Async: true
OK
Time taken: 5.286 seconds
OK
Time taken: 1.846 seconds
OK
Time taken: 0.405 seconds
OK
Time taken: 0.235 seconds
Exception in thread "main" java.lang.NoSuchMethodError: 
com.google.protobuf.Descriptors$Descriptor.getOneofs()Ljava/util/List;
        at 
com.google.protobuf.GeneratedMessageV3$FieldAccessorTable.<init>(GeneratedMessageV3.java:1704)
        at org.apache.calcite.avatica.proto.Common.<clinit>(Common.java:18927)
        at 
org.apache.calcite.avatica.proto.Common$ConnectionProperties.getDescriptor(Common.java:1264)
        at 
org.apache.calcite.avatica.ConnectionPropertiesImpl.<clinit>(ConnectionPropertiesImpl.java:38)
        at org.apache.calcite.avatica.MetaImpl.<init>(MetaImpl.java:72)
        at 
org.apache.calcite.jdbc.CalciteMetaImpl.<init>(CalciteMetaImpl.java:88)
        at org.apache.calcite.jdbc.Driver.createMeta(Driver.java:169)
        at 
org.apache.calcite.avatica.AvaticaConnection.<init>(AvaticaConnection.java:121)
        at 
org.apache.calcite.jdbc.CalciteConnectionImpl.<init>(CalciteConnectionImpl.java:113)
        at 
org.apache.calcite.jdbc.CalciteJdbc41Factory$CalciteJdbc41Connection.<init>(CalciteJdbc41Factory.java:114)
        at 
org.apache.calcite.jdbc.CalciteJdbc41Factory.newConnection(CalciteJdbc41Factory.java:59)
        at 
org.apache.calcite.jdbc.CalciteJdbc41Factory.newConnection(CalciteJdbc41Factory.java:44)
        at 
org.apache.calcite.jdbc.CalciteFactory.newConnection(CalciteFactory.java:53)
        at 
org.apache.calcite.avatica.UnregisteredDriver.connect(UnregisteredDriver.java:138)
        at java.sql.DriverManager.getConnection(DriverManager.java:664)
        at java.sql.DriverManager.getConnection(DriverManager.java:208)
        at org.apache.calcite.tools.Frameworks.withPrepare(Frameworks.java:145)
        at org.apache.calcite.tools.Frameworks.withPlanner(Frameworks.java:106)
        at 
org.apache.hadoop.hive.ql.parse.CalcitePlanner.logicalPlan(CalcitePlanner.java:1069)
        at 
org.apache.hadoop.hive.ql.parse.CalcitePlanner.getOptimizedAST(CalcitePlanner.java:1085)
        at 
org.apache.hadoop.hive.ql.parse.CalcitePlanner.genOPTree(CalcitePlanner.java:364)
        at 
org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:11138)
        at 
org.apache.hadoop.hive.ql.parse.CalcitePlanner.analyzeInternal(CalcitePlanner.java:286)
        at 
org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:258)
        at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:512)
        at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1317)
        at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1457)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1237)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1227)
        at 
org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:233)
        at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:184)
        at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:403)
        at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:336)
        at 
org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:787)
        at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:759)
        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:686)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.hadoop.util.RunJar.run(RunJar.java:318)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:232)
The command is:
hive -e "USE default;

DROP TABLE IF EXISTS 
kylin_intermediate_kylin_sales_cube_50cde797_4928_9a76_cdbb_4799ed4ab627;
CREATE EXTERNAL TABLE IF NOT EXISTS 
kylin_intermediate_kylin_sales_cube_50cde797_4928_9a76_cdbb_4799ed4ab627
(
KYLIN_SALES_TRANS_ID bigint
,KYLIN_SALES_PART_DT date
,KYLIN_SALES_LEAF_CATEG_ID bigint
,KYLIN_SALES_LSTG_SITE_ID int
,KYLIN_CATEGORY_GROUPINGS_META_CATEG_NAME string
,KYLIN_CATEGORY_GROUPINGS_CATEG_LVL2_NAME string
,KYLIN_CATEGORY_GROUPINGS_CATEG_LVL3_NAME string
,KYLIN_SALES_LSTG_FORMAT_NAME string
,KYLIN_SALES_SELLER_ID bigint
,KYLIN_SALES_BUYER_ID bigint
,BUYER_ACCOUNT_ACCOUNT_BUYER_LEVEL int
,SELLER_ACCOUNT_ACCOUNT_SELLER_LEVEL int
,BUYER_ACCOUNT_ACCOUNT_COUNTRY string
,SELLER_ACCOUNT_ACCOUNT_COUNTRY string
,BUYER_COUNTRY_NAME string
,SELLER_COUNTRY_NAME string
,KYLIN_SALES_OPS_USER_ID string
,KYLIN_SALES_OPS_REGION string
,KYLIN_SALES_PRICE decimal(19,4)
)
STORED AS SEQUENCEFILE
LOCATION 
'hdfs://hdfscluster/mnt/kylin/kylin_metadata/kylin-5d45e50e-104c-58c7-9288-b4d9c10e779a/kylin_intermediate_kylin_sales_cube_50cde797_4928_9a76_cdbb_4799ed4ab627';
ALTER TABLE 
kylin_intermediate_kylin_sales_cube_50cde797_4928_9a76_cdbb_4799ed4ab627 SET 
TBLPROPERTIES('auto.purge'='true');
INSERT OVERWRITE TABLE 
kylin_intermediate_kylin_sales_cube_50cde797_4928_9a76_cdbb_4799ed4ab627 SELECT
KYLIN_SALES.TRANS_ID as KYLIN_SALES_TRANS_ID
,KYLIN_SALES.PART_DT as KYLIN_SALES_PART_DT
,KYLIN_SALES.LEAF_CATEG_ID as KYLIN_SALES_LEAF_CATEG_ID
,KYLIN_SALES.LSTG_SITE_ID as KYLIN_SALES_LSTG_SITE_ID
,KYLIN_CATEGORY_GROUPINGS.META_CATEG_NAME as 
KYLIN_CATEGORY_GROUPINGS_META_CATEG_NAME
,KYLIN_CATEGORY_GROUPINGS.CATEG_LVL2_NAME as 
KYLIN_CATEGORY_GROUPINGS_CATEG_LVL2_NAME
,KYLIN_CATEGORY_GROUPINGS.CATEG_LVL3_NAME as 
KYLIN_CATEGORY_GROUPINGS_CATEG_LVL3_NAME
,KYLIN_SALES.LSTG_FORMAT_NAME as KYLIN_SALES_LSTG_FORMAT_NAME
,KYLIN_SALES.SELLER_ID as KYLIN_SALES_SELLER_ID
,KYLIN_SALES.BUYER_ID as KYLIN_SALES_BUYER_ID
,BUYER_ACCOUNT.ACCOUNT_BUYER_LEVEL as BUYER_ACCOUNT_ACCOUNT_BUYER_LEVEL
,SELLER_ACCOUNT.ACCOUNT_SELLER_LEVEL as SELLER_ACCOUNT_ACCOUNT_SELLER_LEVEL
,BUYER_ACCOUNT.ACCOUNT_COUNTRY as BUYER_ACCOUNT_ACCOUNT_COUNTRY
,SELLER_ACCOUNT.ACCOUNT_COUNTRY as SELLER_ACCOUNT_ACCOUNT_COUNTRY
,BUYER_COUNTRY.NAME as BUYER_COUNTRY_NAME
,SELLER_COUNTRY.NAME as SELLER_COUNTRY_NAME
,KYLIN_SALES.OPS_USER_ID as KYLIN_SALES_OPS_USER_ID
,KYLIN_SALES.OPS_REGION as KYLIN_SALES_OPS_REGION
,KYLIN_SALES.PRICE as KYLIN_SALES_PRICE
FROM DEFAULT.KYLIN_SALES as KYLIN_SALES
INNER JOIN DEFAULT.KYLIN_CAL_DT as KYLIN_CAL_DT
ON KYLIN_SALES.PART_DT = KYLIN_CAL_DT.CAL_DT
INNER JOIN DEFAULT.KYLIN_CATEGORY_GROUPINGS as KYLIN_CATEGORY_GROUPINGS
ON KYLIN_SALES.LEAF_CATEG_ID = KYLIN_CATEGORY_GROUPINGS.LEAF_CATEG_ID AND 
KYLIN_SALES.LSTG_SITE_ID = KYLIN_CATEGORY_GROUPINGS.SITE_ID
INNER JOIN DEFAULT.KYLIN_ACCOUNT as BUYER_ACCOUNT
ON KYLIN_SALES.BUYER_ID = BUYER_ACCOUNT.ACCOUNT_ID
INNER JOIN DEFAULT.KYLIN_ACCOUNT as SELLER_ACCOUNT
ON KYLIN_SALES.SELLER_ID = SELLER_ACCOUNT.ACCOUNT_ID
INNER JOIN DEFAULT.KYLIN_COUNTRY as BUYER_COUNTRY
ON BUYER_ACCOUNT.ACCOUNT_COUNTRY = BUYER_COUNTRY.COUNTRY
INNER JOIN DEFAULT.KYLIN_COUNTRY as SELLER_COUNTRY
ON SELLER_ACCOUNT.ACCOUNT_COUNTRY = SELLER_COUNTRY.COUNTRY
WHERE 1=1 AND (KYLIN_SALES.PART_DT >= '2012-01-01' AND KYLIN_SALES.PART_DT < 
'2018-10-01')
;

" --hiveconf hive.merge.mapredfiles=false --hiveconf 
hive.auto.convert.join=true --hiveconf dfs.replication=2 --hiveconf 
hive.exec.compress.output=true --hiveconf 
hive.auto.convert.join.noconditionaltask=true --hiveconf 
mapreduce.job.split.metainfo.maxsize=-1 --hiveconf hive.merge.mapfiles=false 
--hiveconf hive.auto.convert.join.noconditionaltask.size=100000000 --hiveconf 
hive.stats.autogather=true
        at 
org.apache.kylin.common.util.CliCommandExecutor.execute(CliCommandExecutor.java:95)
        at 
org.apache.kylin.source.hive.CreateFlatHiveTableStep.createFlatHiveTable(CreateFlatHiveTableStep.java:62)
        at 
org.apache.kylin.source.hive.CreateFlatHiveTableStep.doWork(CreateFlatHiveTableStep.java:99)
        at 
org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:163)
        ... 6 more
2018-10-16 13:47:45,750 DEBUG [http-nio-7070-exec-8] cachesync.Broadcaster:281 
: Done broadcasting UPDATE, execute_output, 5d45e50e-104c-58c7-9288-b4d9c10e779a
2018-10-16 13:47:45,760 DEBUG [http-nio-7070-exec-9] cachesync.Broadcaster:247 
: Broadcasting UPDATE, execute_output, 5d45e50e-104c-58c7-9288-b4d9c10e779a
2018-10-16 13:47:45,762 DEBUG [http-nio-7070-exec-9] cachesync.Broadcaster:281 
: Done broadcasting UPDATE, execute_output, 5d45e50e-104c-58c7-9288-b4d9c10e779a
2018-10-16 13:47:45,772 INFO  [FetcherRunner 1193270701-99] 
threadpool.DefaultFetcherRunner:96 : Job Fetcher: 0 should running, 0 actual 
running, 0 stopped, 0 ready, 0 already succeed, 8 error, 1 discarded, 0 others
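
The NoSuchMethodError above is the classic symptom of two protobuf versions on the same classpath: Hadoop traditionally bundles protobuf-java 2.5.0, while the GeneratedMessageV3/getOneofs code paths need 3.x. One way to confirm is to scan the Hive and Hadoop lib directories for jars that bundle the class. This is a minimal sketch, not part of Kylin; the helper name is mine, and the directory paths are just the ones that appear in the log above — adjust for your cluster:

```python
# Sketch: scan lib directories for jars that bundle a given class.
# Duplicate (or version-mismatched) hits usually explain a
# NoSuchMethodError like the one in the log above.
import zipfile
from pathlib import Path

def find_class(class_name, *lib_dirs):
    """Return paths of jars under lib_dirs that contain class_name."""
    entry = class_name.replace(".", "/") + ".class"
    hits = []
    for d in lib_dirs:
        root = Path(d)
        if not root.is_dir():
            continue  # skip paths that do not exist on this host
        for jar in sorted(root.rglob("*.jar")):
            try:
                with zipfile.ZipFile(jar) as zf:
                    if entry in zf.namelist():
                        hits.append(str(jar))
            except zipfile.BadZipFile:
                continue  # ignore corrupt or non-jar files
    return hits

if __name__ == "__main__":
    # Example paths taken from the SLF4J lines in the log -- adjust as needed.
    for hit in find_class("com.google.protobuf.GeneratedMessageV3",
                          "/data/hadoop-enviorment/apache-hive-2.3.3/lib",
                          "/data/hadoop-enviorment/hdp-hadoop-3.1.0/lib"):
        print(hit)
```

If the class turns up in more than one jar, or only in a 2.5.0-era jar, aligning the versions or shading/relocating protobuf inside the conflicting module is the usual fix.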

> On Oct 16, 2018, at 9:56 AM, ShaoFeng Shi <shaofeng...@apache.org> wrote:
> 
> Hi Zx,
> 
> The source code of 1.13.0-kylin-r4. is in Kyligence's fork: 
> https://github.com/Kyligence/calcite <https://github.com/Kyligence/calcite>
> Please provide the full kylin.log, maybe we can find new clue there.
> 
> liuzhixin <liuz...@163.com <mailto:liuz...@163.com>> wrote on Mon, Oct 15, 2018 at 3:06 PM:
> Hi ShaoFeng Shi:
> 
> This is the original error.
> 
> When I build the cube, it gives the error: java.lang.NoClassDefFoundError: 
> com/google/protobuf/GeneratedMessageV3.
> 
> GeneratedMessageV3 comes from protobuf-java 3.1.0, and it is really not 
> present in Hive.
> 
> Maybe I should assemble it into atopcalcite.
> 
> But I can run the hive -e command successfully.
> 
> #
> Time taken: 0.218 seconds
> Exception in thread "main" java.lang.NoClassDefFoundError: 
> com/google/protobuf/GeneratedMessageV3
>       at java.lang.ClassLoader.defineClass1(Native Method)
>       at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
>       at 
> java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
>       at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
>       at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
>       at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
>       at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
>       at java.security.AccessController.doPrivileged(Native Method)
>       at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
>       at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>       at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)
>       at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>       at 
> org.apache.calcite.avatica.ConnectionPropertiesImpl.<clinit>(ConnectionPropertiesImpl.java:38)
>       at org.apache.calcite.avatica.MetaImpl.<init>(MetaImpl.java:72)
>       at 
> org.apache.calcite.jdbc.CalciteMetaImpl.<init>(CalciteMetaImpl.java:88)
>       at org.apache.calcite.jdbc.Driver.createMeta(Driver.java:169)
>       at 
> org.apache.calcite.avatica.AvaticaConnection.<init>(AvaticaConnection.java:121)
>       at 
> org.apache.calcite.jdbc.CalciteConnectionImpl.<init>(CalciteConnectionImpl.java:113)
>       at 
> org.apache.calcite.jdbc.CalciteJdbc41Factory$CalciteJdbc41Connection.<init>(CalciteJdbc41Factory.java:114)
>       at 
> org.apache.calcite.jdbc.CalciteJdbc41Factory.newConnection(CalciteJdbc41Factory.java:59)
>       at 
> org.apache.calcite.jdbc.CalciteJdbc41Factory.newConnection(CalciteJdbc41Factory.java:44)
>       at 
> org.apache.calcite.jdbc.CalciteFactory.newConnection(CalciteFactory.java:53)
>       at 
> org.apache.calcite.avatica.UnregisteredDriver.connect(UnregisteredDriver.java:138)
>       at java.sql.DriverManager.getConnection(DriverManager.java:664)
>       at java.sql.DriverManager.getConnection(DriverManager.java:208)
>       at org.apache.calcite.tools.Frameworks.withPrepare(Frameworks.java:145)
>       at org.apache.calcite.tools.Frameworks.withPlanner(Frameworks.java:106)
>       at 
> org.apache.hadoop.hive.ql.parse.CalcitePlanner.logicalPlan(CalcitePlanner.java:1069)
>       at 
> org.apache.hadoop.hive.ql.parse.CalcitePlanner.getOptimizedAST(CalcitePlanner.java:1085)
>       at 
> org.apache.hadoop.hive.ql.parse.CalcitePlanner.genOPTree(CalcitePlanner.java:364)
>       at 
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:11138)
>       at 
> org.apache.hadoop.hive.ql.parse.CalcitePlanner.analyzeInternal(CalcitePlanner.java:286)
>       at 
> org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:258)
>       at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:512)
>       at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1317)
>       at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1457)
>       at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1237)
>       at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1227)
>       at 
> org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:233)
>       at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:184)
>       at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:403)
>       at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:336)
>       at 
> org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:787)
>       at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:759)
>       at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:686)
>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>       at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>       at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>       at java.lang.reflect.Method.invoke(Method.java:498)
>       at org.apache.hadoop.util.RunJar.run(RunJar.java:318)
>       at org.apache.hadoop.util.RunJar.main(RunJar.java:232)
> Caused by: java.lang.ClassNotFoundException: 
> com.google.protobuf.GeneratedMessageV3
>       at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
>       at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>       at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)
>       at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>       ... 51 more
> The command is:
> hive -e "USE default;
> 
> 
>> On Oct 15, 2018, at 2:35 PM, liuzhixin <liuz...@163.com <mailto:liuz...@163.com>> wrote:
>> 
>> Hi Shaofeng,
>> 
>> Yes, I can run the command well in the hive shell. 
>> 
>> I can’t find calcite-core version 1.13.0-kylin-r4. 
>> 
>> Best wishes.
>> 
>>> On Oct 15, 2018, at 2:20 PM, ShaoFeng Shi <shaofeng...@apache.org 
>>> <mailto:shaofeng...@apache.org>> wrote:
>>> 
>>> Can Hive 2.3.3 work well with HDP 3? Can you try, outside of Kylin, the
>>> HiveQL that Kylin executed? If it works, then there should be something
>>> wrong in Kylin.
>>> 
>>> liuzhixin <liuz...@163.com <mailto:liuz...@163.com>> wrote on Mon, Oct 15, 
>>> 2018 at 1:47 PM:
>>> 
>>>> Thank you for the answer!
>>>> 
>>>> I can't choose the Hive version.
>>>> 
>>>> And Hive 2.3.3 can work well with HDP 3.
>>>> 
>>>> Perhaps you can test Kylin with Hive 2.3.3.
>>>> 
>>>> Maybe it's another error. Thanks!
>>>> 
>>>> Best wishes!
>>>> 
>>>> 
>>>> On Oct 15, 2018, at 1:24 PM, ShaoFeng Shi <shaofeng...@apache.org 
>>>> <mailto:shaofeng...@apache.org>> wrote:
>>>> 
>>>> Hi zhixin,
>>>> 
>>>> I think the problem is how to run Hive 2 with HDP 3; it has no relation
>>>> to Kylin.
>>>> 
>>>> Usually, we don't encourage users to customize component versions in a
>>>> release, because that may bring dependency conflicts.
>>>> 
>>>> I suggest you use the original Hive version in HDP 3.
>>>> 
>>>> liuzhixin <liuz...@163.com <mailto:liuz...@163.com>> wrote on Mon, Oct 15, 
>>>> 2018 at 11:25 AM:
>>>> 
>>>>> Hi ShaoFeng Shi,
>>>>> 
>>>>> Yes, the error is from Hive 2.3.3,
>>>>> 
>>>>> and Kylin needs Hive 3.1.0.
>>>>> 
>>>>> So how can this be solved?
>>>>> 
>>>>> Best wishes!
>>>>> 
>>>>>> On Oct 15, 2018, at 11:10 AM, ShaoFeng Shi <shaofeng...@apache.org 
>>>>>> <mailto:shaofeng...@apache.org>> wrote:
>>>>>> 
>>>>>> Hi Zhixin,
>>>>>> 
>>>>>> The error log is thrown from Hive, not from Kylin, I think. Please
>>>>>> verify that your Hive is properly installed; you can manually run that
>>>>>> hive command:
>>>>>> 
>>>>>> hive -e "use default; xxx"
>>>>>> 
>>>>>> Lijun Cao <641507...@qq.com <mailto:641507...@qq.com>> wrote on Mon, 
>>>>>> Oct 15, 2018 at 11:01 AM:
>>>>>> 
>>>>>>> Hi liuzhixin:
>>>>>>> 
>>>>>>> As I remember, the Hive version in HDP 3 is 3.1.0.
>>>>>>> 
>>>>>>> You can update Hive to 3.1.0 and then have another try.
>>>>>>> 
>>>>>>> And according to my previous test, the binary package
>>>>>>> apache-kylin-2.5.0-bin-hadoop3.tar.gz can work properly on HDP 3. You
>>>>>>> can get it from the official site.
>>>>>>> 
>>>>>>> Best Regards
>>>>>>> 
>>>>>>> Lijun Cao
>>>>>>> 
>>>>>>>> On Oct 15, 2018, at 10:22, liuzhixin <liuz...@163.com 
>>>>>>>> <mailto:liuz...@163.com>> wrote:
>>>>>>>> 
>>>>>>>> Hi Cao Lijun,
>>>>>>>> 
>>>>>>>> The platform is Ambari HDP 3.0, Hive is 2.3.3, and the HBase version
>>>>>>>> is 2.0.
>>>>>>>> 
>>>>>>>> I have compiled the source code with Hive 2.3.3,
>>>>>>>> 
>>>>>>>> but the module atopcalcite depends on protobuf 3.1.0,
>>>>>>>> 
>>>>>>>> while the other modules depend on protobuf 2.5.0.
>>>>>>>> 
>>>>>>>> 
>>>>>>>>> On Oct 15, 2018, at 8:40 AM, Lijun Cao <641507...@qq.com 
>>>>>>>>> <mailto:641507...@qq.com>> wrote:
>>>>>>>>> 
>>>>>>>>> Hi liuzhixin:
>>>>>>>>> 
>>>>>>>>> Which platform did you use?
>>>>>>>>> 
>>>>>>>>> The CDH 6.0.x or HDP 3.0 ?
>>>>>>>>> 
>>>>>>>>> Best Regards
>>>>>>>>> 
>>>>>>>>> Lijun Cao
>>>>>>>>> 
>>>>>>>>>> On Oct 12, 2018, at 21:14, liuzhixin <liuz...@163.com 
>>>>>>>>>> <mailto:liuz...@163.com>> wrote:
>>>>>>>>>> 
>>>>>>>>>> Logging initialized using configuration in
>>>>>>>>>> file:/data/hadoop-enviorment/apache-hive-2.3.3/conf/hive-log4j2.properties
>>>>>>>>>> Async: true
>>>>>>>>>> OK
>>>>>>>>>> Time taken: 4.512 seconds
>>>>>>>>>> OK
>>>>>>>>>> Time taken: 1.511 seconds
>>>>>>>>>> OK
>>>>>>>>>> Time taken: 0.272 seconds
>>>>>>>>>> OK
>>>>>>>>>> Time taken: 0.185 seconds
>>>>>>>>>> Exception in thread "main" java.lang.NoSuchMethodError:
>>>>>>>>>> com.google.protobuf.Descriptors$Descriptor.getOneofs()Ljava/util/List;
>>>>>>>>>>   at com.google.protobuf.GeneratedMessageV3$FieldAccessorTable.<init>(GeneratedMessageV3.java:1704)
>>>>>>>>>>   at org.apache.calcite.avatica.proto.Common.<clinit>(Common.java:18927)
>>>>>>>>>>   at org.apache.calcite.avatica.proto.Common$ConnectionProperties.getDescriptor(Common.java:1264)
>>>>>>>>>>   at org.apache.calcite.avatica.ConnectionPropertiesImpl.<clinit>(ConnectionPropertiesImpl.java:38)
>>>>>>>>>>   at org.apache.calcite.avatica.MetaImpl.<init>(MetaImpl.java:72)
>>>>>>>>>>   at org.apache.calcite.jdbc.CalciteMetaImpl.<init>(CalciteMetaImpl.java:88)
>>>>>>>>>>   at org.apache.calcite.jdbc.Driver.createMeta(Driver.java:169)
>>>>>>>>>>   at org.apache.calcite.avatica.AvaticaConnection.<init>(AvaticaConnection.java:121)
>>>>>>>>>>   at org.apache.calcite.jdbc.CalciteConnectionImpl.<init>(CalciteConnectionImpl.java:113)
>>>>>>>>>>   at org.apache.calcite.jdbc.CalciteJdbc41Factory$CalciteJdbc41Connection.<init>(CalciteJdbc41Factory.java:114)
>>>>>>>>>>   at org.apache.calcite.jdbc.CalciteJdbc41Factory.newConnection(CalciteJdbc41Factory.java:59)
>>>>>>>>>>   at org.apache.calcite.jdbc.CalciteJdbc41Factory.newConnection(CalciteJdbc41Factory.java:44)
>>>>>>>>>>   at org.apache.calcite.jdbc.CalciteFactory.newConnection(CalciteFactory.java:53)
>>>>>>>>>>   at org.apache.calcite.avatica.UnregisteredDriver.connect(UnregisteredDriver.java:138)
>>>>>>>>>>   at java.sql.DriverManager.getConnection(DriverManager.java:664)
>>>>>>>>>>   at java.sql.DriverManager.getConnection(DriverManager.java:208)
>>>>>>>>>>   at org.apache.calcite.tools.Frameworks.withPrepare(Frameworks.java:145)
>>>>>>>>>>   at org.apache.calcite.tools.Frameworks.withPlanner(Frameworks.java:106)
>>>>>>>>>>   at org.apache.hadoop.hive.ql.parse.CalcitePlanner.logicalPlan(CalcitePlanner.java:1069)
>>>>>>>>>>   at org.apache.hadoop.hive.ql.parse.CalcitePlanner.getOptimizedAST(CalcitePlanner.java:1085)
>>>>>>>>>>   at org.apache.hadoop.hive.ql.parse.CalcitePlanner.genOPTree(CalcitePlanner.java:364)
>>>>>>>>>>   at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:11138)
>>>>>>>>>>   at org.apache.hadoop.hive.ql.parse.CalcitePlanner.analyzeInternal(CalcitePlanner.java:286)
>>>>>>>>>>   at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:258)
>>>>>>>>>>   at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:512)
>>>>>>>>>>   at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1317)
>>>>>>>>>>   at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1457)
>>>>>>>>>>   at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1237)
>>>>>>>>>>   at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1227)
>>>>>>>>>>   at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:233)
>>>>>>>>>>   at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:184)
>>>>>>>>>>   at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:403)
>>>>>>>>>>   at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:336)
>>>>>>>>>>   at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:787)
>>>>>>>>>>   at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:759)
>>>>>>>>>>   at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:686)
>>>>>>>>>>   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>>>>>>>   at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>>>>>>>>>>   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>>>>>>>>   at java.lang.reflect.Method.invoke(Method.java:498)
>>>>>>>>>>   at org.apache.hadoop.util.RunJar.run(RunJar.java:318)
>>>>>>>>>>   at org.apache.hadoop.util.RunJar.main(RunJar.java:232)
>>>>>>>>>> The command is:
>>>>>>>>>> hive -e "USE default;
>>>>>>>>> 
>>>>>>>>> 
>>>>>>>> 
>>>>>>> 
>>>>>>> 
>>>>>> 
>>>>>> --
>>>>>> Best regards,
>>>>>> 
>>>>>> Shaofeng Shi 史少锋
>>>>> 
>>>>> 
>>>>> 
>>>> 
>>>> --
>>>> Best regards,
>>>> 
>>>> Shaofeng Shi 史少锋
>>>> 
>>>> 
>>>> 
>>> 
>>> -- 
>>> Best regards,
>>> 
>>> Shaofeng Shi 史少锋
>> 
> 
> 
> 
> -- 
> Best regards,
> 
> Shaofeng Shi 史少锋
> 
