The Hive SQL is invalid at:

..
WHERE (where cub_type = 'gain_amt' and pt_log_d='20151229')
..

There are two "where" keywords; you may need to edit the cube's "filter condition"
and remove the "where" from there.
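
For reference, a minimal sketch of what the generated clause should look like once the leading `where` is removed from the filter condition field (column and table names are taken from the failing statement above; the exact text Kylin emits may differ slightly):

```sql
-- In the cube's "filter condition" field, enter only the predicate,
-- without the WHERE keyword:
--     cub_type = 'gain_amt' AND pt_log_d = '20151229'
-- Kylin then wraps it together with the partition range into a single,
-- valid WHERE clause:
WHERE (cub_type = 'gain_amt' AND pt_log_d = '20151229')
  AND (CUB_PARTNER_GAIN_PAY_PT_PRE0_AT0_S.PT_LOG_D >= '20151229'
   AND CUB_PARTNER_GAIN_PAY_PT_PRE0_AT0_S.PT_LOG_D < '20151230')
```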


2015-12-31 16:31 GMT+08:00 胡志华 (Wanlitong Technology & Data Center, Business Intelligence Team, Data Analysis Group) <
[email protected]>:

> Hi all
>
>
>
>        This is my new cube, but it stops at step 1.
>
> I put the SQL into Hive:
>
>
>
> What’s the matter with it?
>
>
>
> ParseException line 16:7 cannot recognize input near 'where' 'cub_type'
> '=' in expression specification
>
>
>
>
>
>
>
> hive> INSERT OVERWRITE TABLE
> kylin_intermediate_cub_wlt_partner_gain_test_20151229000000_20151230000000_b3d40d8b_602d_4f02_b02b_5da60a1e298d
> SELECT
>
>     > CUB_PARTNER_GAIN_PAY_PT_PRE0_AT0_S.PARTNER_ID
>
>     > ,CUB_PARTNER_GAIN_PAY_PT_PRE0_AT0_S.BRAND_POINT_NO
>
>     > ,CUB_PARTNER_GAIN_PAY_PT_PRE0_AT0_S.BINDING_M
>
>     > ,CUB_PARTNER_GAIN_PAY_PT_PRE0_AT0_S.PHONE_PROVINCE_IND
>
>     > ,CUB_PARTNER_GAIN_PAY_PT_PRE0_AT0_S.PHONE_CITY_IND
>
>     > ,CUB_PARTNER_GAIN_PAY_PT_PRE0_AT0_S.POINT_CURRENT_LEVEL_IND
>
>     > ,CUB_PARTNER_GAIN_PAY_PT_PRE0_AT0_S.PARTNER_GAIN_PT_LEVEL_IND
>
>     > ,CUB_PARTNER_GAIN_PAY_PT_PRE0_AT0_S.PROCESS_M
>
>     > ,CUB_PARTNER_GAIN_PAY_PT_PRE0_AT0_S.RULE_NAME
>
>     > ,CUB_PARTNER_GAIN_PAY_PT_PRE0_AT0_S.PATHWAY_DESC
>
>     > ,CUB_PARTNER_GAIN_PAY_PT_PRE0_AT0_S.PARTNER_GAIN_PAY_PT_DOC_CNT
>
>     > ,CUB_PARTNER_GAIN_PAY_PT_PRE0_AT0_S.PARTNER_GAIN_PAY_PT_ORDER_CNT
>
>     > ,CUB_PARTNER_GAIN_PAY_PT_PRE0_AT0_S.PARTNER_GAIN_PAY_PT_AMT
>
>     > FROM WLT_PARTNER.CUB_PARTNER_GAIN_PAY_PT_PRE0_AT0_S as
> CUB_PARTNER_GAIN_PAY_PT_PRE0_AT0_S
>
>     > WHERE (where cub_type = 'gain_amt' and pt_log_d='20151229')  AND
> (CUB_PARTNER_GAIN_PAY_PT_PRE0_AT0_S.PT_LOG_D >= '20151229' AND
> CUB_PARTNER_GAIN_PAY_PT_PRE0_AT0_S.PT_LOG_D < '20151230')
>
>     > ;
>
> NoViableAltException(288@[435:1: precedenceEqualExpression : ( (left=
> precedenceBitwiseOrExpression -> $left) ( ( KW_NOT
> precedenceEqualNegatableOperator notExpr= precedenceBitwiseOrExpression )
> -> ^( KW_NOT ^( precedenceEqualNegatableOperator $precedenceEqualExpression
> $notExpr) ) | ( precedenceEqualOperator equalExpr=
> precedenceBitwiseOrExpression ) -> ^( precedenceEqualOperator
> $precedenceEqualExpression $equalExpr) | ( KW_NOT KW_IN LPAREN KW_SELECT
> )=> ( KW_NOT KW_IN subQueryExpression ) -> ^( KW_NOT ^( TOK_SUBQUERY_EXPR
> ^( TOK_SUBQUERY_OP KW_IN ) subQueryExpression $precedenceEqualExpression) )
> | ( KW_NOT KW_IN expressions ) -> ^( KW_NOT ^( TOK_FUNCTION KW_IN
> $precedenceEqualExpression expressions ) ) | ( KW_IN LPAREN KW_SELECT )=> (
> KW_IN subQueryExpression ) -> ^( TOK_SUBQUERY_EXPR ^( TOK_SUBQUERY_OP KW_IN
> ) subQueryExpression $precedenceEqualExpression) | ( KW_IN expressions ) ->
> ^( TOK_FUNCTION KW_IN $precedenceEqualExpression expressions ) | ( KW_NOT
> KW_BETWEEN (min= precedenceBitwiseOrExpression ) KW_AND (max=
> precedenceBitwiseOrExpression ) ) -> ^( TOK_FUNCTION Identifier["between"]
> KW_TRUE $left $min $max) | ( KW_BETWEEN (min= precedenceBitwiseOrExpression
> ) KW_AND (max= precedenceBitwiseOrExpression ) ) -> ^( TOK_FUNCTION
> Identifier["between"] KW_FALSE $left $min $max) )* | ( KW_EXISTS LPAREN
> KW_SELECT )=> ( KW_EXISTS subQueryExpression ) -> ^( TOK_SUBQUERY_EXPR ^(
> TOK_SUBQUERY_OP KW_EXISTS ) subQueryExpression ) );])
>
>         at org.antlr.runtime.DFA.noViableAlt(DFA.java:158)
>
>         at org.antlr.runtime.DFA.predict(DFA.java:116)
>
>         at
> org.apache.hadoop.hive.ql.parse.HiveParser_IdentifiersParser.precedenceEqualExpression(HiveParser_IdentifiersParser.java:8155)
>
>         at
> org.apache.hadoop.hive.ql.parse.HiveParser_IdentifiersParser.precedenceNotExpression(HiveParser_IdentifiersParser.java:9177)
>
>         at
> org.apache.hadoop.hive.ql.parse.HiveParser_IdentifiersParser.precedenceAndExpression(HiveParser_IdentifiersParser.java:9296)
>
>         at
> org.apache.hadoop.hive.ql.parse.HiveParser_IdentifiersParser.precedenceOrExpression(HiveParser_IdentifiersParser.java:9455)
>
>         at
> org.apache.hadoop.hive.ql.parse.HiveParser_IdentifiersParser.expression(HiveParser_IdentifiersParser.java:6105)
>
>         at
> org.apache.hadoop.hive.ql.parse.HiveParser_IdentifiersParser.atomExpression(HiveParser_IdentifiersParser.java:6312)
>
>         at
> org.apache.hadoop.hive.ql.parse.HiveParser_IdentifiersParser.precedenceFieldExpression(HiveParser_IdentifiersParser.java:6383)
>
>         at
> org.apache.hadoop.hive.ql.parse.HiveParser_IdentifiersParser.precedenceUnaryPrefixExpression(HiveParser_IdentifiersParser.java:6768)
>
>         at
> org.apache.hadoop.hive.ql.parse.HiveParser_IdentifiersParser.precedenceUnarySuffixExpression(HiveParser_IdentifiersParser.java:6828)
>
>         at
> org.apache.hadoop.hive.ql.parse.HiveParser_IdentifiersParser.precedenceBitwiseXorExpression(HiveParser_IdentifiersParser.java:7012)
>
>         at
> org.apache.hadoop.hive.ql.parse.HiveParser_IdentifiersParser.precedenceStarExpression(HiveParser_IdentifiersParser.java:7172)
>
>         at
> org.apache.hadoop.hive.ql.parse.HiveParser_IdentifiersParser.precedencePlusExpression(HiveParser_IdentifiersParser.java:7332)
>
>         at
> org.apache.hadoop.hive.ql.parse.HiveParser_IdentifiersParser.precedenceAmpersandExpression(HiveParser_IdentifiersParser.java:7483)
>
>         at
> org.apache.hadoop.hive.ql.parse.HiveParser_IdentifiersParser.precedenceBitwiseOrExpression(HiveParser_IdentifiersParser.java:7634)
>
>         at
> org.apache.hadoop.hive.ql.parse.HiveParser_IdentifiersParser.precedenceEqualExpression(HiveParser_IdentifiersParser.java:8164)
>
>         at
> org.apache.hadoop.hive.ql.parse.HiveParser_IdentifiersParser.precedenceNotExpression(HiveParser_IdentifiersParser.java:9177)
>
>         at
> org.apache.hadoop.hive.ql.parse.HiveParser_IdentifiersParser.precedenceAndExpression(HiveParser_IdentifiersParser.java:9296)
>
>         at
> org.apache.hadoop.hive.ql.parse.HiveParser_IdentifiersParser.precedenceOrExpression(HiveParser_IdentifiersParser.java:9455)
>
>         at
> org.apache.hadoop.hive.ql.parse.HiveParser_IdentifiersParser.expression(HiveParser_IdentifiersParser.java:6105)
>
>         at
> org.apache.hadoop.hive.ql.parse.HiveParser.expression(HiveParser.java:45840)
>
>         at
> org.apache.hadoop.hive.ql.parse.HiveParser_FromClauseParser.searchCondition(HiveParser_FromClauseParser.java:6637)
>
>         at
> org.apache.hadoop.hive.ql.parse.HiveParser_FromClauseParser.whereClause(HiveParser_FromClauseParser.java:6545)
>
>         at
> org.apache.hadoop.hive.ql.parse.HiveParser.whereClause(HiveParser.java:45876)
>
>         at
> org.apache.hadoop.hive.ql.parse.HiveParser.selectStatement(HiveParser.java:41543)
>
>         at
> org.apache.hadoop.hive.ql.parse.HiveParser.regularBody(HiveParser.java:41230)
>
>         at
> org.apache.hadoop.hive.ql.parse.HiveParser.queryStatementExpressionBody(HiveParser.java:40413)
>
>         at
> org.apache.hadoop.hive.ql.parse.HiveParser.queryStatementExpression(HiveParser.java:40283)
>
>         at
> org.apache.hadoop.hive.ql.parse.HiveParser.execStatement(HiveParser.java:1590)
>
>         at
> org.apache.hadoop.hive.ql.parse.HiveParser.statement(HiveParser.java:1109)
>
>         at
> org.apache.hadoop.hive.ql.parse.ParseDriver.parse(ParseDriver.java:202)
>
>         at
> org.apache.hadoop.hive.ql.parse.ParseDriver.parse(ParseDriver.java:166)
>
>         at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:396)
>
>         at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:308)
>
>         at
> org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1122)
>
>         at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1170)
>
>         at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1059)
>
>         at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1049)
>
>         at
> org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:213)
>
>         at
> org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:165)
>
>         at
> org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:376)
>
>         at
> org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:736)
>
>         at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:681)
>
>         at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:621)
>
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>
>         at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>
>         at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>
>         at java.lang.reflect.Method.invoke(Method.java:606)
>
>         at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
>
>         at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
>
> FAILED: ParseException line 16:7 cannot recognize input near 'where'
> 'cub_type' '=' in expression specification
>
> hive>
>
>
>
>
>
>
>
>
>
> **************************************
>
> *胡志华*
>
> Wanlitong Support Center, Big Data Team, Data Analysis Group
>
> Tel: 021-20667416 / 18019788229
>
> Address: 9F, Tower B, Ping An Building, 166 Kaibin Road, Xuhui District, Shanghai
>
> **************************************
>
>
>
>
>
>
>
>
>
> ********************************************************************************************************************************
> The information in this email is confidential and may be legally
> privileged. If you have received this email in error or are not the
> intended recipient, please immediately notify the sender and delete this
> message from your computer. Any use, distribution, or copying of this email
> other than by the intended recipient is strictly prohibited. All messages
> sent to and from us may be monitored to ensure compliance with internal
> policies and to protect our business.
> Emails are not secure and cannot be guaranteed to be error free as they
> can be intercepted, amended, lost or destroyed, or contain viruses. Anyone
> who communicates with us by email is taken to accept these risks.
>
>
> ********************************************************************************************************************************
>



-- 
Best regards,

Shaofeng Shi