>
> I haven't tried union distinct, but I assume the same rule applies.
>

Reasonable assumption, so I'll remove "ALL" and see if anyone contradicts
it.  (Tech writing by successive approximation.)

Thanks again.

-- Lefty

On Sat, Feb 21, 2015 at 6:27 AM, Xuefu Zhang <xzh...@cloudera.com> wrote:

> I haven't tried union distinct, but I assume the same rule applies.
>
> Thanks for putting it together. It looks good to me.
>
> --Xuefu
>
> On Fri, Feb 20, 2015 at 11:44 PM, Lefty Leverenz <leftylever...@gmail.com>
> wrote:
>
>> Great, thanks Xuefu.  So this only applies to UNION ALL, not UNION
>> DISTINCT?  I had wondered about that.
>>
>> I made the changes and added some subheadings:  Union wikidoc
>> <https://cwiki.apache.org/confluence/display/Hive/LanguageManual+Union>
>> -- Column Aliases for UNION ALL
>> <https://cwiki.apache.org/confluence/display/Hive/LanguageManual+Union#LanguageManualUnion-ColumnAliasesforUNIONALL>.
>>
>> Please review it one more time.
>>
>> -- Lefty
>>
>> On Fri, Feb 20, 2015 at 7:06 AM, Xuefu Zhang <xzh...@cloudera.com> wrote:
>>
>>> Hi Lefty,
>>>
>>> The description seems good to me. I just slightly modified it so that it
>>> sounds more "technical", for your consideration.
>>>
>>> Thanks,
>>> Xuefu
>>>
>>> UNION ALL expects the same schema on both sides of the union.
>>> As a result, the following query may fail with an error message such as
>>> "FAILED: SemanticException 4:47 Schema of both sides of union should
>>> match."
>>> [query]
>>> In such cases, column aliases can be used to make the schemas match:
>>> [corrected query]
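>>>
>>> For [query] and [corrected query], Philippe's examples from this thread
>>> would fit, I think:
>>>
>>>   INSERT OVERWRITE TABLE all_dictionaries_ext
>>>     SELECT name, id, category FROM dictionary
>>>     UNION ALL SELECT NAME, ID, "CAMPAIGN" FROM md_campaigns
>>>
>>> and, with the column alias forcing the schemas to match:
>>>
>>>   INSERT OVERWRITE TABLE all_dictionaries_ext
>>>     SELECT name, id, category FROM dictionary
>>>     UNION ALL SELECT NAME, ID, "CAMPAIGN" AS category FROM md_campaigns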
>>>
>>>
>>>
>>> On Thu, Feb 19, 2015 at 1:04 AM, Lefty Leverenz <leftylever...@gmail.com
>>> > wrote:
>>>
>>>> Xuefu, I've taken a stab at documenting this in the Union wikidoc
>>>> <https://cwiki.apache.org/confluence/display/Hive/LanguageManual+Union>
>>>> (near the end).  Would you please review it and make any necessary
>>>> corrections or additions?
>>>>
>>>> Thanks.
>>>>
>>>> -- Lefty
>>>>
>>>> On Mon, Feb 2, 2015 at 2:02 PM, DU DU <will...@gmail.com> wrote:
>>>>
>>>>> This is a part of standard SQL syntax, isn't it?
>>>>>
>>>>> On Mon, Feb 2, 2015 at 2:22 PM, Xuefu Zhang <xzh...@cloudera.com>
>>>>> wrote:
>>>>>
>>>>>> Yes, I think it would be great if this can be documented.
>>>>>>
>>>>>> --Xuefu
>>>>>>
>>>>>> On Sun, Feb 1, 2015 at 6:34 PM, Lefty Leverenz <
>>>>>> leftylever...@gmail.com> wrote:
>>>>>>
>>>>>>> Xuefu, should this be documented in the Union wikidoc
>>>>>>> <https://cwiki.apache.org/confluence/display/Hive/LanguageManual+Union>?
>>>>>>>
>>>>>>> Is it relevant for other query clauses?
>>>>>>>
>>>>>>> -- Lefty
>>>>>>>
>>>>>>> On Sun, Feb 1, 2015 at 11:27 AM, Philippe Kernévez <
>>>>>>> pkerne...@octo.com> wrote:
>>>>>>>
>>>>>>>> Perfect.
>>>>>>>>
>>>>>>>> Thank you Xuefu.
>>>>>>>>
>>>>>>>> Philippe
>>>>>>>>
>>>>>>>> On Fri, Jan 30, 2015 at 11:32 PM, Xuefu Zhang <xzh...@cloudera.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>>> Use column alias:
>>>>>>>>>
>>>>>>>>> INSERT OVERWRITE TABLE all_dictionaries_ext
>>>>>>>>>      SELECT name, id, category FROM dictionary
>>>>>>>>>      UNION ALL SELECT NAME, ID, "CAMPAIGN" as category FROM
>>>>>>>>> md_campaigns
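>>>>>>>>>
>>>>>>>>> The alias gives the literal the column name "category", so the
>>>>>>>>> second subquery's schema -- name (string), id (int), category
>>>>>>>>> (string) -- matches the first's.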
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Fri, Jan 30, 2015 at 1:41 PM, Philippe Kernévez <
>>>>>>>>> pkerne...@octo.com> wrote:
>>>>>>>>>
>>>>>>>>>> Hi all,
>>>>>>>>>>
>>>>>>>>>> I would like to do a UNION ALL with a field that is hardcoded in
>>>>>>>>>> the query.
>>>>>>>>>>
>>>>>>>>>>    INSERT OVERWRITE TABLE all_dictionaries_ext
>>>>>>>>>>      SELECT name, id, category FROM dictionary
>>>>>>>>>>      UNION ALL SELECT NAME, ID, "CAMPAIGN" FROM md_campaigns
>>>>>>>>>>
>>>>>>>>>> Name type is String
>>>>>>>>>> Id type is int
>>>>>>>>>> Category type is string
>>>>>>>>>>
>>>>>>>>>> When I run this command I get an error:
>>>>>>>>>> FAILED: SemanticException 4:47 Schema of both sides of union
>>>>>>>>>> should match. _u1-subquery2 does not have the field category. Error
>>>>>>>>>> encountered near token 'md_campaigns'
>>>>>>>>>>
>>>>>>>>>> I suppose the error is caused by the string "CAMPAIGN", which
>>>>>>>>>> does not have a type.
>>>>>>>>>>
>>>>>>>>>> How can I do this kind of union?
>>>>>>>>>>
>>>>>>>>>> A UNION ALL with hardcoded fields on both sides is OK:
>>>>>>>>>>   INSERT OVERWRITE TABLE all_dictionaries_ext
>>>>>>>>>>     SELECT NAME, ID, "CAMPAIGN" FROM md_campaigns
>>>>>>>>>>      UNION ALL SELECT NAME, ID, "AD_SERVER" FROM md_ad_servers
>>>>>>>>>>      UNION ALL SELECT NAME, ID, "AVERTISER" FROM md_advertisers
>>>>>>>>>>      UNION ALL SELECT NAME, ID, "AGENCIES" FROM md_agencies
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> More debug info:
>>>>>>>>>>
>>>>>>>>>> 15/01/30 22:34:23 [main]: INFO parse.ParseDriver: Parsing
>>>>>>>>>> command:
>>>>>>>>>>   INSERT OVERWRITE TABLE all_dictionaries_ext
>>>>>>>>>>     SELECT name, id, category FROM byoa_dictionary
>>>>>>>>>>     UNION ALL SELECT NAME, ID, "CAMPAIGN" FROM md_campaigns
>>>>>>>>>> 15/01/30 22:34:23 [main]: INFO parse.ParseDriver: Parse Completed
>>>>>>>>>> 15/01/30 22:34:23 [main]: INFO log.PerfLogger: </PERFLOG
>>>>>>>>>> method=parse start=1422653663887 end=1422653663900 duration=13
>>>>>>>>>> from=org.apache.hadoop.hive.ql.Driver>
>>>>>>>>>> 15/01/30 22:34:23 [main]: INFO log.PerfLogger: <PERFLOG
>>>>>>>>>> method=semanticAnalyze from=org.apache.hadoop.hive.ql.Driver>
>>>>>>>>>> 15/01/30 22:34:23 [main]: INFO parse.SemanticAnalyzer: Starting
>>>>>>>>>> Semantic Analysis
>>>>>>>>>> 15/01/30 22:34:23 [main]: INFO parse.SemanticAnalyzer: Completed
>>>>>>>>>> phase 1 of Semantic Analysis
>>>>>>>>>> 15/01/30 22:34:23 [main]: INFO parse.SemanticAnalyzer: Get
>>>>>>>>>> metadata for source tables
>>>>>>>>>> 15/01/30 22:34:23 [main]: INFO parse.SemanticAnalyzer: Get
>>>>>>>>>> metadata for subqueries
>>>>>>>>>> 15/01/30 22:34:23 [main]: INFO parse.SemanticAnalyzer: Get
>>>>>>>>>> metadata for source tables
>>>>>>>>>> 15/01/30 22:34:23 [main]: INFO parse.SemanticAnalyzer: Get
>>>>>>>>>> metadata for subqueries
>>>>>>>>>> 15/01/30 22:34:24 [main]: INFO parse.SemanticAnalyzer: Get
>>>>>>>>>> metadata for destination tables
>>>>>>>>>> 15/01/30 22:34:24 [main]: INFO parse.SemanticAnalyzer: Get
>>>>>>>>>> metadata for source tables
>>>>>>>>>> 15/01/30 22:34:24 [main]: INFO parse.SemanticAnalyzer: Get
>>>>>>>>>> metadata for subqueries
>>>>>>>>>> 15/01/30 22:34:24 [main]: INFO parse.SemanticAnalyzer: Get
>>>>>>>>>> metadata for destination tables
>>>>>>>>>> 15/01/30 22:34:24 [main]: INFO parse.SemanticAnalyzer: Get
>>>>>>>>>> metadata for destination tables
>>>>>>>>>> 15/01/30 22:34:24 [main]: INFO parse.SemanticAnalyzer: Completed
>>>>>>>>>> getting MetaData in Semantic Analysis
>>>>>>>>>> 15/01/30 22:34:24 [main]: INFO parse.SemanticAnalyzer: Not
>>>>>>>>>> invoking CBO because the statement has too few joins
>>>>>>>>>> FAILED: SemanticException 4:47 Schema of both sides of union
>>>>>>>>>> should match. _u1-subquery2 does not have the field category. Error
>>>>>>>>>> encountered near token 'md_campaigns'
>>>>>>>>>> 15/01/30 22:34:24 [main]: ERROR ql.Driver: FAILED:
>>>>>>>>>> SemanticException 4:47 Schema of both sides of union should match.
>>>>>>>>>> _u1-subquery2 does not have the field category. Error encountered 
>>>>>>>>>> near
>>>>>>>>>> token 'md_campaigns'
>>>>>>>>>> org.apache.hadoop.hive.ql.parse.SemanticException: 4:47 Schema of
>>>>>>>>>> both sides of union should match. _u1-subquery2 does not have the 
>>>>>>>>>> field
>>>>>>>>>> category. Error encountered near token 'md_campaigns'
>>>>>>>>>> at
>>>>>>>>>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genUnionPlan(SemanticAnalyzer.java:9007)
>>>>>>>>>> at
>>>>>>>>>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:9600)
>>>>>>>>>> at
>>>>>>>>>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:9620)
>>>>>>>>>> at
>>>>>>>>>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:9607)
>>>>>>>>>> at
>>>>>>>>>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:10093)
>>>>>>>>>> at
>>>>>>>>>> org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:221)
>>>>>>>>>> at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:415)
>>>>>>>>>> at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:303)
>>>>>>>>>> at
>>>>>>>>>> org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1067)
>>>>>>>>>> at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1129)
>>>>>>>>>> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1004)
>>>>>>>>>> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:994)
>>>>>>>>>> at
>>>>>>>>>> org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:247)
>>>>>>>>>> at
>>>>>>>>>> org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:199)
>>>>>>>>>> at
>>>>>>>>>> org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:410)
>>>>>>>>>> at
>>>>>>>>>> org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:345)
>>>>>>>>>> at
>>>>>>>>>> org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:733)
>>>>>>>>>> at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:677)
>>>>>>>>>> at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:616)
>>>>>>>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>>>>>>> at
>>>>>>>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>>>>>>>>> at
>>>>>>>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>>>>>>>>> at java.lang.reflect.Method.invoke(Method.java:597)
>>>>>>>>>> at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
>>>>>>>>>> at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
>>>>>>>>>>
>>>>>>>>>> Regards,
>>>>>>>>>> --
>>>>>>>>>> Philippe Kernévez
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>> --
>>>>>>>> Philippe Kernévez
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>> Technical Director (Switzerland),
>>>>>>>> pkerne...@octo.com
>>>>>>>> +41 79 888 33 32
>>>>>>>>
>>>>>>>> Retrouvez OCTO sur OCTO Talk : http://blog.octo.com
>>>>>>>> OCTO Technology http://www.octo.com
>>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>
>>>>>
>>>>>
>>>>> --
>>>>> Thanks,
>>>>> Dayong
>>>>>
>>>>
>>>>
>>>
>>
>
