Thanks Karan,
Forgot to mention: we already fixed this issue on the Hive side (we ignore
this field in the reflection serde).
We found out this was caused by HIVE-7892.
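For anyone hitting the same thing: "ignore this field in the reflection serde" boils down to skipping the static bookkeeping fields that thrift-generated Java classes carry (e.g. the generated metaDataMap) when a class is walked by reflection, so they are not mistaken for data columns. A minimal illustrative sketch, not the actual Hive patch (class and field names here are just stand-ins):

```java
import java.lang.reflect.Field;
import java.lang.reflect.Modifier;
import java.util.ArrayList;
import java.util.List;

public class ReflectionFieldFilter {

    // Stand-in for a thrift-generated class: generated Java classes carry
    // static metadata fields (such as metaDataMap) alongside the real
    // data members.
    static class GeneratedStruct {
        public static final String metaDataMap = "thrift metadata";
        public String id;
        public String idType;
    }

    // Keep only instance fields; static fields like metaDataMap are the
    // kind of thing the Hive-side fix ignores during reflection.
    static List<String> dataFields(Class<?> clazz) {
        List<String> names = new ArrayList<>();
        for (Field f : clazz.getDeclaredFields()) {
            if (Modifier.isStatic(f.getModifiers())) {
                continue; // skip thrift bookkeeping fields
            }
            names.add(f.getName());
        }
        return names;
    }

    public static void main(String[] args) {
        // Only id and idType survive the filter; metaDataMap is dropped.
        System.out.println(dataFields(GeneratedStruct.class));
    }
}
```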



On Tue, Jul 7, 2015 at 9:03 PM, Karan Kumar <karankumar1...@gmail.com>
wrote:

> The issue is most probably with the thrift version you are using:
> https://issues.apache.org/jira/browse/THRIFT-2172
>
> I used thrift-0.9.2 to generate my thrift classes, which solved this kind
> of issue.
>
> On Tue, Jul 7, 2015 at 6:14 PM, Binglin Chang <decst...@gmail.com> wrote:
>
>> Sorry, forgot to mention: the table uses the thrift serde, but 'show
>> create table' shows it as ROW FORMAT DELIMITED, which I think is a bug.
>> Selecting from a simple text format table works fine, but selecting from
>> the thrift table fails with the error below.
>>
>> The original create table statement:
>>
>> CREATE EXTERNAL TABLE xxx(commonUserId
>> STRUCT<id:STRING,idType:STRING>,lastActiveTime BIGINT,searchWords
>> ARRAY<STRING>,calls MAP<STRING,ARRAY<BIGINT>>,hasUsedRecharge
>> INT,hasUsedExpress INT,hasUsedViolateRegulation
>> INT,hasUsedLicensePlateLottery INT,interestedShops ARRAY<STRING>) ROW
>> FORMAT SERDE 'org.apache.hadoop.hive.serde2.thrift.ThriftDeserializer' WITH
>> SERDEPROPERTIES('serialization.format'='org.apache.thrift.protocol.TCompactProtocol','serialization.class'='com.xiaomi.data.spec.platform.xxx')
>> STORED AS SEQUENCEFILE LOCATION 'xxxx'
>>
>>
>>
>>
>> On Tue, Jul 7, 2015 at 7:44 PM, Binglin Chang <decst...@gmail.com> wrote:
>>
>>> Hi,
>>>
>>> I have a table with some array fields. When I preview them with "select
>>> ... limit" in beeline, I get the errors below; it seems the typeinfo
>>> string is changed from array<string> to struct<>.
>>> I am using hive-0.13.1.
>>>
>>> 0: jdbc:hive2://lg-hadoop-hive01.bj:32203/> show create table xxx;
>>>
>>> +---------------------------------------------------------------+--+
>>> |                        createtab_stmt                         |  |
>>> +---------------------------------------------------------------+--+
>>> | CREATE EXTERNAL TABLE `xxx`(                                  |
>>> |   `commonuserid` struct<id:string,idType:string>,             |
>>> |   `lastactivetime` bigint,                                    |
>>> |   `searchwords` array<string>,                                |
>>> |   `calls` map<string,array<bigint>>,                          |
>>> |   `hasusedrecharge` int,                                      |
>>> |   `hasusedexpress` int,                                       |
>>> |   `hasusedviolateregulation` int,                             |
>>> |   `hasusedlicenseplatelottery` int,                           |
>>> |   `interestedshops` array<string>)                            |
>>> | ROW FORMAT DELIMITED                                          |
>>> | STORED AS INPUTFORMAT                                         |
>>> |   'org.apache.hadoop.mapred.SequenceFileInputFormat'          |
>>> | OUTPUTFORMAT                                                  |
>>> |   'org.apache.hadoop.hive.ql.io.HiveSequenceFileOutputFormat' |
>>>
>>> 0: jdbc:hive2://lg-hadoop-hive01.bj:32203/> select searchwords from
>>>  yellowpage.yp_user_actions limit 1;
>>> Error: Error while compiling statement: FAILED: SemanticException
>>> java.lang.IllegalArgumentException: Error: name expected at the position 7
>>> of 'struct<>' but '>' is found. (state=42000,code=40000)
>>>
>>> Full stack:
>>>
>>> org.apache.hadoop.hive.ql.parse.SemanticException: java.lang.IllegalArgumentException: Error: name expected at the position 7 of 'struct<>' but '>' is found.
>>>     at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genConversionSelectOperator(SemanticAnalyzer.java:5949)
>>>     at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genFileSinkPlan(SemanticAnalyzer.java:5845)
>>>     at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPostGroupByBodyPlan(SemanticAnalyzer.java:8235)
>>>     at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:8126)
>>>     at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:8956)
>>>     at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:9209)
>>>     at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:206)
>>>     at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:435)
>>>     at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:333)
>>>     at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:989)
>>>     at org.apache.hadoop.hive.ql.Driver.compileAndRespond(Driver.java:982)
>>>     at org.apache.hive.service.cli.operation.SQLOperation.prepare(SQLOperation.java:123)
>>>     at org.apache.hive.service.cli.operation.SQLOperation.run(SQLOperation.java:197)
>>>     at org.apache.hive.service.cli.session.HiveSessionImpl.runOperationWithLogCapture(HiveSessionImpl.java:734)
>>>     at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementInternal(HiveSessionImpl.java:376)
>>>     at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementAsync(HiveSessionImpl.java:362)
>>>     at org.apache.hive.service.cli.CLIService.executeStatementAsync(CLIService.java:240)
>>>     at org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatement(ThriftCLIService.java:378)
>>>     at org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1373)
>>>     at org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1358)
>>>     at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
>>>     at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
>>>     at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge20S$Server$TUGIAssumingProcessor.process(HadoopThriftAuthBridge20S.java:677)
>>>     at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:244)
>>>     at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
>>>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
>>>     at java.lang.Thread.run(Thread.java:662)
>>> Caused by: java.lang.IllegalArgumentException: Error: name expected at the position 7 of 'struct<>' but '>' is found.
>>>     at org.apache.hadoop.hive.serde2.typeinfo.TypeInfoUtils$TypeInfoParser.expect(TypeInfoUtils.java:354)
>>>     at org.apache.hadoop.hive.serde2.typeinfo.TypeInfoUtils$TypeInfoParser.expect(TypeInfoUtils.java:331)
>>>     at org.apache.hadoop.hive.serde2.typeinfo.TypeInfoUtils$TypeInfoParser.parseType(TypeInfoUtils.java:478)
>>>     at org.apache.hadoop.hive.serde2.typeinfo.TypeInfoUtils$TypeInfoParser.parseTypeInfos(TypeInfoUtils.java:305)
>>>     at org.apache.hadoop.hive.serde2.typeinfo.TypeInfoUtils.getTypeInfosFromTypeString(TypeInfoUtils.java:754)
>>>     at org.apache.hadoop.hive.serde2.lazy.LazyUtils.extractColumnInfo(LazyUtils.java:372)
>>>     at org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe.initSerdeParams(LazySimpleSerDe.java:288)
>>>     at org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe.initialize(LazySimpleSerDe.java:187)
>>>     at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genConversionSelectOperator(SemanticAnalyzer.java:5946)
>>>     ... 26 more
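The "position 7" in the message above is literal: 'struct<' is seven characters (indices 0 through 6), so position 7 is where the first field name should start, and the degenerate type string 'struct<>' puts '>' there instead. A simplified sketch of such a parser (not Hive's actual TypeInfoUtils; it ignores nested generics):

```java
import java.util.ArrayList;
import java.util.List;

// Illustration of why "struct<>" cannot be parsed as a type string:
// after "struct<" a field name is expected, but '>' is found immediately.
public class TypeStringSketch {

    // Parse the field names out of "struct<name:type,...>".
    // Naive: splits on ',' so it does not handle nested generic types.
    static List<String> parseStructFieldNames(String typeString) {
        if (!typeString.startsWith("struct<") || !typeString.endsWith(">")) {
            throw new IllegalArgumentException("not a struct type: " + typeString);
        }
        String body = typeString.substring("struct<".length(), typeString.length() - 1);
        if (body.isEmpty()) {
            // Mirrors Hive's "name expected at the position 7" message.
            throw new IllegalArgumentException(
                "name expected at position " + "struct<".length()
                + " of '" + typeString + "' but '>' is found");
        }
        List<String> names = new ArrayList<>();
        for (String field : body.split(",")) {
            names.add(field.split(":", 2)[0]);
        }
        return names;
    }

    public static void main(String[] args) {
        // prints [id, idType]
        System.out.println(parseStructFieldNames("struct<id:string,idType:string>"));
        try {
            parseStructFieldNames("struct<>");
        } catch (IllegalArgumentException e) {
            // prints: name expected at position 7 of 'struct<>' but '>' is found
            System.out.println(e.getMessage());
        }
    }
}
```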
>>>
>>>
>>>
>>> Thanks,
>>> Binglin
>>>
>>
>>
>
>
> --
> Thanks
> Karan
>
