[ https://issues.apache.org/jira/browse/PHOENIX-2869?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15265008#comment-15265008 ]

Anil Gupta commented on PHOENIX-2869:
-------------------------------------

Unfortunately, we can't try it ourselves since we are using the HDP package. I 
can probably ask HDP for a backport of that patch. If it's already fixed, we can 
close this ticket. Thanks.

> Pig-Phoenix Integration throws errors when we try to write to SmallInt and 
> TinyInt columns
> ------------------------------------------------------------------------------------------
>
>                 Key: PHOENIX-2869
>                 URL: https://issues.apache.org/jira/browse/PHOENIX-2869
>             Project: Phoenix
>          Issue Type: Bug
>    Affects Versions: 4.4.0
>            Reporter: Anil Gupta
>              Labels: Pig
>
> Pig-Phoenix Integration does not work for SmallInt and TinyInt. We get this 
> kind of error when we run a Pig script that loads data into a SmallInt or 
> TinyInt column:
> Caused by: java.lang.RuntimeException: Unable to process column TINYINT:"L"."ELIGIBLESALE", innerMessage=java.lang.Integer cannot be coerced to TINYINT
> at org.apache.phoenix.pig.writable.PhoenixPigDBWritable.write(PhoenixPigDBWritable.java:66)
> at org.apache.phoenix.mapreduce.PhoenixRecordWriter.write(PhoenixRecordWriter.java:78)
> at org.apache.phoenix.mapreduce.PhoenixRecordWriter.write(PhoenixRecordWriter.java:39)
> at org.apache.phoenix.pig.PhoenixHBaseStorage.putNext(PhoenixHBaseStorage.java:184)
> at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat$PigRecordWriter.write(PigOutputFormat.java:136)
> at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat$PigRecordWriter.write(PigOutputFormat.java:95)
> at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:658)
> at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89)
> at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112)
> at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigMapOnly$Map.collect(PigMapOnly.java:48)
> at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigGenericMapBase.runPipeline(PigGenericMapBase.java:281)
> at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigGenericMapBase.map(PigGenericMapBase.java:274)
> at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigGenericMapBase.map(PigGenericMapBase.java:64)
> at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:146)
> at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
> at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
> at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
> at java.security.AccessController.doPrivileged(Native Method)
> ... 3 more
> ------------------------------------------------------------------
> Some more relevant discussion on the mailing list: 
> http://search-hadoop.com/m/9UY0h2HRUMW1WYQEH1
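> For reference, the failure can be reproduced with a small Pig script that 
> stores into a Phoenix table with a TINYINT column through PhoenixHBaseStorage. 
> The table and column names below mirror the error message ("L"."ELIGIBLESALE"); 
> the DDL, input file, ZooKeeper host, and batch size are illustrative 
> assumptions, not the exact job that failed. Because Pig has no byte/short type, 
> the value reaches the writer as a java.lang.Integer, which is what the coercion 
> error complains about:
> -- assumed Phoenix DDL: CREATE TABLE L (ID BIGINT NOT NULL PRIMARY KEY, ELIGIBLESALE TINYINT)
> raw = LOAD 'input.csv' USING PigStorage(',') AS (id:long, eligiblesale:int);
> STORE raw INTO 'hbase://L' USING
>     org.apache.phoenix.pig.PhoenixHBaseStorage('zk-host', '-batchSize 100');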



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
