Thanks Mohammad. I reviewed the HIVE-6784 patch, and it has performance penalties because of the way it is implemented.
I think we can do the promotion in a different way to avoid such penalties.
If you take a look at ParquetStringInspector.java, this class gets a String
value from different string writables (BytesWritable, Text, or String). We
could do something similar to return a Long from Integer and Short types.
However, I am a little worried about the small overhead we would add by
checking the writable type with 'instanceof' every time we get the value.
I'll review the object inspector and ETypeConverter.java (the Parquet type
converter) to see if there's a better way to do the promotion.

- Sergio

On Wed, Oct 7, 2015 at 9:37 PM, Mohammad Islam <misla...@yahoo.com.invalid>
wrote:

> Hi Sergio,
> Thanks for your reply.
>
> I found one such effort:
> https://issues.apache.org/jira/browse/HIVE-6784
>
> I am considering trying it differently.
> Similar work in ORC:
> https://issues.apache.org/jira/browse/HIVE-10591
>
> Regards,
> Mohammad
>
> On Wednesday, October 7, 2015 8:58 AM, Sergio Pena <
> sergio.p...@cloudera.com> wrote:
> Hi Mohammad,
>
> Currently, Hive + Parquet does not support auto casting to wider types.
> That would be a very good idea to implement in Hive.
> I'll investigate the Hive + Parquet code and see if it is something we
> can add in a future release.
>
> - Sergio
>
> On Tue, Oct 6, 2015 at 7:23 PM, Mohammad Islam <misla...@yahoo.com.invalid>
> wrote:
>
> > Any Hive + Parquet user/dev to address this?
> >
> > Regards,
> > Mohammad
> >
> > On Monday, October 5, 2015 3:41 PM, Mohammad Islam <misla...@yahoo.com>
> > wrote:
> >
> > Hi,
> > Does a Parquet table support auto casting to wider data types? For
> > example, suppose I have a Parquet table where some data files have
> > "int" as the data type and other files have "long" for the same field.
> >
> > The table schema has type "bigint" for that field.
> > Can Hive read the files that were written with type "int"?
> > I got this exception: "Failed with exception
> > java.io.IOException:org.apache.hadoop.hive.ql.metadata.HiveException:
> > java.lang.ClassCastException: org.apache.hadoop.io.IntWritable cannot be
> > cast to org.apache.hadoop.io.LongWritable".
> >
> > Regards,
> > Mohammad
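For what it's worth, the instanceof-based promotion described above might look roughly like the sketch below. This is only an illustration, not Hive code: IntWritable, ShortWritable, and LongWritable here are hypothetical stand-ins for the org.apache.hadoop.io classes so the example is self-contained, and getLong is an assumed method name, not an actual inspector API.

```java
// Hypothetical stand-ins for org.apache.hadoop.io writables,
// defined locally so this sketch compiles on its own.
class IntWritable {
  private final int value;
  IntWritable(int value) { this.value = value; }
  int get() { return value; }
}

class ShortWritable {
  private final short value;
  ShortWritable(short value) { this.value = value; }
  short get() { return value; }
}

class LongWritable {
  private final long value;
  LongWritable(long value) { this.value = value; }
  long get() { return value; }
}

public class PromotionSketch {
  // Returns a long from any of the narrower integer writables,
  // checking the concrete type with instanceof the same way
  // ParquetStringInspector handles its different string writables.
  static long getLong(Object o) {
    if (o instanceof LongWritable) {
      return ((LongWritable) o).get();
    }
    if (o instanceof IntWritable) {
      return ((IntWritable) o).get();   // int widened to long
    }
    if (o instanceof ShortWritable) {
      return ((ShortWritable) o).get(); // short widened to long
    }
    throw new IllegalArgumentException(
        "Unsupported writable: " + o.getClass().getName());
  }

  public static void main(String[] args) {
    System.out.println(getLong(new IntWritable(42)));          // 42
    System.out.println(getLong(new ShortWritable((short) 7))); // 7
    System.out.println(getLong(new LongWritable(1L << 40)));
  }
}
```

The per-call instanceof checks are the overhead Sergio worries about; doing the widening once inside a type converter (e.g. in ETypeConverter) instead of on every getLong call would be one way to avoid paying it on each value read.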