Thanks. That's what I suspected too.
Yong

> Subject: Re: Parquet Hive NullPointerException
> To: [email protected]
> From: [email protected]
> Date: Wed, 28 Oct 2015 10:38:13 -0700
> 
> I seem to remember something about this bug. The value doesn't get 
> initialized properly in some circumstances. This was fixed quite a long 
> time ago, so I think the right thing to do is to update your Parquet or 
> Hive version.
> 
> rb
> 
> On 10/28/2015 10:33 AM, java8964 wrote:
> > We have a table created in Hive as Parquet format, with data ingested 
> > into it through a Hive command.
> > The Hive version is 0.12.0, and the Parquet version is 1.3.2.
> > When querying this table in Hive, the mapper fails with the following 
> > error. Is this due to a NULL value in the data?
> > Is there a JIRA related to this, or in what Parquet version can I expect 
> > this bug to be fixed?
> > I googled around and found exactly the same issue discussed here:
> > https://groups.google.com/forum/#!topic/parquet-dev/cxcz2QG7ScY
> > But there is no word on whether it was fixed in a later version.
> > Thanks
> > Yong
> > 2015-10-28 12:58:31,969 ERROR org.apache.hadoop.security.UserGroupInformation: PriviledgedActionException as:soddo (auth:SIMPLE) cause:java.io.IOException: java.io.IOException: java.lang.NullPointerException
> > 2015-10-28 12:58:31,970 WARN org.apache.hadoop.mapred.Child: Error running child
> > java.io.IOException: java.io.IOException: java.lang.NullPointerException
> >     at org.apache.hadoop.hive.io.HiveIOExceptionHandlerChain.handleRecordReaderNextException(HiveIOExceptionHandlerChain.java:121)
> >     at org.apache.hadoop.hive.io.HiveIOExceptionHandlerUtil.handleRecordReaderNextException(HiveIOExceptionHandlerUtil.java:77)
> >     at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileRecordReader.doNextWithExceptionHandler(HadoopShimsSecure.java:304)
> >     at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileRecordReader.next(HadoopShimsSecure.java:220)
> >     at org.apache.hadoop.mapred.MapTask$TrackedRecordReader.moveToNext(MapTask.java:234)
> >     at org.apache.hadoop.mapred.MapTask$TrackedRecordReader.next(MapTask.java:214)
> >     at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:48)
> >     at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:434)
> >     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
> >     at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> >     at java.security.AccessController.doPrivileged(AccessController.java:366)
> >     at javax.security.auth.Subject.doAs(Subject.java:572)
> >     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1502)
> >     at org.apache.hadoop.mapred.Child.main(Child.java:249)
> > Caused by: java.io.IOException: java.lang.NullPointerException
> >     at org.apache.hadoop.hive.io.HiveIOExceptionHandlerChain.handleRecordReaderNextException(HiveIOExceptionHandlerChain.java:121)
> >     at org.apache.hadoop.hive.io.HiveIOExceptionHandlerUtil.handleRecordReaderNextException(HiveIOExceptionHandlerUtil.java:77)
> >     at org.apache.hadoop.hive.ql.io.HiveContextAwareRecordReader.doNext(HiveContextAwareRecordReader.java:344)
> >     at org.apache.hadoop.hive.ql.io.CombineHiveRecordReader.doNext(CombineHiveRecordReader.java:101)
> >     at org.apache.hadoop.hive.ql.io.CombineHiveRecordReader.doNext(CombineHiveRecordReader.java:41)
> >     at org.apache.hadoop.hive.ql.io.HiveContextAwareRecordReader.next(HiveContextAwareRecordReader.java:122)
> >     at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileRecordReader.doNextWithExceptionHandler(HadoopShimsSecure.java:302)
> >     ... 11 more
> > Caused by: java.lang.NullPointerException
> >     at parquet.hive.MapredParquetInputFormat$RecordReaderWrapper.next(MapredParquetInputFormat.java:303)
> >     at parquet.hive.MapredParquetInputFormat$RecordReaderWrapper.next(MapredParquetInputFormat.java:199)
> >     at org.apache.hadoop.hive.ql.io.HiveContextAwareRecordReader.doNext(HiveContextAwareRecordReader.java:339)
> >     ... 15 more
> >
> 
> 
> -- 
> Ryan Blue
> Software Engineer
> Cloudera, Inc.
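
The failure mode described above, a value that is never initialized before next() dereferences it, can be sketched as follows. This is a hypothetical illustration with invented class and field names, not the actual parquet.hive.MapredParquetInputFormat code; it only shows why an uninitialized-field bug surfaces as a NullPointerException in next(), and what a defensive check looks like.

```java
// Hypothetical sketch of an uninitialized-value bug in a record-reader
// wrapper. Not the real Parquet/Hive code; names are invented.
public class RecordReaderWrapperSketch {
    private Object valueContainer;  // may stay null if init is skipped

    // Initialization only sets the value when the split has rows,
    // so some code paths leave valueContainer == null.
    void initialize(boolean hasRows) {
        if (hasRows) {
            valueContainer = new Object();
        }
    }

    // Unsafe pattern: next() assumes valueContainer was always set.
    boolean nextUnsafe() {
        return valueContainer.hashCode() != 0;  // NPE when null
    }

    // Defensive pattern: treat a missing value as end-of-input.
    boolean nextSafe() {
        if (valueContainer == null) {
            return false;  // no (more) records
        }
        return true;
    }

    public static void main(String[] args) {
        RecordReaderWrapperSketch r = new RecordReaderWrapperSketch();
        r.initialize(false);  // simulate the path that skips initialization
        boolean threw = false;
        try {
            r.nextUnsafe();
        } catch (NullPointerException e) {
            threw = true;
        }
        System.out.println("unsafe next threw NPE: " + threw);
        System.out.println("safe next returned: " + r.nextSafe());
    }
}
```

As the reply above notes, the practical remedy is to upgrade to a Parquet/Hive version that includes the fix rather than to patch around it; the sketch is only meant to make the stack trace's NPE-in-next() shape concrete.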