I found the problem. I had changed the table definition of t3. 

 

The

  stored as inputformat 'org.apache.hadoop.mapred.SequenceFileInputFormat'

clause does cause it to read the sequence file. I had left a different
"stored as" clause in place from my experiments.

 

The key/value of the sequence file does not need to be specified.
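Putting the two points together, a minimal sketch of a table definition along these lines (reusing the SerDe class, table name, and location from the question quoted below) should read the sequence file, with no key/value classes declared:

```sql
create external table if not exists t3 (
    str string
)
row format serde 'com.utils.DigestSerde'
stored as
  inputformat 'org.apache.hadoop.mapred.SequenceFileInputFormat'
  outputformat 'org.apache.hadoop.mapred.SequenceFileOutputFormat'
location '/user/jr/hiveTables';
```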

 

From: Rodriguez, John [mailto:jrodrig...@verisign.com] 
Sent: Tuesday, September 14, 2010 11:08 PM
To: hive-user@hadoop.apache.org
Subject: question about reading a sequencefile

 

I have a sequence file that has custom Writables in it.

 

For illustration, assume the value in the SequenceFile is like this:

public class DigestWritable implements org.apache.hadoop.io.Writable {

  String str;

  int i;

  // write(DataOutput) and readFields(DataInput) omitted for brevity

}
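For completeness, here is a self-contained sketch of what the full DigestWritable might look like. The Writable interface is stubbed locally so the example compiles without Hadoop on the classpath; the real one is org.apache.hadoop.io.Writable, which has the same two methods.

```java
import java.io.DataInput;
import java.io.DataOutput;
import java.io.IOException;

// Local stub of org.apache.hadoop.io.Writable so this sketch is
// self-contained; in a real SerDe you would import Hadoop's interface.
interface Writable {
    void write(DataOutput out) throws IOException;
    void readFields(DataInput in) throws IOException;
}

public class DigestWritable implements Writable {
    String str;
    int i;

    // Serialize the fields in a fixed order.
    public void write(DataOutput out) throws IOException {
        out.writeUTF(str);
        out.writeInt(i);
    }

    // Deserialize the fields in the same order they were written.
    public void readFields(DataInput in) throws IOException {
        str = in.readUTF();
        i = in.readInt();
    }
}
```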

 

 

I wrote a SerDe, expecting that the value passed into deserialize would
be a DigestWritable, but I get a class cast exception here:

  public Object deserialize(Writable field) throws SerDeException {

    DigestWritable digest = (DigestWritable) field;  // class cast exception here
 

Failed with exception java.io.IOException:java.lang.ClassCastException:
org.apache.hadoop.io.Text cannot be cast to DigestWritable

 

Is my create table statement wrong? 

 

Does the "stored as inputformat" part cause Hive to read a SequenceFile?

 

Do I need to specify the key and value of the SequenceFile?

 

create external table if not exists t3 (
    str string
)
row format serde 'com.utils.DigestSerde'
stored as
  inputformat 'org.apache.hadoop.mapred.SequenceFileInputFormat'
  outputformat 'org.apache.hadoop.mapred.SequenceFileOutputFormat'
location '/user/jr/hiveTables';
