[ https://issues.apache.org/jira/browse/AVRO-1170?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13458725#comment-13458725 ]

Tom White commented on AVRO-1170:
---------------------------------

Here are the compilation failures:

{noformat}
[ERROR] Failed to execute goal 
org.apache.maven.plugins:maven-compiler-plugin:2.3.2:compile (default-compile) 
on project avro-mapred: Compilation failure: Compilation failure:
[ERROR] 
/Users/tom/workspace/avro-trunk/lang/java/mapred/src/main/java/org/apache/hadoop/io/SequenceFileBase.java:[44,6]
 cannot find symbol
[ERROR] symbol  : constructor 
BlockCompressWriter(org.apache.hadoop.fs.FileSystem,org.apache.hadoop.conf.Configuration,org.apache.hadoop.fs.Path,java.lang.Class,java.lang.Class,int,short,long,org.apache.hadoop.io.compress.CompressionCodec,org.apache.hadoop.util.Progressable,org.apache.hadoop.io.SequenceFile.Metadata)
[ERROR] location: class org.apache.hadoop.io.SequenceFile.BlockCompressWriter
[ERROR] 
/Users/tom/workspace/avro-trunk/lang/java/mapred/src/main/java/org/apache/hadoop/io/SequenceFileBase.java:[58,6]
 cannot find symbol
[ERROR] symbol  : constructor 
RecordCompressWriter(org.apache.hadoop.fs.FileSystem,org.apache.hadoop.conf.Configuration,org.apache.hadoop.fs.Path,java.lang.Class,java.lang.Class,int,short,long,org.apache.hadoop.io.compress.CompressionCodec,org.apache.hadoop.util.Progressable,org.apache.hadoop.io.SequenceFile.Metadata)
[ERROR] location: class org.apache.hadoop.io.SequenceFile.RecordCompressWriter
[ERROR] 
/Users/tom/workspace/avro-trunk/lang/java/mapred/src/main/java/org/apache/avro/mapreduce/AvroMultipleOutputs.java:[425,37]
 org.apache.hadoop.mapreduce.TaskAttemptContext is abstract; cannot be 
instantiated
[ERROR] 
/Users/tom/workspace/avro-trunk/lang/java/mapred/src/main/java/org/apache/avro/mapreduce/AvroMultipleOutputs.java:[498,18]
 org.apache.hadoop.mapreduce.TaskAttemptContext is abstract; cannot be 
instantiated
{noformat}

The first two are because the constructors for SequenceFile.BlockCompressWriter 
and SequenceFile.RecordCompressWriter have changed between Hadoop 1 and 2. I'll 
file a Hadoop JIRA for this.
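To illustrate what changed on the Hadoop side (this is plain Hadoop 2 usage, 
not a fix for SequenceFileBase; the class name and output path below are made 
up for the example), Hadoop 2 builds SequenceFile writers from Writer.Option 
values rather than the long constructors, which is why the eleven-argument 
constructor in the error above can no longer be found:

{noformat}
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.SequenceFile;
import org.apache.hadoop.io.SequenceFile.CompressionType;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.io.compress.DefaultCodec;

public class Hadoop2SequenceFileWrite {
  public static void main(String[] args) throws IOException {
    Configuration conf = new Configuration();
    DefaultCodec codec = new DefaultCodec();
    codec.setConf(conf);

    // Hadoop 2 configures block- or record-compressed writers through
    // Writer.Option values passed to the createWriter factory method.
    SequenceFile.Writer writer = SequenceFile.createWriter(conf,
        SequenceFile.Writer.file(new Path("/tmp/example.seq")),
        SequenceFile.Writer.keyClass(Text.class),
        SequenceFile.Writer.valueClass(IntWritable.class),
        SequenceFile.Writer.compression(CompressionType.BLOCK, codec));
    try {
      writer.append(new Text("key"), new IntWritable(1));
    } finally {
      writer.close();
    }
  }
}
{noformat}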

The second two are because TaskAttemptContext, which is a concrete class in 
Hadoop 1, became an interface in Hadoop 2, so it can no longer be instantiated 
directly. This can be solved via reflection plus separate Maven artifacts for 
the mapred JAR. The same problem was fixed in MRUnit; see MRUNIT-31 and 
MRUNIT-56 for some background.
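
For illustration only, here is a minimal sketch of the reflection approach; 
the TaskAttemptContextFactory class and newContext method names are made up 
for the example and are not Avro's actual code. It instantiates Hadoop 2's 
TaskAttemptContextImpl when that class is on the classpath, and falls back to 
constructing Hadoop 1's concrete TaskAttemptContext otherwise:

{noformat}
import java.lang.reflect.Constructor;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.mapreduce.TaskAttemptContext;
import org.apache.hadoop.mapreduce.TaskAttemptID;

/** Illustrative sketch: builds a TaskAttemptContext on both Hadoop 1 and 2. */
public class TaskAttemptContextFactory {

  public static TaskAttemptContext newContext(Configuration conf, TaskAttemptID id) {
    Class<?> implClass;
    try {
      // Hadoop 2: TaskAttemptContext is an interface, so use the impl class.
      implClass = Class.forName("org.apache.hadoop.mapreduce.task.TaskAttemptContextImpl");
    } catch (ClassNotFoundException e) {
      try {
        // Hadoop 1: TaskAttemptContext is a concrete class, so use it directly.
        implClass = Class.forName("org.apache.hadoop.mapreduce.TaskAttemptContext");
      } catch (ClassNotFoundException e2) {
        throw new IllegalStateException("No TaskAttemptContext class found", e2);
      }
    }
    try {
      // Both versions expose a (Configuration, TaskAttemptID) constructor.
      Constructor<?> ctor = implClass.getConstructor(Configuration.class, TaskAttemptID.class);
      return (TaskAttemptContext) ctor.newInstance(conf, id);
    } catch (Exception e) {
      throw new IllegalStateException("Cannot construct TaskAttemptContext", e);
    }
  }
}
{noformat}

A helper along these lines would let a single source tree compile against 
either Hadoop version; the separate Maven artifacts would then mainly differ 
in which Hadoop version they are built and tested against.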
                
> Avro's new mapreduce APIs don't work with Hadoop 2
> --------------------------------------------------
>
>                 Key: AVRO-1170
>                 URL: https://issues.apache.org/jira/browse/AVRO-1170
>             Project: Avro
>          Issue Type: Bug
>          Components: java
>    Affects Versions: 1.7.1
>            Reporter: Tom White
>
> Avro does not compile against Hadoop 2, since some classes were changed to 
> interfaces between Hadoop 1 and 2 (e.g. TaskAttemptContext).
