[
https://issues.apache.org/jira/browse/HADOOP-1986?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12570599#action_12570599
]
Mukund Madhugiri commented on HADOOP-1986:
------------------------------------------
Tom,
I picked up the patch to get the benchmarks going and see that the patch
applies fine, but it fails to compile against trunk:
compile-core-classes:
[javac] Compiling 454 source files to /trunk/build/classes
[javac] /trunk/src/java/org/apache/hadoop/util/CopyFiles.java:820: cannot find symbol
[javac] symbol  : constructor Sorter(org.apache.hadoop.fs.FileSystem,org.apache.hadoop.io.Text.Comparator,java.lang.Class<org.apache.hadoop.io.Text>,org.apache.hadoop.conf.Configuration)
[javac] location: class org.apache.hadoop.io.SequenceFile.Sorter
[javac] SequenceFile.Sorter sorter = new SequenceFile.Sorter(fs,
[javac] ^
[javac] Note: Some input files use or override a deprecated API.
[javac] Note: Recompile with -Xlint:deprecation for details.
[javac] 1 error
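
For context, the call at CopyFiles.java:820 in trunk presumably looks something
like the sketch below (fs comes from the truncated source line in the error;
the remaining argument expressions are assumptions matching the reported
constructor signature). The patch evidently changes the SequenceFile.Sorter
constructors so that this four-argument form no longer resolves:

    // Hypothetical reconstruction of the failing call, matching the
    // constructor signature reported by javac above; the actual argument
    // expressions in trunk may differ.
    SequenceFile.Sorter sorter = new SequenceFile.Sorter(fs,
        new Text.Comparator(), Text.class, conf);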
> Add support for a general serialization mechanism for Map Reduce
> ----------------------------------------------------------------
>
> Key: HADOOP-1986
> URL: https://issues.apache.org/jira/browse/HADOOP-1986
> Project: Hadoop Core
> Issue Type: New Feature
> Components: mapred
> Reporter: Tom White
> Assignee: Tom White
> Fix For: 0.17.0
>
> Attachments: hadoop-serializer-v2.tar.gz, SerializableWritable.java,
> serializer-v1.patch, serializer-v2.patch, serializer-v3.patch,
> serializer-v4.patch
>
>
> Currently Map Reduce programs have to use WritableComparable-Writable
> key-value pairs. While it's possible to write Writable wrappers for other
> serialization frameworks (such as Thrift), this is not very convenient: it
> would be nicer to be able to use arbitrary types directly, without explicit
> wrapping and unwrapping.
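
To illustrate the wrapping the description refers to: exposing even a trivial
non-Writable value type to Map Reduce today means hand-writing an adapter like
the minimal sketch below. The PointWritable class and its fields are invented
purely for illustration and are not part of any attached patch.

    import java.io.DataInput;
    import java.io.DataOutput;
    import java.io.IOException;
    import org.apache.hadoop.io.Writable;

    // Hand-written adapter exposing a plain (x, y) pair to Map Reduce as a
    // Writable value; every such type needs this explicit boilerplate today.
    public class PointWritable implements Writable {
      private int x;
      private int y;

      public PointWritable() {}   // no-arg constructor for framework instantiation
      public PointWritable(int x, int y) { this.x = x; this.y = y; }

      public void write(DataOutput out) throws IOException {
        out.writeInt(x);
        out.writeInt(y);
      }

      public void readFields(DataInput in) throws IOException {
        x = in.readInt();
        y = in.readInt();
      }
    }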