[
https://issues.apache.org/jira/browse/HADOOP-1722?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12679532#action_12679532
]
zhuweimin commented on HADOOP-1722:
-----------------------------------
Thanks, Klaas.
I tried the -jobconf option and it worked,
but it looks like part of the content was lost.
The test results are below.
Do you have any idea what's wrong?
----------
$ bin/hadoop fs -ls data
Found 2 items
-rw-r--r-- 1 hadoop supergroup 67108864 2009-03-06 17:15
/user/hadoop/data/64m_1.dat
-rw-r--r-- 1 hadoop supergroup 67108864 2009-03-06 17:15
/user/hadoop/data/64m_2.dat
$ hadoop jar contrib/streaming/hadoop-0.19.1-streaming.jar \
-input data \
-output dataoutput \
-mapper "wc -c" \
-numReduceTasks 0 \
-jobconf stream.map.input=rawbytes
...
09/03/06 17:17:08 INFO streaming.StreamJob: map 0% reduce 0%
09/03/06 17:17:16 INFO streaming.StreamJob: map 100% reduce 0%
09/03/06 17:17:18 INFO streaming.StreamJob: Job complete: job_200903061543_0012
09/03/06 17:17:18 INFO streaming.StreamJob: Output: dataoutput
$ hadoop fs -cat dataoutput/part*
67107830
67107830
----------
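For what it's worth, `wc -c` simply counts every byte arriving on stdin, including whatever framing the streaming framework adds. A rough sketch of a mapper that instead parses length-prefixed records and sums only the payload bytes (this assumes rawbytes means a 4-byte big-endian length followed by that many bytes per record; the helper name `read_rawbytes_records` is made up for illustration):

```python
import struct
import sys

def read_rawbytes_records(stream):
    """Yield payload byte strings from a stream of
    <4-byte big-endian length><payload> records."""
    while True:
        header = stream.read(4)
        if len(header) < 4:
            return  # end of stream (or truncated header)
        (length,) = struct.unpack(">i", header)
        yield stream.read(length)

if __name__ == "__main__":
    # Report total payload bytes, excluding the 4-byte length prefixes.
    total = sum(len(record) for record in read_rawbytes_records(sys.stdin.buffer))
    print(total)
```

Comparing this mapper's output against `wc -c` would show how much of the difference is framing overhead versus genuinely missing data.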
> Make streaming to handle non-utf8 byte array
> --------------------------------------------
>
> Key: HADOOP-1722
> URL: https://issues.apache.org/jira/browse/HADOOP-1722
> Project: Hadoop Core
> Issue Type: Improvement
> Components: contrib/streaming
> Reporter: Runping Qi
> Assignee: Klaas Bosteels
> Fix For: 0.21.0
>
> Attachments: HADOOP-1722-branch-0.18.patch,
> HADOOP-1722-branch-0.19.patch, HADOOP-1722-v2.patch, HADOOP-1722-v3.patch,
> HADOOP-1722-v4.patch, HADOOP-1722-v4.patch, HADOOP-1722-v5.patch,
> HADOOP-1722-v6.patch, HADOOP-1722.patch
>
>
> Right now, the streaming framework expects the output of the stream process
> (mapper or reducer) to be line-oriented UTF-8 text. This limitation makes it
> impossible to use programs whose outputs may be non-UTF-8 (international
> encodings, or even binary data). Streaming can overcome this limitation by
> introducing a simple encoding protocol. For example, it could allow the
> mapper/reducer to hex-encode its keys/values, with the framework decoding
> them on the Java side. This way, as long as the mapper/reducer executables
> follow this encoding protocol, they can output arbitrary byte arrays and the
> streaming framework can handle them.
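The hex-encoding protocol described in the issue could be sketched roughly as follows (this is only an illustration of the idea, not the actual patch; the function names are hypothetical):

```python
import binascii

def encode_record(key: bytes, value: bytes) -> bytes:
    # Hex-encode key and value so arbitrary binary data survives the
    # line-oriented, tab-separated streaming protocol unharmed.
    return binascii.hexlify(key) + b"\t" + binascii.hexlify(value) + b"\n"

def decode_record(line: bytes):
    # Framework-side decoding (the "Java side" in the description above).
    key_hex, value_hex = line.rstrip(b"\n").split(b"\t")
    return binascii.unhexlify(key_hex), binascii.unhexlify(value_hex)
```

Because hex digits are plain ASCII, encoded records can never collide with the tab and newline delimiters, even when the original key or value contains them.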
--
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.