[
https://issues.apache.org/jira/browse/HADOOP-1864?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
ASF GitHub Bot updated HADOOP-1864:
-----------------------------------
Labels: pull-request-available (was: )
> Support for big jar file (>2G)
> ------------------------------
>
> Key: HADOOP-1864
> URL: https://issues.apache.org/jira/browse/HADOOP-1864
> Project: Hadoop Common
> Issue Type: Bug
> Affects Versions: 0.14.1
> Reporter: Yiping Han
> Priority: Critical
> Labels: pull-request-available
>
> We have a huge binary that needs to be distributed onto the tasktracker
> nodes in Hadoop streaming mode. We have tried both the -file option and the
> -cacheArchive option, but it seems the tasktracker node cannot unjar jar
> files bigger than 2G. We are considering splitting our binary into multiple
> jars, but with -file it seems we cannot do that. We would also prefer the
> -cacheArchive option for performance reasons, but it seems -cacheArchive
> does not allow more than one appearance in the streaming options. Even if
> -cacheArchive supported multiple jars, we would still need a way to put the
> jars into a single directory tree instead of using multiple symbolic links.
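> For reference, the two invocations we have been trying look roughly like
> the following; the host, paths, and file names below are only placeholders:
>
>     # Ship the binary with the job submission; the tasktracker fails to
>     # unjar it once the jar grows past 2G. (Paths are placeholders.)
>     hadoop jar hadoop-streaming.jar \
>         -input   /user/hadoop/input \
>         -output  /user/hadoop/output \
>         -mapper  mapper.sh \
>         -reducer reducer.sh \
>         -file    big-binary.jar
>
>     # Pre-load the archive into HDFS and symlink it into the task's working
>     # directory; only one -cacheArchive seems to be accepted per job.
>     hadoop jar hadoop-streaming.jar \
>         -input   /user/hadoop/input \
>         -output  /user/hadoop/output \
>         -mapper  mapper.sh \
>         -reducer reducer.sh \
>         -cacheArchive 'hdfs://namenode:9000/user/hadoop/big-binary.jar#bigdir'
>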
> So, in general, we need a feasible and efficient way to distribute large
> (>2G) binaries for Hadoop streaming. We don't know whether there is an
> existing solution that we either didn't find or used incorrectly, or
> whether extra work is needed to provide one.
--
This message was sent by Atlassian Jira
(v8.20.10#820010)