The S3 fetcher functionality inside DC/OS is not supported. The `hadoop` binary
has already been removed entirely from DC/OS 1.8. There have been various
proposals to make the Mesos fetcher more pluggable /
extensible (https://issues.apache.org/jira/browse/MESOS-2731, for instance).

Generally speaking, people want many different kinds of fetching, and
there are all sorts of questions about how to properly supply auth to the
various backends (if you're using s3a://, presumably you need to get
credentials there somehow; otherwise you could just use http://). That
needs to be designed and built into Mesos and DC/OS before this sort of
thing can be usable.
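For concreteness, this is roughly what supplying those credentials looks like with the standard hadoop-aws properties; a core-site.xml fragment, with placeholder values, sketched here only to illustrate the problem (how such a file would be distributed to agents securely is exactly the open design question):

```xml
<!-- Sketch only: the fs.s3a.* property names come from the hadoop-aws
     module; the values are placeholders. Distributing this file to every
     agent without leaking secrets is the unsolved part. -->
<property>
  <name>fs.s3a.access.key</name>
  <value>YOUR_ACCESS_KEY</value>
</property>
<property>
  <name>fs.s3a.secret.key</name>
  <value>YOUR_SECRET_KEY</value>
</property>
```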

Cody

On Tue, May 10, 2016 at 9:55 AM Briant, James <[email protected]>
wrote:

> I want to use s3a: urls in fetcher. I’m using dcos 1.7 which has hadoop
> 2.5 on its agents. This version has the necessary hadoop-aws and aws-sdk:
>
> hadoop--afadb46fe64d0ee7ce23dbe769e44bfb0767a8b9]$ ls
> usr/share/hadoop/tools/lib/ | grep aws
> aws-java-sdk-1.7.4.jar
> hadoop-aws-2.5.0-cdh5.3.3.jar
>
> What config/scripts do I need to hack to get these guys on the classpath
> so that "hadoop fs -copyToLocal" works?
>
> Thanks,
> Jamie
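For what it's worth, one way the bundled jars could be exposed to the `hadoop` CLI is by extending `HADOOP_CLASSPATH` before invoking the fetch. This is a sketch only, assuming the DC/OS 1.7 agent layout quoted above; `HADOOP_TOOLS_LIB` is a variable name invented here for illustration:

```shell
# Sketch, assuming the jars live where the listing above shows them.
HADOOP_TOOLS_LIB=/usr/share/hadoop/tools/lib

# Append the tools lib dir (glob deliberately unexpanded) to any
# existing HADOOP_CLASSPATH; hadoop expands the wildcard itself.
export HADOOP_CLASSPATH="${HADOOP_CLASSPATH:+${HADOOP_CLASSPATH}:}${HADOOP_TOOLS_LIB}/*"
echo "$HADOOP_CLASSPATH"

# With the aws-java-sdk and hadoop-aws jars visible, the copy would
# then be attempted as:
#   hadoop fs -copyToLocal s3a://bucket/key ./local-file
```

Whether the DC/OS agent's fetcher invocation actually honors `HADOOP_CLASSPATH` from the environment is the part that would need verifying against the agent's scripts.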
