Hi @Scott, sorry for the confusion. Pradeep's answer works according to my test. Thanks for the help, @Pradeep!
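For anyone who lands on this thread later, here is a minimal sketch of the two options Pradeep describes below. The Hadoop path and the ZooKeeper master URL are placeholders, not values from the original setup:

    # Option 1: pass the flag when starting the mesos slave daemon
    mesos-slave --master=zk://master.example.com:2181/mesos \
                --hadoop_home=/opt/hadoop-2.7.2

    # Option 2: export HADOOP_HOME in the slave's own environment,
    # then start the daemon so the fetcher process inherits it
    export HADOOP_HOME=/opt/hadoop-2.7.2
    mesos-slave --master=zk://master.example.com:2181/mesos

Either way, the key point is that the variable must be visible to the mesos slave process itself, not to the task launched through Marathon.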
On Thu, May 26, 2016 at 1:08 PM, Pradeep Chhetri <[email protected]> wrote:
> Hi Scott,
>
> I think setting the HADOOP_HOME env variable in the application definition
> will not work. You need to set the environment variable in such a way that
> the mesos slave process can see it.
>
> In order to achieve that, you have two options:
>
> * Either pass the --hadoop_home flag while starting the mesos slave daemon.
> * Or set the environment variable HADOOP_HOME before starting the mesos
>   slave daemon so that it can refer to it.
>
> Let us know how it goes.
>
> On Thu, May 26, 2016 at 9:43 AM, Scott Kinney <[email protected]> wrote:
>
>> Here is my app def:
>>
>> https://gist.github.com/skinney6/a63ff7f0f8311faaabaf0399702a403f
>>
>> ------------------------------
>> Scott Kinney | DevOps
>> stem <http://www.stem.com/> | *m* 510.282.1299
>> 100 Rollins Road, Millbrae, California 94030
>> ------------------------------
>> *From:* haosdent <[email protected]>
>> *Sent:* Wednesday, May 25, 2016 8:42 PM
>> *To:* user
>> *Subject:* Re: Hadoop install location to use s3 uri
>>
>> It looks like it could not read HADOOP_HOME correctly. Otherwise the
>> error message would be "/path/to/unpacked/hadoop/bin/hadoop version 2>&1".
>> Could you show your Marathon application definition?
>>
>> On Thu, May 26, 2016 at 11:31 AM, Scott Kinney <[email protected]> wrote:
>>
>>> I want to use the s3 uri, but I guess I need hadoop on the slave. I've
>>> unpacked the hadoop tarball and added
>>> 'HADOOP_HOME=/path/to/unpacked/hadoop' to the Marathon app definition's
>>> environment, but mesos still says it can't find hadoop.
>>>
>>> Failed to fetch 's3n://bucket/docker.tar.gz': Failed to create HDFS
>>> client: Failed to execute 'hadoop version 2>&1'; the command was either not
>>> found or exited with a non-zero exit status: 127
>>>
>>> Also, is the s3 uri correct? s3n://bucketname/keyname ?
>>>
>>> Thanks!
>>>
>>> ------------------------------
>>> Scott Kinney | DevOps
>>> stem <http://www.stem.com/> | *m* 510.282.1299
>>> 100 Rollins Road, Millbrae, California 94030
>>
>> --
>> Best Regards,
>> Haosdent Huang
>
> --
> Regards,
> Pradeep Chhetri

--
Best Regards,
Haosdent Huang
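For completeness, a minimal sketch of a Marathon application definition that fetches an archive over the s3n URI format discussed above; the app id, Docker image, and bucket/key names are placeholders, not taken from Scott's gist:

    {
      "id": "/example-app",
      "cpus": 0.5,
      "mem": 256,
      "instances": 1,
      "container": {
        "type": "DOCKER",
        "docker": { "image": "example/image:latest" }
      },
      "uris": ["s3n://bucketname/docker.tar.gz"]
    }

As Pradeep notes above, the fetcher only resolves the s3n:// URI if HADOOP_HOME is visible to the mesos slave process; putting it in the app's "env" block is not enough.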

