Hi Jinye,

When you run that hadoop fs command, you're copying input.log from your
local FS to HDFS, so it should be visible on all of your nodes.
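
For example, once the copy succeeds you should be able to see the file from
any node in the cluster, not just the one you copied it from (slave1 below
is just a placeholder hostname for one of your slaves):

        hadoop fs -ls /user/cloudera
        ssh slave1 hadoop fs -ls /user/cloudera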

Are you sure that you're pointing the Oozie action at the right place?
When you run

        hadoop fs -copyFromLocal /home/cloudera/input.log /user/cloudera/input

I believe you are actually uploading the file to
/user/cloudera/input/input.log (assuming /user/cloudera/input already exists
as a directory; if it doesn't exist, you instead end up with a file literally
named /user/cloudera/input). The file extension in the destination path
doesn't matter to hadoop fs; what matters is whether the destination is an
existing directory. Did you mean to upload it to /user/cloudera/input.log?
If so, try running

        hadoop fs -copyFromLocal /home/cloudera/input.log /user/cloudera/input.log
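
You can check where the file actually ended up with something like

        hadoop fs -ls /user/cloudera/input

If input is an existing directory you'll see input.log listed inside it; if
it's a plain file you'll see just that one entry.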

- Robert


On Tue, Jan 29, 2013 at 11:23 AM, Jinye Luo <[email protected]> wrote:

> Let's say I have a four node cluster, one master and three slaves.
>
> The master is running:
>
>         hadoop-yarn-resourcemanager
>         hadoop-hdfs-namenode
>         hadoop-mapreduce-historyserver
>         zookeeper-server
>         hbase-master
>         oozie
>
>
> The slaves are running:
>         hadoop-yarn-nodemanager
>         hadoop-hdfs-datanode
>         hbase-regionserver
>
> From the master, I run the following to put a file into hdfs, then kick
> off an M/R job:
>         hadoop fs -copyFromLocal /home/cloudera/input.log
> /user/cloudera/input
>
>         hadoop jar log-processing.jar ...
>
> Obviously, the local input.log is not visible to the slaves, and the Oozie
> action is always executed on a slave, hence the Oozie action will always
> fail.
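>
> For illustration (slave1 here is just a placeholder hostname for one of
> the slaves), running the same copy from a slave would fail, because the
> local path /home/cloudera/input.log only exists on the master:
>
>         ssh slave1 hadoop fs -copyFromLocal /home/cloudera/input.log /user/cloudera/input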
>
> -----Original Message-----
> From: Mohammad Islam [mailto:[email protected]]
> Sent: Tuesday, January 29, 2013 1:02 PM
> To: [email protected]
> Subject: Re: Is it possible to use Oozie Java/Shell action to put a file
> into hdfs?
>
> Hi Jinye,
> Where is your source file (hdfs/local)? Would you please copy paste the
> exact/equivalent command that you want to execute?
>
> Regards,
> Mohammad
>
>
> ________________________________
>  From: Jinye Luo <[email protected]>
> To: Mona Chitnis <[email protected]>; "[email protected]" <
> [email protected]>; Mohammad Islam <[email protected]>
> Sent: Monday, January 28, 2013 9:31 PM
> Subject: RE: Is it possible to use Oozie Java/Shell action to put a file
> into hdfs?
>
> Mohammad/Mona, thanks for your replies.
>
> What I did is pretty much the same as what's in the Cookbook, just without
> security enabled.
>
> I think/hope I found what went wrong. The Cookbook reads "Assume a local
> file $filename can be accessed by all cluster nodes." That is certainly not
> the case for me. In my clustered environment, Oozie is running on an "edge
> node", which is the only node with access to the (local) input files; all
> the other nodes, where the M/R tasks run, cannot see those files.
>
> So am I out of luck with Oozie, or is there a way to get around it?
>
> -----Original Message-----
> From: Mona Chitnis [mailto:[email protected]]
> Sent: Monday, January 28, 2013 12:37 PM
> To: [email protected]; Mohammad Islam
> Subject: Re: Is it possible to use Oozie Java/Shell action to put a file
> into hdfs?
>
> Hello Jinye,
>
> In our Java Cookbook doc -
> https://cwiki.apache.org/confluence/display/OOZIE/Java+Cookbook
>
> Refer to section "Examples and Use Cases" #1. That should help.
>
> --
> Mona
>
> On 1/26/13 12:06 AM, "Mohammad Islam" <[email protected]> wrote:
>
> >Hi Jinye,
> >Are you using secure Hadoop with Kerberos? What version?
> >
> >I think either way (in Oozie 3.2+), you should be able to create a file
> >in HDFS.
> >
> >How did you try them, and what error did they fail with?
> >
> >Regards,
> >Mohammad
> >
> >
> >________________________________
> > From: Jinye Luo <[email protected]>
> >To: "[email protected]" <[email protected]>
> >Sent: Friday, January 25, 2013 2:10 PM
> >Subject: Is it possible to use Oozie Java/Shell action to put a file
> >into hdfs?
> >
> >I have tried a hundred different ways without success; has anyone had
> >better luck? If so, please share a sample.
>
>
