Yes, I can do that, but I have connected from my macOS Terminal to Linux
using SSH.
Now when I run the ls command it shows me the list of files and folders
from Linux, not from macOS.
I have files which I need to put onto Hadoop directly from macOS.
So, something like the below.

From the macOS Terminal:

[root@sandbox ~]#hadoop fs -put <MAC OS FOLDER PATH/FILE> <HADOOP PATH>

Hope my requirement is clear.
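For what it's worth, one way to do this in a single command from the Mac is to stream the file over SSH and let `hadoop fs -put -` read it from stdin on the sandbox. This is only a sketch: the user, host name, and file paths below are placeholder assumptions, not values from this thread.

```shell
# Run from the macOS Terminal (not inside the SSH session).
# "hadoop fs -put -" on the sandbox reads the file from stdin,
# so no intermediate copy is left on the Linux filesystem.
# root@sandbox and both paths are placeholders -- adjust as needed.
cat /Users/anil/data/input.csv | \
  ssh root@sandbox "hadoop fs -put - /user/root/input.csv"

# Two-step alternative: scp the file to Linux first, then put it into HDFS.
scp /Users/anil/data/input.csv root@sandbox:/tmp/
ssh root@sandbox "hadoop fs -put /tmp/input.csv /user/root/"
```

The streaming version avoids staging the file on the sandbox's local disk, which matters if the file is large or local space is tight.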

Rgds, Anil




On Thu, Dec 18, 2014 at 9:39 AM, johny casanova <[email protected]>
wrote:
>
> Hi Anil,
>
> you can use hadoop fs -put <file or directory> <hdfs path> and that
> should add it to your HDFS
>
> ------------------------------
> Date: Thu, 18 Dec 2014 09:29:34 +1100
> Subject: Copying files to hadoop.
> From: [email protected]
> To: [email protected]
>
> Dear All,
>
> I'm pretty new to Hadoop technology and the Linux environment, hence I'm
> struggling even to find solutions for the basic stuff.
>
> For now, the Hortonworks Sandbox is working fine for me and I managed to
> connect to it through SSH.
>
> Now I have some CSV files in my macOS folders which I want to copy onto
> Hadoop. As far as I know, I can copy those files to Linux first and then
> put them to Hadoop. But is there a way to copy them to Hadoop directly
> from a macOS folder in just one command?
>
> Appreciate your advice.
>
> Thank you guys...
>
> Rgds, Anil
>
>
