> *From:* Soheila Dehghanzadeh [mailto:sally...@gmail.com]
> *Sent:* Wednesday, August 01, 2012 8:01 AM
> *To:* hdfs-user@hadoop.apache.org
> *Subject:* moving data
Hi All,
I want to move my data from my server into the Hadoop file system (HDFS)
so that I can run a jar file using this data on a Hadoop cluster.
I have two questions.
If my Java program, which I want to run on a Hadoop cluster, refers to an
address on another server, how should I specify that address in my Java
program?
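One common approach is to refer to the data by a fully-qualified hdfs:// URI, which the Hadoop client resolves against the remote NameNode. A minimal sketch (the host, port, and path below are placeholders, not values from this thread):

```java
import java.net.URI;

public class HdfsAddress {
    // Build a fully-qualified HDFS URI for a path on a (hypothetical) cluster.
    // In a Hadoop program you would hand this to the client API, e.g.:
    //   FileSystem fs = FileSystem.get(URI.create(uri), new Configuration());
    //   fs.open(new Path(uri));
    static String hdfsUri(String host, int port, String path) {
        if (!path.startsWith("/")) {
            path = "/" + path;
        }
        return "hdfs://" + host + ":" + port + path;
    }

    public static void main(String[] args) {
        String uri = hdfsUri("namenode.example.com", 8020, "user/tariq/input/data.bin");
        System.out.println(uri);
        // Sanity-check that it parses as a URI with the hdfs scheme.
        URI parsed = URI.create(uri);
        System.out.println(parsed.getScheme() + " on " + parsed.getHost());
    }
}
```

The NameNode host and port here are assumptions; in practice they should match the cluster's fs.default.name / fs.defaultFS setting.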
Hello Harsh,
Thanks a lot for the valuable reply. I have to go with the second option,
as the remote machine has no access to our HDFS. I'll go through the link
you specified and act accordingly.
Regards,
Mohammad Tariq
On Wed, Jan 25, 2012 at 12:49 AM, Harsh J wrote:
You have two ways:
A. If the remote node has access to all HDFS machines (NN + all DNs),
simply do a "hadoop dfs -put" operation to push in the data.
B. If the remote node has no access to HDFS, set up a bastion box with
Hoop and write to HDFS via Hoop. Hoop provides a REST API to do this.
Some examples to
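A minimal sketch of option A driven from Java (all paths and the host are placeholders; it assumes the hadoop CLI is on the PATH of a node that can reach the NameNode and all DataNodes):

```java
import java.util.Arrays;
import java.util.List;

public class HdfsPut {
    // Build the argument vector for "hadoop fs -put <local> <dest>".
    // (The older "hadoop dfs -put" form mentioned in the thread works too.)
    static List<String> putCommand(String localPath, String hdfsDest) {
        return Arrays.asList("hadoop", "fs", "-put", localPath, hdfsDest);
    }

    public static void main(String[] args) {
        List<String> cmd = putCommand("/data/blob.bin",
                "hdfs://namenode.example.com:8020/user/tariq/");
        System.out.println(String.join(" ", cmd));
        // On a node with cluster access you would actually run it:
        //   new ProcessBuilder(cmd).inheritIO().start().waitFor();
    }
}
```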
Hey Ron,
Thanks for the response. No, the remote machine is not a part of our
Hadoop ecosystem.
Regards,
Mohammad Tariq
On Tue, Jan 24, 2012 at 10:23 PM, Ronald Petty wrote:
Mohammed,
Is this remote machine part of the HDFS system?
Ron
On Tue, Jan 24, 2012 at 7:30 AM, Mohammad Tariq wrote:
Hello list,
I have a situation wherein I have to move large binary files (~TB)
from remote machines into HDFS. While looking for some way to do
this I came across Hoop. Could anyone tell me whether it fits my
use case? If so, where can I find some proper help so that I can
learn about Hoop?
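For context: Hoop was later folded into Apache Hadoop as HttpFS, which serves the WebHDFS REST API over plain HTTP. A stdlib-only sketch of building such a request URL (host, port, path, and user are placeholders; the exact operations and default port should be checked against the Hoop/HttpFS documentation):

```java
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;

public class HoopUrl {
    // WebHDFS/HttpFS-style URL:
    //   http://<host>:<port>/webhdfs/v1<path>?op=<OP>&user.name=<user>
    static String opUrl(String host, int port, String path, String op, String user) {
        String encodedUser = URLEncoder.encode(user, StandardCharsets.UTF_8);
        return "http://" + host + ":" + port + "/webhdfs/v1" + path
                + "?op=" + op + "&user.name=" + encodedUser;
    }

    public static void main(String[] args) {
        // CREATE is a two-step PUT in WebHDFS: the first request is redirected,
        // and the file bytes go in the body of the follow-up request.
        String create = opUrl("hoop.example.com", 14000, "/user/tariq/data.bin",
                "CREATE", "tariq");
        System.out.println(create);
        // e.g. with HttpURLConnection:
        //   HttpURLConnection c = (HttpURLConnection)
        //       java.net.URI.create(create).toURL().openConnection();
        //   c.setRequestMethod("PUT"); c.setDoOutput(true); // ...write the file...
    }
}
```

The advantage for this use case is that the remote machines only need HTTP access to the one gateway box, not to every DataNode.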
Hi Steve,
You can use Chukwa (a Hadoop subproject that aims to provide a
flexible and powerful platform for distributed data collection).
Chukwa makes our data collection quite simple and efficient.
Regards,
Mohammad Tariq
On Wed, Nov 23, 2011 at 12:58 AM, Steve Ed wrote:
You can use the hadoop fs -put command to push files from the local
filesystem into HDFS, and -get to retrieve them:
http://hadoop.apache.org/common/docs/r0.20.0/hdfs_shell.html#put
http://hadoop.apache.org/common/docs/r0.20.0/hdfs_shell.html#get
These work fine for single files or one-offs; if you need
Sorry for this novice question. I am trying to find the best way of moving
(copying) data in and out of HDFS. There are a bunch of tools available and I
need to pick the one that offers the easiest way. I have seen a MapR
presentation, which claims to offer direct NFS mounts to feed data into HDFS.
Is