Well, in the second case:
Tomcat --> webserver --> HDFS.

My example runs from the shell:
hadoop jar HahoopHdfsHello.jar HadooHdfsHello

That's OK.
Now I am trying to run it directly with the java command line.

I think that to write to HDFS, it needs a Hadoop environment.
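Actually, writing to HDFS does not require launching through `hadoop jar`; any JVM process can do it, as long as the Hadoop client jars and the cluster configuration are on its classpath. A minimal sketch using the Hadoop FileSystem API (the NameNode URI `hdfs://master:9000` and the output path are assumptions; substitute your own values):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsWriteSketch {
    public static void main(String[] args) throws Exception {
        // Point the client at the NameNode; normally this comes from
        // core-site.xml on the classpath instead of being set in code.
        Configuration conf = new Configuration();
        conf.set("fs.default.name", "hdfs://master:9000"); // assumed address

        // Obtain a handle to HDFS and write a small file.
        FileSystem fs = FileSystem.get(conf);
        FSDataOutputStream out = fs.create(new Path("/tmp/hello.txt"));
        out.writeBytes("hello from a plain JVM process\n");
        out.close();
        fs.close();
    }
}
```

The same code should work inside a Tomcat webapp: put the Hadoop jars in `WEB-INF/lib` and the cluster's `core-site.xml` in `WEB-INF/classes`, and call it from a servlet.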

Will it work from a Java application or a webserver?
I tried running my Java example and I have a problem with the jar:

root@master:~# java -jar HahoopHdfsHello.jar
Failed to load Main-Class manifest attribute from
HahoopHdfsHello.jar
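That error means the jar's manifest has no `Main-Class:` entry. `hadoop jar` works because you pass the class name (`HadooHdfsHello`) on the command line, while `java -jar` reads it from the manifest. A sketch of a manifest file that would fix it, assuming the class is in the default package:

```
Main-Class: HadooHdfsHello
```

Rebuild with `jar cfm HahoopHdfsHello.jar manifest.txt HadooHdfsHello.class` (the manifest file must end with a newline). Note that `java -jar` ignores `-cp`, so the Hadoop jars would still be missing; an alternative that sidesteps both problems is `java -cp HahoopHdfsHello.jar:$(hadoop classpath) HadooHdfsHello`, if your Hadoop version provides the `hadoop classpath` command.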

Thanks for the answer.


On Thu, Jan 20, 2011 at 3:08 PM, Harsh J <[email protected]> wrote:

> Well if you are trying only to write files to Hadoop's Distributed
> File System, you can simply get that done with (2).
>
> If you are planning to run MapReduce jobs (a good bunch of them), then
> you would need something more specific/elaborate to suit your exact
> needs (Either managing a set of jars, or configuring jobs live from
> the webapp and submitting, etc.).
>
> On Thu, Jan 20, 2011 at 9:29 PM, Alessandro Binhara <[email protected]>
> wrote:
> > Hello..
> >
> > I started a first implementation in Hadoop.
> > I have seen some examples of how to write to Hadoop.
> >
> > I have a question...
> > I have Java code to write to HDFS...
> >
> > I run it by calling:
> > hadoop jar MyJarToWriteOnHdfs
> >
> > It runs my program and writes to HDFS... OK.
> >
> > I need to implement this concept:
> > WebServer ---> MyJarToWriteOnHdfs ---> HDFS
> >
> > My question is:
> > 1) Do I need to embed Jetty in MyJarToWriteOnHdfs and run it like a
> > job on Hadoop?
> > or
> > 2) Do I create a webserver, deploy it on Tomcat, and have my webserver
> > write a file to Hadoop?
> >
> > thanks
> >
>
>
>
> --
> Harsh J
> www.harshj.com
>
