Well, if all you are trying to do is write files to Hadoop's Distributed File System (HDFS), you can simply get that done with (2).
If you are planning to run MapReduce jobs (a good number of them), then you would need something more specific/elaborate to suit your exact needs (either managing a set of jars, or configuring and submitting jobs live from the webapp, etc.).

On Thu, Jan 20, 2011 at 9:29 PM, Alessandro Binhara <[email protected]> wrote:
> Hello..
>
> I have started a first implementation in Hadoop.
> I have seen some examples of how to write to Hadoop.
>
> I have a question ...
> I have Java code that writes to HDFS ...
>
> I run it by calling:
> hadoop jar MyJarToWriteOnHdfs
>
> It runs my program and writes to HDFS .. ok
>
> I need to implement this concept:
> WebServer ---> MyJarToWriteOnHdfs ---> HDFS
>
> My question is:
> 1) Do I need to put an embedded Jetty in my MyJarToWriteOnHdfs and run it
> like a job on Hadoop?
> or
> 2) Do I create a webserver, put it on Tomcat, and have my webserver write a
> file to Hadoop?
>
> thanks

-- 
Harsh J
www.harshj.com
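To illustrate option (2): the write itself is just the standard Hadoop FileSystem API, so a webapp needs nothing beyond the Hadoop client jars on its classpath and network access to the NameNode. A minimal sketch (the NameNode URI and the `HdfsWriter` class name are assumptions for illustration, not anything from the original thread; this only runs against a live cluster):

```java
import java.io.IOException;
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsWriter {
    // Hypothetical NameNode address; replace with your cluster's fs.default.name.
    private static final URI HDFS_URI = URI.create("hdfs://namenode:9000");

    /** Writes the given bytes to a new file at the given HDFS path. */
    public static void write(String path, byte[] data) throws IOException {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(HDFS_URI, conf);
        FSDataOutputStream out = fs.create(new Path(path));
        try {
            out.write(data);
        } finally {
            out.close();
        }
    }
}
```

A servlet running inside Tomcat can call `HdfsWriter.write(...)` directly in its request handler; there is no need to submit anything through `hadoop jar` just to write a file, since `hadoop jar` only sets up the classpath and launches a main class.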
