I am also interested in learning about myHadoop, as I use a shared storage
system and everything runs on VMs rather than dedicated servers.

In an Amazon EC2-like environment, where you only have VMs and a large
central storage system, is it helpful to use Hadoop to distribute jobs and
parallelize algorithms, or is it better to go with other technologies?
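For context, the basic HOD workflow on a Torque cluster looks roughly like the sketch below, based on the Hadoop-on-Demand user guide. The node count, cluster directory, jar path, and input/output paths are all placeholders, not values from this thread:

```shell
# Ask HOD to provision a transient Hadoop cluster on N Torque nodes.
# The cluster directory (a placeholder path here) receives the generated
# hadoop-site.xml for this allocation.
hod allocate -d ~/hod-clusters/test -n 5

# Run a map-reduce job against the provisioned cluster by pointing the
# hadoop client at that generated configuration (example jar/paths are
# hypothetical).
hadoop --config ~/hod-clusters/test jar my-job.jar MyJobClass input output

# Release the nodes back to Torque when done.
hod deallocate -d ~/hod-clusters/test
```

myHadoop follows a similar provision/run/teardown pattern, but driven from a regular PBS/SGE batch script instead of the `hod` client.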

2012/5/18 Manu S <manupk...@gmail.com>

> Hi All,
>
> Guess HOD could be useful for an existing HPC cluster with the Torque
> scheduler that needs to run map-reduce jobs.
>
> Also read about *myHadoop - Hadoop on Demand on traditional HPC
> resources*, which supports many HPC schedulers like SGE, PBS, etc. to
> bridge the gap between the shared architecture (HPC) and the
> shared-nothing architecture (Hadoop).
>
> Are there any real use-case scenarios for integrating Hadoop map/reduce
> into an existing HPC cluster, and what are the advantages of using Hadoop
> features in an HPC cluster?
>
> Appreciate your comments on the same.
>
> Thanks,
> Manu S
>
>
>
> On Fri, May 18, 2012 at 12:41 AM, Merto Mertek <masmer...@gmail.com>
> wrote:
>
> > If I understand it right, HOD is meant mainly for merging existing HPC
> > clusters with Hadoop and for testing purposes.
> >
> > I cannot find what the role of Torque is here (just initial node
> > allocation?) or what the default scheduler of HOD is. Probably the
> > scheduler from the Hadoop distribution?
> >
> > The doc mentions a Maui scheduler, but if there were an integration
> > with Hadoop, there would probably be some documentation on it.
> >
> > Thanks.
> >
>
