ok...

On Thu, Dec 30, 2010 at 12:45 PM, Sean Owen <[email protected]> wrote:



> Can you cache a DataModel in memory across workers in a cluster? No --
> the workers are perhaps not on the same machine, or even in the same
> datacenter. Each worker would have to load its own.
>
Yes, I understand that...


> But it sounds a bit like you are trying to have a servlet make
> recommendations in real-time by calling out to Hadoop.


That's it...
I'm looking for how to create recommendations in real time.

> This will never work. Hadoop is a big batch-oriented framework.
>
I understood that; this kind of processing on Hadoop is batch-oriented.



> What you can do is pre-compute recommendations with Hadoop, as you are
> doing, and write to HDFS. Then the servlet can load recs from HDFS,
> yes. No problem there.
>
>
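For reference, here is a minimal sketch of what the servlet side of that could look like: load the precomputed recommendations from HDFS into memory at startup and serve them directly. The directory path and the "userID<TAB>recommendations" text format are assumptions about what the Hadoop job writes out, not necessarily what our job emits.

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.util.HashMap;
import java.util.Map;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class PrecomputedRecs {

  private final Map<Long, String> recsByUser = new HashMap<Long, String>();

  // Read every part-* file in the job's output directory on HDFS and
  // keep one precomputed recommendation line per user in memory.
  public void load(String hdfsOutputDir) throws Exception {
    Configuration conf = new Configuration();
    FileSystem fs = FileSystem.get(conf);
    for (FileStatus status : fs.listStatus(new Path(hdfsOutputDir))) {
      if (!status.getPath().getName().startsWith("part-")) {
        continue; // skip _SUCCESS, _logs, etc.
      }
      BufferedReader reader =
          new BufferedReader(new InputStreamReader(fs.open(status.getPath())));
      String line;
      while ((line = reader.readLine()) != null) {
        int tab = line.indexOf('\t');
        if (tab > 0) {
          recsByUser.put(Long.valueOf(line.substring(0, tab)),
                         line.substring(tab + 1));
        }
      }
      reader.close();
    }
  }

  // The servlet can answer requests from this map with no Hadoop call.
  public String recsFor(long userID) {
    return recsByUser.get(userID);
  }
}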
We have a recommendation system running on Mahout here.
We thought we could build a real-time recommendation system with Mahout on Hadoop.
I see several problems:
- How do we update the Mahout data model dynamically? (a rough sketch of the kind of in-memory refresh I mean is below)
- Hadoop was not built for real-time processing. What could we use to build a distributed recommendation system?
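To make the first question concrete, here is a minimal sketch of the non-distributed, in-memory setup I mean, using the Taste API. The file path, neighborhood size, and similarity choice are just placeholders, not our actual configuration.

import java.io.File;
import java.util.List;

import org.apache.mahout.cf.taste.impl.model.file.FileDataModel;
import org.apache.mahout.cf.taste.impl.neighborhood.NearestNUserNeighborhood;
import org.apache.mahout.cf.taste.impl.recommender.GenericUserBasedRecommender;
import org.apache.mahout.cf.taste.impl.similarity.PearsonCorrelationSimilarity;
import org.apache.mahout.cf.taste.model.DataModel;
import org.apache.mahout.cf.taste.neighborhood.UserNeighborhood;
import org.apache.mahout.cf.taste.recommender.RecommendedItem;
import org.apache.mahout.cf.taste.recommender.Recommender;
import org.apache.mahout.cf.taste.similarity.UserSimilarity;

public class InMemoryRecs {

  public static void main(String[] args) throws Exception {
    // Placeholder ratings file; in a servlet this would be built once at startup.
    DataModel model = new FileDataModel(new File("/data/ratings.csv"));
    UserSimilarity similarity = new PearsonCorrelationSimilarity(model);
    UserNeighborhood neighborhood =
        new NearestNUserNeighborhood(10, similarity, model);
    Recommender recommender =
        new GenericUserBasedRecommender(model, neighborhood, similarity);

    // Recommendations are computed on the fly against the current model.
    List<RecommendedItem> recs = recommender.recommend(1L, 5);
    for (RecommendedItem item : recs) {
      System.out.println(item.getItemID() + " : " + item.getValue());
    }

    // When the underlying file has new data, ask the recommender to reload;
    // FileDataModel rereads the file if it has changed since the last load.
    recommender.refresh(null);
  }
}

The open question is how to get this kind of refresh behavior when the data no longer fits on one machine.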

Thanks for the help!
