Just don't use Hadoop. Most of the recommender code here is not Hadoop-based and is meant for more real-time operation (though at the cost of not being able to scale past a certain size). Check out the Mahout wiki for an introduction to building a recommender like this: https://cwiki.apache.org/MAHOUT/recommender-documentation.html
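
For example, a non-distributed, in-memory recommender is only a few lines. Rough, untested sketch -- it assumes your preferences are in a simple userID,itemID,value CSV file, and the file name, user ID, and neighborhood size are just placeholders:

import java.io.File;
import java.util.List;

import org.apache.mahout.cf.taste.impl.model.file.FileDataModel;
import org.apache.mahout.cf.taste.impl.neighborhood.NearestNUserNeighborhood;
import org.apache.mahout.cf.taste.impl.recommender.GenericUserBasedRecommender;
import org.apache.mahout.cf.taste.impl.similarity.PearsonCorrelationSimilarity;
import org.apache.mahout.cf.taste.model.DataModel;
import org.apache.mahout.cf.taste.neighborhood.UserNeighborhood;
import org.apache.mahout.cf.taste.recommender.RecommendedItem;
import org.apache.mahout.cf.taste.recommender.Recommender;
import org.apache.mahout.cf.taste.similarity.UserSimilarity;

public class RealTimeRecommenderExample {
  public static void main(String[] args) throws Exception {
    // The whole data model is loaded into memory -- that is what makes it
    // real-time, and also what limits how far it scales.
    DataModel model = new FileDataModel(new File("prefs.csv"));
    UserSimilarity similarity = new PearsonCorrelationSimilarity(model);
    UserNeighborhood neighborhood = new NearestNUserNeighborhood(25, similarity, model);
    Recommender recommender = new GenericUserBasedRecommender(model, neighborhood, similarity);
    // Top 10 recommendations for user 1234, computed on the fly
    List<RecommendedItem> recs = recommender.recommend(1234L, 10);
    for (RecommendedItem item : recs) {
      System.out.println(item.getItemID() + " : " + item.getValue());
    }
  }
}

Calling recommender.refresh(null) re-reads the underlying data, which is the usual answer to updating the model without restarting.
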
On Thu, Dec 30, 2010 at 11:12 AM, Alessandro Binhara <[email protected]> wrote:
> OK...
>
> On Thu, Dec 30, 2010 at 12:45 PM, Sean Owen <[email protected]> wrote:
>
>> Can you cache a DataModel in memory across workers in a cluster? No --
>> the workers are perhaps not on the same machine, or even in the same
>> datacenter. Each worker would have to load its own.
>
> Yes, I understand that.
>
>> But it sounds a bit like you are trying to have a servlet make
>> recommendations in real-time by calling out to Hadoop.
>
> That's it...
> I'm looking for how to create recommendations in real time.
>
>> This will never work. Hadoop is a big batch-oriented framework.
>
> I understood that this operation on Hadoop is batch-oriented.
>
>> What you can do is pre-compute recommendations with Hadoop, as you are
>> doing, and write to HDFS. Then the servlet can load recs from HDFS,
>> yes. No problem there.
>
> We have a recommendation system running on Mahout here.
> We thought we could build a real-time recommendation system with Hadoop
> and Mahout.
> I see several problems:
> - how do we update the Mahout data model dynamically?
> - Hadoop was not built for real-time processing. What could be used to
> create a distributed recommendation system?
>
> Thanks for the help!
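
P.S. For the pre-compute-and-serve approach, the servlet side can be as simple as reading the job output off HDFS at startup. Rough sketch only -- it assumes text output of one "userID<TAB>recommendations" line per user under some output directory, so treat the path and the parsing as assumptions that depend on how you ran the job:

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.util.HashMap;
import java.util.Map;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class PrecomputedRecs {
  // userID -> raw recommendation string, loaded once at servlet init
  private final Map<Long,String> recs = new HashMap<Long,String>();

  public void load(String dir) throws Exception {
    FileSystem fs = FileSystem.get(new Configuration());
    // Read every part-* file the Hadoop job wrote into the output directory
    for (FileStatus status : fs.listStatus(new Path(dir))) {
      if (!status.getPath().getName().startsWith("part-")) {
        continue;
      }
      BufferedReader reader =
          new BufferedReader(new InputStreamReader(fs.open(status.getPath())));
      String line;
      while ((line = reader.readLine()) != null) {
        int tab = line.indexOf('\t');
        recs.put(Long.valueOf(line.substring(0, tab)), line.substring(tab + 1));
      }
      reader.close();
    }
  }

  public String getRecsFor(long userID) {
    return recs.get(userID);
  }
}
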
