https://bugzilla.wikimedia.org/show_bug.cgi?id=46555

--- Comment #8 from Nilesh Chakraborty <nil...@nileshc.com> ---
I'm considering two options for feeding the item/property data into the
recommender:

i) Using the database-related code in the Wikidata extension (I'm studying the
DataModel classes and how they interact with the database) to fetch what I need
and feed it into the recommendation engine.

ii) Not accessing the DB at all. Instead, I can write map-reduce scripts to
extract all the training data and everything I need for each Item from the
wikidatawiki data dump and feed it into the recommendation engine. A cron job
could download the latest data dump whenever one becomes available and run the
scripts on it. I don't think it would be a problem even if the engine lags
behind by the dump-generation interval, since recommendation is inherently
approximate anyway.
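To make option (ii) concrete, here's a minimal sketch of the extraction step. It assumes the JSON entity dump format (one entity per line, wrapped in a top-level array, bz2-compressed); the function name and the choice to emit (item, property IDs) pairs are my own illustration, not anything fixed in the proposal, and the script would need adjusting for whichever dump format is actually used:

```python
import bz2
import json

def iter_item_properties(dump_path):
    """Yield (item_id, sorted property IDs) pairs from a Wikidata
    entity dump.

    Assumes a bz2-compressed JSON dump with one entity per line
    inside a top-level array (hypothetical layout for this sketch).
    """
    with bz2.open(dump_path, "rt", encoding="utf-8") as f:
        for line in f:
            line = line.strip().rstrip(",")
            if not line or line in ("[", "]"):
                continue  # skip blank lines and the array brackets
            entity = json.loads(line)
            if entity.get("type") != "item":
                continue  # ignore properties, lexemes, etc.
            # The keys of "claims" are the property IDs used on the item.
            yield entity["id"], sorted(entity.get("claims", {}).keys())
```

A map-reduce job would run something like this as the map phase, with the reducer aggregating property co-occurrence counts as training data for the recommender.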

I personally think (ii) will be cleaner and faster. Please share your views on
this. More details on the idea can be found at:
https://www.mediawiki.org/wiki/User:Nilesh.c/Entity_Suggester

-- 
You are receiving this mail because:
You are on the CC list for the bug.
_______________________________________________
Wikibugs-l mailing list
Wikibugs-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikibugs-l