I know this is the Solr list, but I'll take the chance of mentioning the new kid on the block among open source search engines, namely Vespa. Since your use case seems to be heavily geared towards personalization, it may be worth checking out, as they push tensors and personalized results as a key differentiator. It is not Lucene based and may be quite different from what you already know from ES and Solr. To be honest, I have never tested it myself, nor am I affiliated with it in any way. Here's the link: https://vespa.ai/
Jan

> On 13 Aug 2021, at 16:26, Albert Dfm <[email protected]> wrote:
>
> For example, for relevance ranking the usual approach is to execute a
> machine-learned model, e.g. using xgboost or lightgbm. Tensorflow and
> pytorch are other frameworks for building machine learning models.
> While xgboost and lightgbm are ensembles of decision trees, tensorflow and
> pytorch are mainly related to neural networks.
>
> Elasticsearch allows executing xgboost models, for example for relevance
> ranking.
> The same question applies to Solr: can we use pytorch or
> tensorflow at the relevance ranking phase?
>
>
> On Fri, Aug 13, 2021 at 4:18 PM Shawn Heisey <[email protected]> wrote:
>
>> On 8/13/2021 7:59 AM, Albert Dfm wrote:
>>> Regarding executing models (question number 4), let me explain this a bit
>>> better:
>>> Can Solr run custom tensorflow/pytorch models? This is not a feature in
>>> Lucene, it is something on top of it.
>>
>> With that info, I am even less familiar with what you're doing than I
>> was before. I have no idea what either of those things are. Google
>> wasn't helpful ... I would probably have to spend a week or two
>> researching to even have a minimal understanding. I was able to tell
>> that it's probably related to machine learning, but that's all. I have
>> zero experience in that arena.
>>
>> It's unlikely that Solr has any direct support for those software
>> programs, but if they can build queries that Solr understands, you could
>> probably get something going.
>>
>> Thanks,
>> Shawn
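
PS: regarding the question further down in the thread about executing ML models at ranking time in Solr, it may also be worth looking at Solr's Learning To Rank (LTR) contrib module. It does not run tensorflow/pytorch graphs directly, but it can rerank with models you upload: linear models, tree ensembles translated from e.g. xgboost or lightgbm, and, if I remember correctly, simple feed-forward networks. Here is a rough Python sketch of the flow, with made-up collection, feature and model names and weights, assuming the ltr module is enabled in solrconfig.xml as described in the Solr Reference Guide:

    # Rough sketch, not a drop-in recipe: upload a trivial feature store and a
    # linear model to Solr's LTR module, then rerank a query with that model.
    import requests

    SOLR = "http://localhost:8983/solr/techproducts"  # placeholder collection

    # 1) Upload features (the signals the model scores on).
    features = [
        {
            "name": "originalScore",
            "class": "org.apache.solr.ltr.feature.OriginalScoreFeature",
            "params": {},
        },
        {
            "name": "isInStock",
            "class": "org.apache.solr.ltr.feature.SolrFeature",
            "params": {"fq": ["{!terms f=inStock}true"]},
        },
    ]
    requests.put(f"{SOLR}/schema/feature-store", json=features).raise_for_status()

    # 2) Upload a model. A LinearModel is the simplest case; a tree ensemble
    #    trained with xgboost/lightgbm would instead be translated offline into
    #    a MultipleAdditiveTreesModel JSON and uploaded the same way.
    model = {
        "class": "org.apache.solr.ltr.model.LinearModel",
        "name": "myToyModel",
        "features": [{"name": "originalScore"}, {"name": "isInStock"}],
        "params": {"weights": {"originalScore": 1.0, "isInStock": 0.5}},
    }
    requests.put(f"{SOLR}/schema/model-store", json=model).raise_for_status()

    # 3) Query normally, then rerank the top 100 docs with the uploaded model
    #    via the ltr rerank query parser.
    resp = requests.get(
        f"{SOLR}/query",
        params={
            "q": "ipod",
            "fl": "id,score",
            "rq": "{!ltr model=myToyModel reRankDocs=100}",
        },
    )
    print(resp.json())

Again, this is from memory, so double-check the endpoints and JSON layout against the "Learning To Rank" chapter of the Reference Guide before relying on it.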
