pluggable functions
I see Yonik recently opened an issue in JIRA to track the addition of pluggable functions (https://issues.apache.org/jira/browse/SOLR-356). Any chance this will be implemented soon? It would save users like me from having to hack the Solr source or write custom request handlers for trivial additions (e.g., adding a distance function), not to mention changes to downstream dependencies (e.g., solr-ruby). Perhaps a reflection-based approach would do the trick? - Jon
Re: pluggable functions
On 9/18/07, Tom Hill [EMAIL PROTECTED] wrote:
> Hi - I'm not sure what you mean by a reflection-based approach, but I've
> been thinking about doing this for a bit, since we needed it, too.

Reflection could be used to look up and invoke the constructor with appropriately typed arguments. If we assume only primitive types and ValueSources are used, I don't think it would be too hard to craft a drop-in replacement that works with existing implementations. In any case, the more flexible alternative would probably be to do as you're suggesting (if I understand you correctly) -- let the function handle the parsing, with a base implementation and utilities provided. The class names would be mapped to function names in the config file.

- Jon

> I'd just thought about listing class names in the config file. The
> functions would probably need to extend a subclass of ValueSource which
> would handle argument parsing for the function, so you won't need to
> hard-code the parsing in a VSParser subclass. I think this might simplify
> the existing code a bit. You might have to do a bit of reflection to
> instantiate the function. Did you have an alternate approach in mind? Are
> there any other things this would need to do? Is anyone else working on
> this?
>
> Tom
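The idea discussed above -- mapping configured function names to classes and using reflection to pick a constructor whose parameter types match the parsed arguments -- can be sketched as follows. This is a minimal, self-contained illustration, not Solr's actual API: the `ValueSource` interface, `FunctionRegistry`, and `ScaleFunction` names here are all hypothetical stand-ins.

```java
import java.lang.reflect.Constructor;
import java.util.HashMap;
import java.util.Map;

public class FunctionRegistry {
    // Stand-in for Solr's ValueSource: yields a value per document.
    public interface ValueSource {
        double value(int doc);
    }

    // Example function: scales another source by a constant factor.
    public static class ScaleFunction implements ValueSource {
        private final ValueSource source;
        private final double factor;
        public ScaleFunction(ValueSource source, double factor) {
            this.source = source;
            this.factor = factor;
        }
        public double value(int doc) {
            return source.value(doc) * factor;
        }
    }

    // Function names (e.g. from a config file) mapped to implementing classes.
    private final Map<String, Class<? extends ValueSource>> functions = new HashMap<>();

    public void register(String name, Class<? extends ValueSource> clazz) {
        functions.put(name, clazz);
    }

    // Reflectively find a constructor matching the argument types and invoke it.
    public ValueSource create(String name, Object... args) {
        Class<? extends ValueSource> clazz = functions.get(name);
        if (clazz == null) {
            throw new IllegalArgumentException("Unknown function: " + name);
        }
        for (Constructor<?> c : clazz.getConstructors()) {
            if (matches(c.getParameterTypes(), args)) {
                try {
                    return (ValueSource) c.newInstance(args);
                } catch (ReflectiveOperationException e) {
                    throw new RuntimeException(e);
                }
            }
        }
        throw new IllegalArgumentException("No matching constructor for " + name);
    }

    // Check each argument against the (boxed) constructor parameter type.
    private static boolean matches(Class<?>[] params, Object[] args) {
        if (params.length != args.length) return false;
        for (int i = 0; i < params.length; i++) {
            Class<?> p = params[i].isPrimitive() ? box(params[i]) : params[i];
            if (!p.isInstance(args[i])) return false;
        }
        return true;
    }

    private static Class<?> box(Class<?> p) {
        return p == double.class ? Double.class
             : p == int.class ? Integer.class
             : p;
    }
}
```

With this shape, the query parser would only need to parse the raw arguments and call `create("scale", innerSource, 3.0)`; the constructor-matching logic replaces per-function parsing code in a VSParser-style class.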
Re: custom sorting
Is the machinery in place to do this now (hook up a function query to be used in sorting)? I'm trying to figure out the best way to do a distance sort: a custom comparator or a function query.

Using a custom comparator seems straightforward and reusable across both the standard and dismax handlers. But it also seems most likely to impact performance (or at least require the most work and knowledge to get right -- minimizing calculations, caching, watching out for memory leaks, etc.). (Speaking of which, could anyone with more Lucene/Solr experience than I have comment on the performance characteristics of the locallucene implementation mentioned on the list recently? I've taken a first look and it seems reasonable to me.)

Using a function query, as Yonik suggests below, is another approach. But to get a true sort, you have to boost the original query to zero? How does this impact the results returned by the original query? Will the requirements (and boosts) of the original (now nested) query remain intact, only sorted by the function? Also, is there any way to do this with the dismax handler?

Thanks,
- Jon

On 9/27/07, Yonik Seeley [EMAIL PROTECTED] wrote:
> On 9/27/07, Erik Hatcher [EMAIL PROTECTED] wrote:
> > Using something like this, how would the custom SortComparatorSource
> > get a parameter from the request to use in sorting calculations?
>
> Perhaps hook in via a function query:
>   dist(10.4,20.2,geoloc)
>
> and either manipulate the score with that and sort by score:
>   q=+(foo bar)^0 dist(10.4,20.2,geoloc)
>   sort=score asc
>
> or extend Solr's sorting mechanisms to allow specifying a function to
> sort by:
>   sort=dist(10.4,20.2,geoloc) asc
>
> -Yonik
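Whichever wiring is chosen (custom comparator or a `dist(...)` function query), the distance computation underneath could be a great-circle (haversine) calculation. Below is a self-contained sketch; the `GeoDistance` class and `dist` method names are hypothetical, not Solr or locallucene APIs, and a real per-document implementation would read the lat/lon values from the index rather than take them as arguments.

```java
public class GeoDistance {
    private static final double EARTH_RADIUS_KM = 6371.0;

    // Great-circle (haversine) distance between two lat/lon points, in km.
    public static double dist(double lat1, double lon1, double lat2, double lon2) {
        double dLat = Math.toRadians(lat2 - lat1);
        double dLon = Math.toRadians(lon2 - lon1);
        double a = Math.sin(dLat / 2) * Math.sin(dLat / 2)
                 + Math.cos(Math.toRadians(lat1)) * Math.cos(Math.toRadians(lat2))
                 * Math.sin(dLon / 2) * Math.sin(dLon / 2);
        return 2 * EARTH_RADIUS_KM * Math.asin(Math.sqrt(a));
    }
}
```

Since a sort only needs a consistent ordering, a comparator could skip the final `asin`/scaling step and compare the intermediate `a` values directly, which saves a transcendental call per document.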