On 2010-06-03 09:56, Michael Kuhlmann wrote:
> The only solution without "doing any custom work" would be to perform a
> normal query for each suggestion. But you might get into performance
> troubles with that, because suggestions are typically performed much
> more often than complete searches.
Actually, that's not a bad idea - if you can trim the size of the index
(either by using shingles instead of docs, or by pruning the main index -
see LUCENE-1812) so that it fits completely in RAM, and deploy this index
in a separate JVM (to benefit from CPUs other than the one that runs your
Solr core) or on another machine, then I think performance would not be a
big concern, and the functionality would be just what you wanted.

> The much faster solution that needs custom work would be to build up a
> large TreeMap with each word as the keys, and the matching terms as the
> values.

That would consume an awful lot of RAM... see SOLR-1316 for some
measurements.

--
Best regards,
Andrzej Bialecki     <><
 ___. ___ ___ ___ _ _   __________________________________
[__ || __|__/|__||\/|  Information Retrieval, Semantic Web
___|||__||  \|  ||  |  Embedded Unix, System Integration
http://www.sigram.com  Contact: info at sigram dot com
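P.S. For what it's worth, the TreeMap approach described above can be
sketched roughly like this - a sorted map keyed by word, with prefix
lookups done via subMap(). The class and method names here are my own
invention, just to illustrate the idea, not anything in Solr:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.SortedMap;
import java.util.TreeMap;

public class PrefixSuggester {
    // Sorted map: word -> matching terms. Note that keeping every word
    // in a TreeMap is exactly where the large RAM cost comes from.
    private final TreeMap<String, List<String>> index = new TreeMap<>();

    public void add(String word, String term) {
        index.computeIfAbsent(word, k -> new ArrayList<>()).add(term);
    }

    public SortedMap<String, List<String>> suggest(String prefix) {
        // Every key sharing the prefix falls in the half-open range
        // [prefix, prefix + '\uffff'), so subMap() returns them all
        // in sorted order without scanning the whole map.
        return index.subMap(prefix, prefix + Character.MAX_VALUE);
    }

    public static void main(String[] args) {
        PrefixSuggester s = new PrefixSuggester();
        s.add("luc", "lucene");
        s.add("luk", "lukewarm");
        s.add("sol", "solr");
        System.out.println(s.suggest("lu").keySet()); // prints [luc, luk]
    }
}
```

Lookups are O(log n) plus the size of the result range, so it is fast -
the trade-off is purely memory, as the SOLR-1316 measurements show.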