consider this proof of concept:

db.define_table('document',Field('title'),Field('body'))
db.define_table('peer',Field('url'))

def index(): return dict(form=crud.create(db.document))

def search():
    from gluon.dal import Rows
    import xmlrpclib
    form = SQLFORM.factory(Field('keyword'))
    if form.accepts(request,session):
        ### search locally
        rows = db(db.document.title.contains(form.vars.keyword)).select()
        ### search at peers
        for peer in db(db.peer).select():
            records = xmlrpclib.ServerProxy(peer.url).peer_search(form.vars.keyword)
            ## merge the remote records into the local result set
            rows = rows|Rows(db,records,rows.colnames)
    else:
        rows = None
    return dict(form=form,rows=rows)

## allow other peers to search this peer
## (named peer_search so it does not shadow the search() action above)
@service.xmlrpc
def peer_search(keyword):
    return db(db.document.title.contains(keyword)).select().as_list()
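 
For the peers to be able to call each other, the xmlrpc service also
needs to be exposed through a call() action. A minimal sketch, assuming
a service = Service() object from gluon.tools is defined in a model
file (the scaffolding application provides one) and that the code above
lives in the default controller:

def call():
    session.forget()
    ## entry point for xmlrpc requests from other peers; each peer.url
    ## would then point at something like
    ## http://otherhost/app/default/call/xmlrpc
    return service()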


I did not try to run it, and it may need some debugging. I doubt you
can do this more efficiently with Django. Still, there are many
complications, depending on the details, that would need to be
addressed and that are not specific to web2py or Django.
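 
One example of such a detail, as a rough sketch (the 5 second timeout
and the skip-on-failure policy are just placeholder choices): the loop
over the peers inside search() should probably tolerate peers that are
down or misbehaving, something like:

import socket, xmlrpclib
socket.setdefaulttimeout(5)  ## do not hang on unreachable peers
for peer in db(db.peer).select():
    try:
        records = xmlrpclib.ServerProxy(peer.url).peer_search(form.vars.keyword)
    except (socket.error, xmlrpclib.Fault, xmlrpclib.ProtocolError):
        continue  ## skip peers that fail or time out
    rows = rows|Rows(db,records,rows.colnames)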

Massimo

On Mar 27, 1:49 pm, NoviceSortOf <[email protected]> wrote:
> I'm curious if anybody has built a federated search engine or a front
> end for a federated search engine with Web2py?
>
> Had a conversation about this with a client today who suggested Django
> as a possible framework. My initial reaction was that Django was more
> invested in content management than in complex query mechanisms. I'm
> wondering, though, whether web2py might be a better tool for the job.
>
> Please advise
