say you have:

db.define_table('paper', Field('image', 'upload'))
db.define_table('tag', Field('paper', db.paper), Field('keyword'))

then you will have an action like:

def index():
    # fetch the requested paper record by id
    paper = db.paper[request.args(0)]
    # expose its tags as meta keywords so crawlers can pick them up
    response.meta.keywords = ','.join([tag.keyword for tag in
                                       db(db.tag.paper == paper.id).select()])
    # serve the scanned image through the download action
    return dict(img=IMG(_src=URL(r=request, f='download', args=paper.image)))
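
This assumes the standard scaffolding 'download' action is still in your
controller (def download(): return response.download(request, db)), since
URL(f='download') points at it.

On the view side (not shown above, so treat this as a sketch): recent
scaffolding layouts render response.meta as <meta> tags in the page <head>.
If yours does not, something like this inside <head> of layout.html should
do it:

{{if response.meta:}}
{{for name, content in response.meta.items():}}
<meta name="{{=name}}" content="{{=content}}" />
{{pass}}
{{pass}}

and views/default/index.html can simply be:

{{extend 'layout.html'}}
{{=img}}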



On Apr 26, 7:23 pm, Al <[email protected]> wrote:
> Thank you for all the comments...
> The web site is just a few hundred SCANNED images of very old
> medical papers which can be searched by two database fields - Title
> and Keywords, so essentially it is just one web page with not much to
> be indexed on. There are also 'comments' people can add to each
> article, but these comments are also stored in the DB. So I must find
> a way to persist the data in these 3 searchable fields so that they
> can be crawled by the search engine. I am not sure if
> "response.meta.keyword=...." can do such a job. The keyword field will
> be continuously updated - not static - so I cannot put all the
> keywords into the meta descriptions beforehand.
>
> Al
>
