Here is how it works:

On the Home page there is a list of DBs shown in a jqgrid, let's say:

Db Name , Owner     , Description
Db1     , JohnSmith , Test
Db2     , JaneSmith , Test
Db3     , JohnDoe   , Test

The user uploads files as an archive, then clicks on Db1 and clicks "start
processing". (Note: I have to use separate DBs directly; the client says they
are easier to manage as databases rather than tables.)

When "start processing" is clicked, Db1 is posted to a function named
extraction/index. There the archive is extracted and crawled, and the
resulting text is inserted into the db.

The db is connected like this in every action that needs it:
def index():
    case_db = DAL('mysql://r...@localhost/' + request.vars.db_name)
    case_db.define_table('file_data', ...)  # redefined on every request
    crawdata = Crawler(path)
    case_db.file_data.insert(**crawdata)  # crawdata is a dict of field values


After extraction is done, the crawled data is stored by inserting it into
case_db.

On the next page, the results function gives the data result view from the
parsed docs.

There too, the db has to be connected on every ajax request from jqgrid, as below:

def rows_ajax():
    case_db = DAL('mysql://r...@localhost/' + request.vars.db_name)
    case_db.define_table(...)
    res = case_db(query).select()
    return dict(res=res)


So what I want to do is, instead of connecting to the db every time, store
the connection somewhere globally accessible.
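Since the DAL handle holds a live connection, it can't be pickled into the session (that's the KeyError below). One pattern I'm considering is memoizing the handle in a module-level dict keyed by db name; a minimal sketch, where `get_case_db`, `_db_cache`, and the `connect` factory are hypothetical names of mine, not part of web2py:

```python
# Per-process cache of db handles, keyed by database name.
# This is a sketch: in web2py it could live in a model or module file.
_db_cache = {}

def get_case_db(db_name, connect):
    """Return a cached handle for db_name, creating it on first use.

    connect is a factory wrapping the real connection, e.g.
        lambda name: DAL('mysql://r...@localhost/' + name)
    """
    if db_name not in _db_cache:
        _db_cache[db_name] = connect(db_name)
    return _db_cache[db_name]
```

Controllers would then call get_case_db(request.vars.db_name, connect) instead of constructing a DAL each time. (I also understand web2py's DAL accepts a pool_size argument, so repeated DAL(...) calls may already reuse pooled connections, making per-request reconnection cheap; if so the cache may be unnecessary.)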





On Sat, Aug 28, 2010 at 4:00 AM, mdipierro <[email protected]> wrote:

> Can you explain your design?
>
> On Aug 27, 1:54 pm, Phyo Arkar <[email protected]> wrote:
> > thanks massimo
> >
> > so now i try to re execute :
> >
> >
> >
> > > case_db=DAL('mysql://root@localhost/'+ request.vars.db_name)
> >
> > > case_db.define_table(...)
> > > session.case_db=case_db
> >
> > on every controller of every function which needs it.
> >
> > On Sat, Aug 28, 2010 at 12:56 AM, mdipierro <[email protected]>
> wrote:
> > > It is not possible because a DB contains a database connection. That
> > > cannot be serialized.
> >
> > > On Aug 27, 12:54 pm, Phyo Arkar <[email protected]> wrote:
> > > > I am trying to store dynamically generated DB into session but it
> fails
> > > with
> > > > error  . is that supported or if i want to share DB Globally , across
> > > > controller , only within a session,  but it is dynamically generated
> how
> > > > should i share without puttint into models??
> >
> > > > case_db=DAL('mysql://r...@localhost/'+ request.vars.db_name)
> > > > case_db.define_table(...)
> > > > session.case_db=case_db
> >
> > > > Traceback (most recent call last):
> > > >   File "/home/v3ss/workspace-bbb/web2py-clone/gluon/main.py", line
> > > > 411, in wsgibase
> >
> > > >     session._try_store_on_disk(request, response)
> >
> > > >   File "/home/v3ss/workspace-bbb/web2py-clone/gluon/globals.py", line
> > > > 377, in _try_store_on_disk
> >
> > > >     cPickle.dump(dict(self), response.session_file)
> >
> > > >   File "/usr/lib/python2.6/copy_reg.py", line 74, in _reduce_ex
> >
> > > >     getstate = self.__getstate__
> > > >   File "/home/v3ss/workspace-bbb/web2py-clone/gluon/sql.py", line
> > > > 1380, in __getattr__
> >
> > > >     return dict.__getitem__(self,key)
> > > > KeyError: '__getstate__'
>
