Thanks, that is helpful. The Postgres script looks like it will work
for MySQL.

It would be nice if this could be incorporated into web2py, so that
auto_import could be used to look at non-web2py databases from existing
systems.
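For anyone else hitting the empty-db.tables issue below: a rough sketch of what auto_import actually looks at, namely the *.table files in the app's databases/ folder. The file-naming convention assumed here ("<connection-hash>_<tablename>.table") is what I see in my own apps; check yours before relying on it.

```python
# Sketch only: infer table names the way auto_import would see them,
# by scanning a web2py databases/ folder for *.table files.
# Assumes the usual "<connection-hash>_<tablename>.table" naming.
import glob
import os

def list_migrated_tables(databases_folder):
    """Return table names inferred from *.table files in a web2py
    databases/ folder. If no web2py app has ever defined models for
    the database, there are no .table files, so the result is empty;
    that is why auto_import shows db.tables == [] for a legacy
    MySQL database it has never migrated."""
    names = []
    for path in glob.glob(os.path.join(databases_folder, '*.table')):
        base = os.path.basename(path)[: -len('.table')]
        # strip the leading connection hash before the first underscore
        names.append(base.split('_', 1)[1] if '_' in base else base)
    return sorted(names)
```

So an empty (or never-migrated) databases/ folder yields an empty list, matching the empty db2.tables seen with the legacy MySQL database in the thread below.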

On Feb 25, 9:59 pm, Anthony <[email protected]> wrote:
> > db1 = DAL('sqlite://storage.sqlite',
> >           folder='c:/web2py/applications/crm/databases',
> >           auto_import=True)
> > db2 = DAL('mysql://root@localhost/opencart', auto_import=True)
> > print(db1.tables)
> > print(db2.tables)
> > a=db2.executesql("show tables")
> > print(a)
>
> > This lists the tables for the SQLite database, and the "show tables"
> > command lists the tables for MySQL. However, db.tables is an empty
> > list for the MySQL database.
>
> auto_import works by reading the *.table files in the /databases folder of
> the web2py app that defines the models for the database in question. Has
> the MySQL database been accessed by another web2py app that explicitly
> defined models for it? If not, auto_import won't provide any information
> about what's in the database. auto_import does not inspect the database
> itself -- it is simply a way to allow one app to access the database of
> another app without having to repeat the original app's model definitions.
>
> If the MySQL database is a legacy database and you want to automatically
> build models for it by directly inspecting the database, you can try this
> script: http://code.google.com/p/web2py/source/browse/scripts/extract_mysql_m....
> There's also a newer (and I think more comprehensive) script originally
> designed to read Postgres databases, which will probably work for MySQL
> with minimal tweaking (see the
> docstring): http://code.google.com/p/web2py/source/browse/scripts/extract_pgsql_m....
>
> Anthony
