we've had a lot of issues with Postgres and using
information_schema. whereas MySQL's information_schema support was
pointless, Postgres mostly works, but performs very slowly unless you
VACUUM your database every hour. we've had reports of primary keys
not being reflected for tables
oh, now i just understood your email.
I would say this should be added to the wiki:
http://www.sqlalchemy.org/trac/wiki/DatabaseNotes
which i have just done!
psycopg1 is no longer supported...this float issue is just one of many.
i had forgotten to modify the main docs that psycopg1 is not
supported, just did that.
On Jun 3, 2006, at 3:42 AM, Yuan HOng wrote:
Thanks for pointing this out. I found out that psycopg 1.1.21 gives
back a Numeric as float. When I switched to psycopg2, a Decimal object
is automatically
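The behavior difference Yuan describes is visible in plain Python, independent of any driver; a quick illustration using only the stdlib types (not psycopg code):

```python
from decimal import Decimal

# Binary floats cannot represent most decimal fractions exactly,
# which is why getting a Numeric column back as a float is lossy:
assert 0.1 + 0.2 != 0.3

# Decimal (new in Python 2.4) preserves the exact decimal value,
# which is the type psycopg2 hands back for Numeric columns:
assert Decimal("0.1") + Decimal("0.2") == Decimal("0.3")
```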
Rick Morrison wrote:
You can override that with an environment variable -- see
http://www.freetds.org/userguide/envvar.htm
for the details.
thanks, this worked.
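For reference, the override Rick points at is one of FreeTDS's environment variables; which one applies depends on the setting being overridden. TDSVER (the protocol-version override) is shown here as a guess at the one in question, set from Python before the DBAPI module connects:

```python
import os

# TDSVER is one of FreeTDS's documented environment overrides; it selects
# the TDS protocol version the library speaks. FreeTDS reads it from the
# environment at connect time, so set it before opening any connection.
os.environ["TDSVER"] = "7.0"

assert os.environ["TDSVER"] == "7.0"
```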
___
Sqlalchemy-users mailing list
Sqlalchemy-users@lists.sourceforge.net
Rick Morrison wrote:
Also on this subject, can anyone out there report on their experiences with
mxODBC vs. Sql Server?
It would be nice to have a stable and well supported Unix-based mssql DBAPI
module for SA; I've always been a bit nervous about using FreeTDS in a
production environment.
On 6/3/06, Michael Bayer [EMAIL PROTECTED] wrote:
The Numeric types in SQLAlchemy don't modify the incoming or outgoing return type of bind parameters at all.
Maybe they should. Letting the DBAPI driver do what it wants for type
conversion breaks the database independence that SA purports to
yes, looking at psycopg2's source code i can see that it's checking
for python 2.4 and then using Decimal:
if sys.version_info[0] >= 2 and sys.version_info[1] >= 4:
    define_macros.append(('HAVE_DECIMAL', '1'))
#ifdef HAVE_DECIMAL
PyObject *decimal = PyImport_ImportModule("decimal");
if
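The C-level HAVE_DECIMAL macro above is fixed at build time; the equivalent check is easy to reproduce in plain Python (a sketch of the same logic, not psycopg2's actual code):

```python
import sys

# psycopg2's setup.py enables Decimal support only on Python >= 2.4,
# where the decimal module first appeared in the stdlib.
HAVE_DECIMAL = sys.version_info[:2] >= (2, 4)

if HAVE_DECIMAL:
    # mirrors the C call PyImport_ImportModule("decimal")
    import decimal
    numeric_factory = decimal.Decimal
else:
    numeric_factory = float

# Numeric results come back exact, not as lossy binary floats
print(numeric_factory("1.50"))
```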
All,
I am working on a little bit of code that is using ActiveMapper, and
I encountered a problem. Using the following contrived example:
# BEGIN CODE #
from sqlalchemy.ext.activemapper import (ActiveMapper, column,
This code used to work, but I think a recent svn up busted it. Now using
Rev 1577
I am using sqlalchemy.mods.threadlocal
I execute the following code multiple times within a single process
invocation.
objectstore is sqlalchemy.objectstore.
If I comment out the objectstore.flush() call, then
the "create_backref()" call in the one-to-one relationship wasn't checking for a backref of None; committed a fix in 1590.
On Jun 5, 2006, at 3:16 PM, Jonathan LaCour wrote:
from sqlalchemy.ext.activemapper import (ActiveMapper, column, one_to_many,
This patch adds support for kinterbasdb.init() args specified in the dburi
type_conv (defaults to 200)
concurrency_level (defaults to 1)
usage would be like
dburi =
firebird://sysdba:[EMAIL PROTECTED]:someport/path/database.gdb?type_conv=200&concurrency_level=1
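How such dburi query arguments get split out can be sketched in a few lines (a modern-Python illustration, not the patch itself; the credentials, host, and port below are placeholders):

```python
from urllib.parse import urlsplit, parse_qs

dburi = ("firebird://sysdba:pw@somehost:3050/path/database.gdb"
         "?type_conv=200&concurrency_level=1")
parts = urlsplit(dburi)
# parse_qs gives lists; take the first value for each key
query = {k: v[0] for k, v in parse_qs(parts.query).items()}

# kinterbasdb.init() takes these as integers, so cast,
# falling back to the patch's stated defaults
type_conv = int(query.get("type_conv", 200))
concurrency_level = int(query.get("concurrency_level", 1))
assert (type_conv, concurrency_level) == (200, 1)
```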
Index:
Right, this would be a global type policy object that the Dialects
would consult to determine the appropriate conversion to make. Right
now that's done by convention, this would make things more explicit and
allow global overrides of type conversion policy.
It makes sense to leave the actual
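No such object exists yet, but a minimal sketch of what a consultable type-policy could look like (the name, shape, and API here are entirely hypothetical):

```python
from decimal import Decimal

class TypePolicy:
    """Hypothetical global policy a Dialect could consult to
    convert values coming back from the DBAPI driver."""

    def __init__(self, conversions):
        # maps incoming python type -> converter callable
        self.conversions = conversions

    def convert(self, value):
        converter = self.conversions.get(type(value))
        return converter(value) if converter else value

# e.g. force DBAPI floats for Numeric columns into exact Decimals,
# overriding whatever the driver chose to return
policy = TypePolicy({float: lambda v: Decimal(str(v))})
assert policy.convert(1.5) == Decimal("1.5")
assert policy.convert("unchanged") == "unchanged"
```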
sqlalchemy wrote:
my regard for `information_schema` gets lower every day: #60, #71.
i've proposed on the list to drop the usage of information schema for
reflection altogether
I've thought about that too. For example, the mysql-specific stuff of
ischema isn't used (commented out in
I'm getting an error that only occurs when I define my tables in a
schema. First the definitions:
from sqlalchemy import *
from sqlalchemy.types import INTEGER, VARCHAR, TEXT, DATETIME
# Define some global values.
schemaname='planreview'
metadata = DynamicMetaData(name='planreview metadata')
you can override types at the column level, even if you use reflection:
table = Table('sometable', meta, Column('somecolumn', MyType), autoload=True)
which will reflect every column except "somecolumn". so we have that right now.
I usually see the Dialect as the "thing between the python and
is anyone aware of testing tools that we might use to evaluate the
reliability under load of SQLAlchemy + FreeTDS + pymssql on a
debian-stable platform accessing an MS SQL Server?
unfortunately i can't choose the DB, but i'm really hoping to use python
on a linux application server for our
Sorry if this is documented (I have looked). I'm using global_connect,
but I can't figure out how to save new objects.
The documentation says:
mapper(User, users_table)
# create a new User
myuser = User()
myuser.user_name =
Hi,
Seems I don't have a web proxy today, so I can't go and see tickets.
With the patch below, Table(..., autoload=True) will raise an exception
if the table name isn't found. This doesn't use .has_table(); instead,
we raise a NoSuchTableError if the table we've autoloaded has no
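The behavior the patch adds can be illustrated standalone (a toy sketch of the idea, not the patch's actual code):

```python
class NoSuchTableError(Exception):
    """Raised when autoload finds no columns for the requested table name."""

def autoload_columns(table_name, reflected):
    # 'reflected' stands in for whatever column info the dialect returned
    columns = reflected.get(table_name)
    if not columns:
        # previously this silently produced an empty Table;
        # with the patch, a missing name raises instead
        raise NoSuchTableError(table_name)
    return columns

tables = {"users": ["id", "name"]}
assert autoload_columns("users", tables) == ["id", "name"]
try:
    autoload_columns("missing", tables)
except NoSuchTableError:
    pass
else:
    raise AssertionError("expected NoSuchTableError")
```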
I'm feeling it more on the MetaData level. You're looking at two interfaces that any SA user is going to be concerned with: the first is at the database level... those types will be necessarily fixed, by either an existing schema or by the type support or restrictions of the database itself.
the
this one has some pretty key fixes in it, and a pretty hefty amount
of refactoring to sessions/unit of work behavior.
important in this release is:
- global_connect() / default MetaData has been restored, in the
manner similar to 0.1's global ProxyEngine . I already noticed
some
Hello,
On Mon, 2006-06-05 at 14:39 -0400, Michael Bayer wrote:
so. does that just mean instead of sending a dictionary to
Dialect/create_engine(), we send a TypePolicy object?
This seems like a lot of overhead for very little gain as well as not
solving the problem. In theory the DBAPI