On Mon, May 20, 2013 at 11:46 AM, Vernon D. Cole <vernondc...@gmail.com> wrote:
> when I am reformatting a string I do have to parse it.
I think we need to see a reason to reformat a string. In fact, I think we need to see a reason to do any of this. I have a feeling this group is not in agreement on what the overall goal is, but is debating implementations as if they would all achieve the same goal.

One goal I think we can all agree on: Python bindings to the client library. I do not want to ever write this code:

    Py_BEGIN_ALLOW_THREADS ;
    conn = mysql_init(&(self->connection));
    if (connect_timeout) {
        unsigned int timeout = connect_timeout;

I want someone else to, and I want there to be one version that is the blessed version that 99% of all Python developers use.

In general, when I run into the need for Python bindings, I just want a 1:1 interface between my Python code and the binary compiled library I am working with. When I don't have that, I am pretty much dead in the water. The few times I have tried to use SWIG or ctypes or whatever, I failed.

In my dream world, here is my vision of the perfect API (which may or may not be dbapi):

    import foo
    con = foo.connect(server, credentials...)
    cur = con.execute(command, parameters)

command and parameters would get passed to the server with as little transformation as possible. I think that means the command would be passed char for char (note: unicode encoding takes place here) and the parameters get converted from the Python implementation to whatever the client lib needs. At this layer, I only want to see the features of the foo server. If the foo server doesn't have the concept of transactions, then I don't want to see any transaction-related code. The foo module could have a set of tests that test what it implements.

Once that is implemented, foo can be wrapped to make a dbapi3_level2 compliant module. And that could in turn be wrapped to create a dbapi3_level3 compliant module. I hesitate to call foo dbapi3_level1 because who knows what the client lib's API looks like.

If we break the modules up into these 3 levels, I think it will be easier to define where features will live. For instance, I see level 3 being the place to transform tokens from a uniform "everything uses ?" to whatever the server is expecting (there is a rough sketch of what I mean at the bottom of this mail). 3 levels will also make it way easier to fix problems, either existing bugs or future server features that bubble up into the client code and cause problems. (I am making up this future problem; I am not sure it applies to something as static as SQL servers.)

Much of my motivation for the above comes from wanting to change a few lines of code in the MySQLdb module:

    query = query % tuple((get_codec(a, self.encoders)(db, a) for a in args))
    self._query(query)

https://sourceforge.net/p/mysql-python/svn/HEAD/tree/trunk/MySQLdb/MySQLdb/cursors.py#l188

I can't fix that, because there is no Python binding to the "pass parameters to the server" thing. I think if the 2 levels were split apart, I would have an easier time getting some ctypes hacker to fix level 1, and then I would have a chance of fixing the Python code.

I think what would come of this is ORM developers would use the level 2 module, and app developers who don't want to use an ORM would use the level 3 module. And everyone in this thread would be happy, because every feature described could be implemented, and features that are mutually exclusive would not step on each other.

-- 
Carl K
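
To make the level split a bit more concrete, here is a rough sketch of the level 3 token rewriting sitting on top of a level 1 cursor that just hands the command text and the args down to the server. Every name in it (qmark_to_format, Level3Cursor, the level 1 cursor itself) is made up for illustration; none of this is an existing module or a proposed spec.

    def qmark_to_format(command):
        """Rewrite uniform '?' placeholders into the '%s' style that, say,
        MySQLdb expects.  '?' inside single-quoted string literals is left
        alone; a real level 3 would want a proper tokenizer instead of
        this toy scan."""
        out = []
        in_string = False
        for ch in command:
            if ch == "'":
                in_string = not in_string
                out.append(ch)
            elif ch == "?" and not in_string:
                out.append("%s")
            else:
                out.append(ch)
        return "".join(out)

    class Level3Cursor:
        """Wraps a hypothetical level 1 cursor whose execute() passes the
        command text and the parameters to the server as-is, with no
        string interpolation on the Python side."""

        def __init__(self, level1_cursor):
            self._cur = level1_cursor

        def execute(self, command, parameters=()):
            # level 3 owns the placeholder rewriting; level 1 never
            # touches the SQL text or formats parameters into it
            return self._cur.execute(qmark_to_format(command), parameters)

    # usage, given some level 1 module "foo":
    # cur = Level3Cursor(foo.connect(server).cursor())
    # cur.execute("select name from people where age > ? and town = ?",
    #             (30, "O'Fallon"))

The only point of the sketch is that the SQL text and the args stay separate all the way down to level 1, which is exactly the binding the current cursors.py code does not have.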