On 25/04/06, Jorge Vargas <[EMAIL PROTECTED]> wrote:
> On 4/25/06, Ed Singleton <[EMAIL PROTECTED]> wrote:
> > On 24/04/06, jvanasco <[EMAIL PROTECTED]> wrote:
> > > That's a nightmare. Back up your db on box 1, coerce your data into
> > > the new schema, and export to ASCII with a dump utility. (When I do
> > > stuff like that, I keep a log of all the necessary SQL commands.)
> > > Making filters to change data will just be a nightmare.
> >
> > I was hoping to avoid all this (I know quite little SQL), but I
> > guess you're right.
> >
> > It does seem like there is a need to do stuff like this in TG though.
> > Altering your class definitions seems to be a lot more work than it
> > should be when you've got data in the table.
>
> The main problem with that is that there is no way to know what was
> changed and how. If you rename a column, what can convince the software
> that it was a rename rather than a column being erased and a new one
> created? Also, if you change a relationship your whole schema is messed
> up. This is not a problem with TG or Python; it's a real problem of the
> relational table model. So far the only good way of doing this is with
> ALTER TABLE statements and code versioning, which breaks SQLObject ;(
I was assuming I would create a dictionary that defined my changes.
Fieldname to fieldname mappings for changed fieldnames, and fieldname
to None mappings for deleted fields. When the data is pushed back in,
it looks in the dictionary of changes. If it finds a key with the
fieldname, then the value is the new name it should use, or if it's
None then it should not try to put that data back, otherwise it
assumes that the fieldnames have stayed the same.
I thought it would just be a fairly simple script that could be run in
tg-admin shell.
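A minimal sketch of that lookup (the fieldnames here are made up for illustration): keys are old fieldnames, and each value is either the new name to write to, or None for a deleted field whose data should be dropped.

```python
# Hypothetical dictionary of changes: "name" was renamed to "title",
# "obsolete" was deleted, everything else is unchanged.
changes = {"name": "title", "obsolete": None}

def target_fieldname(fieldname):
    """Return the fieldname to push data back into, or None if the
    field was deleted and its data should not be restored."""
    if fieldname in changes:
        return changes[fieldname]  # renamed, or None if deleted
    return fieldname               # unchanged field

print(target_fieldname("name"))      # renamed field
print(target_fieldname("obsolete"))  # deleted field
print(target_fieldname("author"))    # unchanged field
```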
I've already written something similar (code below) that does that,
for moving data from a legacy database with the class def generated
from the table, to a new database that's been changed. It's a bit
rough and ready, but I thought I could improve it and adapt it for
backing up data, and then putting it back into the new tables.
The other problem I have is that when I use "fromDatabase = True",
dropping the tables with tg-admin drops those tables too. I'd love to
have a "readOnly = True" option to stop them from being accidentally
dropped (currently I have to comment them out, do the drop, then
uncomment them).
Ed
Script:
import model

## A list of tuples, one for each pair of tables:
## (new table name, old table name, {new fieldname: old fieldname})
tables = [("newtablename", "oldtablename", {"newfieldname": "oldfieldname"})]

for tablepair in tables:
    try:
        newTable = getattr(model, tablepair[0])
        oldTable = getattr(model, tablepair[1])
        print tablepair
    except AttributeError:
        # One of the classes isn't in the model; skip this pair.
        continue
    fieldsmap = tablepair[2]
    oldcols = oldTable.sqlmeta.columns
    newcols = newTable.sqlmeta.columns
    if oldTable.select().count() > 0:
        for row in oldTable.select():
            rowdict = {}
            for newkey in newcols.iterkeys():
                if fieldsmap.has_key(newkey):
                    oldkey = fieldsmap[newkey]
                else:
                    oldkey = newkey
                try:
                    rowdict[newkey] = getattr(row, oldkey)
                except AttributeError:
                    # The old table has no such column; skip it.
                    continue
            try:
                # Create the row in the new table, preserving the id...
                temp = newTable(id=row.id)
            except Exception:
                # ...or fetch it if it already exists.
                temp = newTable.get(row.id)
            temp.set(**rowdict)
            print temp
--~--~---------~--~----~------------~-------~--~----~
You received this message because you are subscribed to the Google Groups
"TurboGears" group.
To post to this group, send email to [email protected]
To unsubscribe from this group, send email to [EMAIL PROTECTED]
For more options, visit this group at http://groups.google.com/group/turbogears
-~----------~----~----~----~------~----~------~--~---