Karsten Hilbert wrote:
> setting the constraints off is for insertion in no particular order,
> not for performance

One can, however, I am not sure of what use that would be. AFAIK there isn't even a performance gain, because if you re-enable constraints at the end of the transaction they'll just get checked at that time rather than before. Also note that, AFAICT, "set constraints ..." actually involves some complicated fiddling with the system catalogs that is prone to breaking. A performance gain *could* be achieved by dropping indexes and re-creating them at the very end if a lot of data is going into one and the same table.
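For illustration only (not GNUmed code): a minimal sketch, assuming psycopg2 and a plain PostgreSQL connection, of the "drop the indexes, bulk-insert, re-create them at the end" approach for one table. The pg_indexes query and the crude '_pkey' name filter for constraint-backed indexes are assumptions of the sketch, and identifier quoting is skipped for brevity.

import psycopg2

def bulk_insert_reindex(dsn, table, columns, rows):
    """Bulk-load rows into one table with its plain indexes dropped."""
    conn = psycopg2.connect(dsn)
    try:
        cur = conn.cursor()
        # remember the definitions of the table's ordinary indexes;
        # constraint-backed indexes (primary key, unique) are skipped
        # here with a crude name filter and would need separate handling
        cur.execute(
            """select indexname, indexdef from pg_indexes
               where tablename = %s and indexname not like '%%_pkey'""",
            (table,))
        indexes = cur.fetchall()
        for name, _ in indexes:
            # assumes the index is on the search_path
            cur.execute('drop index %s' % name)
        insert = 'insert into %s (%s) values (%s)' % (
            table, ', '.join(columns), ', '.join(['%s'] * len(columns)))
        cur.executemany(insert, rows)
        # re-create the indexes only once, after all rows are in
        for _, indexdef in indexes:
            cur.execute(indexdef)
        conn.commit()
    except Exception:
        conn.rollback()
        raise
    finally:
        conn.close()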
My idea was roughly like this:

  identity.id = 12
  ... lots of appropriate selects are run ...

the resulting dump file:

  set client_encoding ...
  set timezone ...
  insert into identity (...) ...
  :new_id_identity = currval('identity_id_seq')
  insert into health_issue (fk_identity, ...) values (:new_id_identity, ...)
  ...

Karsten
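To make that concrete, here is a minimal sketch of a dump generator along those lines. The identity/health_issue/fk_identity names and identity_id_seq come from the mail above; the column handling and the quoting helper are invented for illustration. Instead of a :new_id_identity placeholder it inlines currval() in the dependent insert, which is plain SQL.

def write_dump(out, identity_row, health_issues):
    def quote(v):
        # very naive SQL literal quoting, good enough for a sketch
        if v is None:
            return 'NULL'
        return "'%s'" % str(v).replace("'", "''")

    out.write("set client_encoding to 'UTF8';\n")
    out.write("set timezone to 'UTC';\n")
    out.write('begin;\n')
    # insert the identity row without its original primary key so the
    # target database assigns a fresh id from its own sequence
    cols = [c for c in identity_row if c != 'id']
    out.write('insert into identity (%s) values (%s);\n' % (
        ', '.join(cols), ', '.join(quote(identity_row[c]) for c in cols)))
    # dependent rows reference the freshly assigned id via currval(),
    # so no placeholder variable is needed in the dump itself
    for row in health_issues:
        cols = [c for c in row if c != 'fk_identity']
        out.write(
            'insert into health_issue (fk_identity, %s) '
            "values (currval('identity_id_seq'), %s);\n" % (
                ', '.join(cols), ', '.join(quote(row[c]) for c in cols)))
    out.write('commit;\n')

Called with dictionaries of column/value pairs, this would emit a self-contained SQL file that can be fed to psql against the target database.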
Attached is a plain old map-making parser; the map is of the form:

table:
  id1:
    field: value, type
    field2: value2, type2
  id2:
    .....
table2:
  .....
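For illustration, a minimal reader for a map laid out like that might look as follows. This is not the attached schemascan.py; the indentation widths and the "value, type" separator are assumptions.

def parse_map(lines):
    """Parse the nested table/id/field map into dictionaries."""
    tables = {}
    table = row = None
    for raw in lines:
        if not raw.strip():
            continue
        indent = len(raw) - len(raw.lstrip())
        line = raw.strip()
        if indent == 0:                  # table level, e.g. "identity:"
            table = tables.setdefault(line.rstrip(':'), {})
        elif indent == 2:                # id level, e.g. "  12:"
            row = table.setdefault(line.rstrip(':'), {})
        else:                            # field level, e.g. "    field: some value, text"
            field, rest = line.split(':', 1)
            value, typ = [part.strip() for part in rest.rsplit(',', 1)]
            row[field.strip()] = (value, typ)
    return tables

parse_map(open('export.txt')) would then give something like {'identity': {'12': {'field': ('some value', 'text')}}}.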
I was thinking you could export the last part of the print output, the printed indented map, as an export file. The export file contains the original ids. Then you transfer the file, and an importer creates new ids for the second-level id keys using the corresponding sequence for each table, replaces all the referenced keys in the map using the fk_list() function's information, and then creates insert statements out of the new map with the new ids (see the sketch below). An intermediate file format means you don't need a connection to the instance of the gnumed database you are trying to export to.
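A minimal sketch of that importer might look like this. The <table>_id_seq naming, the 'id' primary key column, and the shape of the fk_list() information (taken here as (referencing_table, fk_column, referenced_table) tuples) are all assumptions made for illustration, not GNUmed's actual API; insert ordering and constraint deferral are glossed over.

def import_map(conn, tables, fk_info):
    cur = conn.cursor()
    # 1) allocate a new id from each table's sequence for every exported row
    new_ids = {}                             # (table, old_id) -> new_id
    for table, rows in tables.items():
        for old_id in rows:
            cur.execute("select nextval('%s_id_seq')" % table)
            new_ids[(table, old_id)] = cur.fetchone()[0]
    # 2) rewrite foreign key values to point at the newly allocated ids
    fk_map = {(t, col): ref for t, col, ref in fk_info}
    for table, rows in tables.items():
        for old_id, fields in rows.items():
            for field, (value, typ) in fields.items():
                ref = fk_map.get((table, field))
                if ref is not None:
                    fields[field] = (new_ids[(ref, value)], typ)
    # 3) emit insert statements built from the rewritten map; the type
    #    column from the map is ignored and left to PostgreSQL casting
    for table, rows in tables.items():
        for old_id, fields in rows.items():
            cols = ['id'] + list(fields)
            vals = [new_ids[(table, old_id)]] + [v for v, _ in fields.values()]
            cur.execute(
                'insert into %s (%s) values (%s)' % (
                    table, ', '.join(cols), ', '.join(['%s'] * len(cols))),
                vals)
    conn.commit()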
schemascan.py
