AW: AW: AW: updating large number of data

2006-12-06 Thread Peter Schröder
From: [EMAIL PROTECTED] Sent: Tuesday, December 5, 2006 18:23 To: cayenne-user@incubator.apache.org Subject: Re: AW: AW: updating large number of data Yes, invalidateObjects() should do it. The trickier part is knowing when to do it and finding everything to invalidate. A good portion
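
The invalidation mentioned here could look roughly like the following. This is only a minimal sketch, assuming a Cayenne 2.0-era API (org.apache.cayenne.access.DataContext) and a hypothetical mapped class User; package names and query classes vary between Cayenne versions.

    import java.util.List;

    import org.apache.cayenne.access.DataContext;
    import org.apache.cayenne.query.SelectQuery;

    public class RefreshUsers {
        // Mark all registered User objects as hollow so Cayenne re-reads their
        // rows from the database the next time one of their properties is accessed.
        public static void invalidateUsers(DataContext context) {
            List users = context.performQuery(new SelectQuery(User.class));
            context.invalidateObjects(users);
        }
    }

Invalidation only helps if it runs after the external import finishes; objects fetched earlier and never invalidated will keep serving the pre-import values from the context's cache.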

updating large number of data

2006-12-05 Thread Peter Schröder
Hi, we get a CSV file with a large number of user data every hour, and we want to replace the existing data in our database with it. Is there a best practice for doing something like this? Currently we are doing that with PHP, using MySQL LOAD DATA INFILE with the CSV file, but I think that
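
For reference, the PHP + LOAD DATA INFILE approach described above can be expressed in a few lines of plain JDBC. This is only a sketch: the connection URL, credentials, table name "users", and CSV path are placeholders, and REPLACE assumes the table has a primary key so re-imported rows overwrite the old ones.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    public class BulkLoadUsers {
        public static void main(String[] args) throws Exception {
            try (Connection con = DriverManager.getConnection(
                    "jdbc:mysql://localhost/mydb?allowLoadLocalInfile=true", "app", "secret");
                 Statement st = con.createStatement()) {

                // REPLACE overwrites rows whose primary key already exists instead of
                // failing with a duplicate-key error; IGNORE 1 LINES skips a header row.
                int rows = st.executeUpdate(
                    "LOAD DATA LOCAL INFILE '/data/users.csv' "
                    + "REPLACE INTO TABLE users "
                    + "FIELDS TERMINATED BY ',' ENCLOSED BY '\"' "
                    + "LINES TERMINATED BY '\\n' "
                    + "IGNORE 1 LINES");

                System.out.println("Imported rows: " + rows);
            }
        }
    }

Whichever tool runs the import, any Cayenne DataContext that already holds objects from that table will keep showing the old values until those objects are invalidated, which is what the rest of the thread turns on.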

Re: updating large number of data

2006-12-05 Thread Michael Gentry
Are you deleting all of the original data and then doing inserts, or are you doing updates? Thanks, /dev/mrg On 12/5/06, Peter Schröder [EMAIL PROTECTED] wrote: Hi, we get a CSV file with a large number of user data every hour, and we want to replace the existing data in our database with

AW: updating large number of data

2006-12-05 Thread Peter Schröder
cayenne-user@incubator.apache.org Subject: Re: updating large number of data Are you deleting all of the original data and then doing inserts, or are you doing updates? Thanks, /dev/mrg On 12/5/06, Peter Schröder [EMAIL PROTECTED] wrote: Hi, we get a CSV file with a large number of user data every

AW: AW: updating large number of data

2006-12-05 Thread Peter Schröder
cayenne-user@incubator.apache.org Subject: Re: AW: updating large number of data You know, sometimes simple and fast is a good way to do things. Do you have an auto-increment PK in that table? Would be helpful. As for Cayenne, you can flush (invalidate) any active DataContexts (at least the objects